[R] How to fit a linear model to data by minimizing the mean absolute percent error?

Jeff Newmiller jdnewmil at dcn.davis.CA.us
Mon Jan 14 16:46:30 CET 2013

It doesn't look like you have read the Posting Guide (see the bottom of this email). This is not a homework help forum. Please use the assistance provided by your educational institution.
Jeff Newmiller <jdnewmil at dcn.davis.ca.us>
Research Engineer (Solar/Batteries/Software/Embedded Controllers)
Sent from my phone. Please excuse my brevity.

Andre Cesta <aacesta at yahoo.com> wrote:

>Hi All, I wonder if you can help me with an apparently simple task. I
>have been searching for examples of this without any luck:
>
>x <- 1:10               # x ranges from 1 to 10.
>y <- x*runif(10) + 1.5*x  # y is a linear function of x with some error.
># The uniform error is scaled to grow with x, so the error is
># proportional to x; this should cause heteroscedasticity.
># I know there are many methods to deal with heteroscedasticity, but in
># my specific case I want to use percentage regression, which minimizes
># the mean absolute percentage error, as opposed to regular regression,
># which minimizes the sum of squared errors.
># Question: how do I fit a linear model that minimizes this error on
># the data y ~ x above?
># Please do not use model <- lm(y ~ x, ...), as this will minimize the
># squared errors, not the mean absolute percent error.
>
>Best regards,
>André Cesta
>
>R-help at r-project.org mailing list
>PLEASE do read the posting guide
>and provide commented, minimal, self-contained, reproducible code.
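
For reference, the thread itself gives no code, but minimizing the mean absolute percent error has no closed-form solution like least squares does, so one minimal sketch (not from the original thread; the intercept-plus-slope parameterization and the use of base R's `optim()` are assumptions) is to define the MAPE objective directly and hand it to a general-purpose optimizer:

```r
# Sketch, not from the original thread: fit y ~ a + b*x by minimizing MAPE.
set.seed(1)
x <- 1:10
y <- x * runif(10) + 1.5 * x  # error grows with x, as in the question

# Mean absolute percent error of predictions par[1] + par[2]*x against y.
# Note MAPE is undefined where y == 0; here all y are strictly positive.
mape <- function(par, x, y) {
  pred <- par[1] + par[2] * x
  mean(abs((y - pred) / y)) * 100
}

# Nelder-Mead (optim's default) handles this non-smooth objective;
# the starting values c(0, 1) are arbitrary.
fit <- optim(par = c(0, 1), fn = mape, x = x, y = y)
fit$par    # fitted intercept and slope
fit$value  # achieved MAPE, in percent
```

Compared with `lm(y ~ x)`, this weights each residual by `1/y`, so large-`y` observations no longer dominate the fit, which is exactly the heteroscedasticity behavior the poster describes.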
