[R] step, leaps, lasso, LSE or what?
murray.frank at commerce.ubc.ca
Fri Mar 1 01:12:05 CET 2002
I am trying to understand the alternative methods that are available for
selecting variables in a regression without simply imposing my own bias
(having "good judgement"). The methods implemented in leaps, step, and
stepAIC seem to fall into the general class of stepwise procedures. But
these are commonly condemned for inducing overfitting.
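For concreteness, here is a minimal sketch of the stepwise approach in question, using stepAIC from MASS on simulated data (the variable names and data-generating process are illustrative only):

```r
library(MASS)

## simulated data: only x1 and x2 actually matter, x3 is noise
set.seed(1)
dat <- data.frame(x1 = rnorm(50), x2 = rnorm(50), x3 = rnorm(50))
dat$y <- 1 + 2 * dat$x1 - dat$x2 + rnorm(50)

full <- lm(y ~ x1 + x2 + x3, data = dat)

## stepwise search penalized by AIC; with many candidate noise
## variables this kind of search is where the overfitting
## concerns arise
sel <- stepAIC(full, direction = "both", trace = FALSE)
```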
In Hastie, Tibshirani and Friedman, "The Elements of Statistical Learning",
they describe a number of procedures that seem better. The use of
cross-validation in the training stage presumably helps guard against
overfitting. They seem particularly favorable to shrinkage through ridge
regressions, and to the lasso, which may not be too surprising, given the
authorship. Is the lasso generally accepted as being a pretty good
approach? Has it proved its worth on a variety of problems? Or is it still
at the "interesting idea" stage? What, if anything, would be widely agreed
to be sensible -- apart from having "good judgement"?
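By way of contrast with the stepwise sketch above, the shrinkage alternative can be illustrated with lm.ridge from MASS, which fits a ridge regression over a grid of shrinkage parameters; generalized cross-validation (GCV) offers one data-driven way to pick the amount of shrinkage (again, the data here are simulated and purely illustrative):

```r
library(MASS)

## same illustrative setup: x3 is a noise variable
set.seed(1)
dat <- data.frame(x1 = rnorm(50), x2 = rnorm(50), x3 = rnorm(50))
dat$y <- 1 + 2 * dat$x1 - dat$x2 + rnorm(50)

## ridge regression over a grid of lambda values; rather than
## dropping variables outright, all coefficients are shrunk
## toward zero
fit <- lm.ridge(y ~ x1 + x2 + x3, data = dat,
                lambda = seq(0, 10, by = 0.1))

## pick the lambda minimizing generalized cross-validation
best <- fit$lambda[which.min(fit$GCV)]
```

Unlike ridge, the lasso's L1 penalty can shrink coefficients exactly to zero, which is what makes it a candidate for variable selection as well as shrinkage.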
In econometrics there is a school (the "LSE methodology") which argues for
an approach that amounts to stepwise regressions combined with repeated
tests of the properties of the error terms. (It is actually a bit more
complex than that.) This has been coded in the program PCGets.
If anyone knows how this compares in terms of effectiveness to the methods
of Hastie et al., I would really be very interested.
Murray Z. Frank
B.I. Ghert Family Foundation Professor
Strategy & Business Economics
Faculty of Commerce
University of British Columbia
Canada V6T 1Z2
e-mail: Murray.Frank at commerce.ubc.ca