[R] Boosting, bagging and bumping. Questions about R tools and predictions.

Ko-Kang Kevin Wang kwan022 at stat.auckland.ac.nz
Wed Jul 23 02:22:59 CEST 2003


Hi,

If you want to learn the theory of boosting, bagging and other 
classification techniques, the book to refer to is Hastie, 
Tibshirani and Friedman's "The Elements of Statistical Learning: Data 
Mining, Inference, and Prediction" (Springer).  It is the best book I 
have seen in these areas.

To apply them (or at least some of the techniques) in R, the book to 
look at is Venables and Ripley's "Modern Applied Statistics with S" 
(MASS, 4th edition).
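
As a starting point in R itself, something along the following lines may 
be useful.  This is only a minimal sketch: it assumes the rpart, ipred 
and randomForest packages are installed from CRAN, and it uses the iris 
data set and arbitrary values of nbagg and ntree purely for illustration.

library(rpart)          # single classification trees
library(ipred)          # bagging()
library(randomForest)   # randomForest()

## A single classification tree, for comparison
fit.tree <- rpart(Species ~ ., data = iris)

## Bagged classification trees (50 bootstrap replicates, chosen arbitrarily)
fit.bag <- bagging(Species ~ ., data = iris, nbagg = 50)

## A random forest: bagging plus random variable selection at each split
fit.rf <- randomForest(Species ~ ., data = iris, ntree = 500)

## All of these fits have predict() methods, so prediction on new data works
newdat <- iris[sample(nrow(iris), 10), ]
predict(fit.tree, newdata = newdat, type = "class")
predict(fit.bag,  newdata = newdat)
predict(fit.rf,   newdata = newdat)

Boosting itself is provided by packages such as gbm (gradient boosting); 
the details differ, but the fitted objects again support predict(), which 
covers the prediction side of your question.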

On Tue, 22 Jul 2003 monkeychump at hushmail.com wrote:

> Date: Tue, 22 Jul 2003 17:09:47 -0700
> From: monkeychump at hushmail.com
> To: r-help at stat.math.ethz.ch
> Subject: [R] Boosting, bagging and bumping. Questions about R tools and predictions.
> 
> 
> I'm interested in further understanding the differences in using many
> classification trees to improve classification rates. I'm also interested
> in finding out what I can do in R and which methods will allow prediction.
> Can anybody point me to a citation or discussion?

-- 
Cheers,

Kevin

------------------------------------------------------------------------------
"On two occasions, I have been asked [by members of Parliament],
'Pray, Mr. Babbage, if you put into the machine wrong figures, will
the right answers come out?' I am not able to rightly apprehend the
kind of confusion of ideas that could provoke such a question."

-- Charles Babbage (1791-1871) 
---- From Computer Stupidities: http://rinkworks.com/stupid/

--
Ko-Kang Kevin Wang
Master of Science (MSc) Student
SLC Tutor and Lab Demonstrator
Department of Statistics
University of Auckland
New Zealand
Homepage: http://www.stat.auckland.ac.nz/~kwan022
Ph: 373-7599
    x88475 (City)
    x88480 (Tamaki)

