[R] Discarding Models in Caret During Model Training
lorenzo.isella at gmail.com
Mon Nov 14 12:07:20 CET 2016
Maybe some of you have come across this problem.
Let's say that you use caret for hyperparameter tuning.
You train several models and you then select the best performing one
according to some performance metric.
My problem is that sometimes I would like to tune a great many models
(on the order of hundreds of them). Time is not a problem, but I run
out of memory.
My question is this: each model's performance is calculated while it
trains. I am only interested in the best-performing model (or, to be
generous, let's say the 5 best-performing models).
Would it be possible to script something that ranks the models as
they are generated, automatically updates the list of the 5
best-performing models, and deletes all the others (for which,
frankly speaking, I have no use)?
Is there a flaw in my idea? It would not save me time, but it would
certainly save a lot of memory.
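The idea can be sketched in base R. Here `fit_and_score` is a hypothetical
stand-in for a single caret::train() call on one hyperparameter setting;
the point is only the bookkeeping that keeps at most k models alive at a
time, so that everything else can be garbage-collected:

```r
# Hypothetical stand-in for fitting one model with one hyperparameter
# combination; it returns the fitted object and its performance metric
# (here, higher is better).
fit_and_score <- function(param) {
  set.seed(param)
  list(model = paste("model for param", param), score = runif(1))
}

# Iterate over a tuning grid, keeping only the top-k models seen so far.
keep_top_k <- function(params, k = 5) {
  best <- list()  # running list of the k best (model, score) pairs
  for (p in params) {
    best[[length(best) + 1]] <- fit_and_score(p)
    # Rank by score and drop everything beyond the top k; the discarded
    # model objects have no remaining references and can be collected.
    ord <- order(vapply(best, `[[`, numeric(1), "score"),
                 decreasing = TRUE)
    best <- best[ord[seq_len(min(k, length(best)))]]
  }
  best
}

top5 <- keep_top_k(seq_len(100), k = 5)
length(top5)  # 5
```

If you are calling caret directly, options such as `returnData = FALSE` in
`trainControl()` may also shrink each fitted object considerably, since by
default the training data are stored inside the model object.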
Any suggestion is helpful.