[R] Issues with nnet.default for regression/classification

Georg Ruß research at georgruss.de
Fri Nov 26 16:51:18 CET 2010


Hi,

I'm currently trying, rather desperately, to get the nnet function (which
trains a neural network with one hidden layer) to perform a regression task.

So I run it as follows:

trainednet <- nnet(x = traindata, y = trainresponse, size = 30, linout = TRUE, maxit = 1000)
(where x is a matrix and y is a numeric vector containing the target
values for one variable)
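
For reference, here is a self-contained sketch of the same call on made-up
data; the dimensions, coefficients and noise level below are placeholders,
not my real data:

library(nnet)
set.seed(1)
traindata     <- matrix(runif(200 * 5), ncol = 5)          # 200 cases, 5 inputs
trainresponse <- as.numeric(traindata %*% c(1, -2, 0.5, 3, -1)
                            + rnorm(200, sd = 0.1))        # numeric target, not a factor
trainednet <- nnet(x = traindata, y = trainresponse,
                   size = 30, linout = TRUE, maxit = 1000)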

To see whether the network has learnt anything at all, I checked the network
weights, and those have definitely changed. However, when I examine
trainednet$fitted.values, the fitted values are all identical, so it rather
looks as if the network is doing classification. If I set linout = FALSE, it
even outputs "1" (the class?) for every training example. The
trainednet$residuals are correct (the difference between the fitted value
and the actual response), but rather useless.
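
In case it matters, this is roughly how I am inspecting the fit (the exact
calls below are illustrative only):

summary(trainednet$wts)                             # the fitted weights -- these have clearly changed
head(trainednet$fitted.values)                      # ...but every fitted value is the same number
length(unique(round(trainednet$fitted.values, 6)))  # comes out as 1 on my data
head(trainednet$residuals)                          # y - fitted, so correct but useless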

By the way, the same happens if I run nnet through the formula/data.frame interface.
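
That variant looks roughly like the following (df and the column names are
placeholders for my real data frame):

df <- data.frame(traindata, response = trainresponse)
trainednet2 <- nnet(response ~ ., data = df,
                    size = 30, linout = TRUE, maxit = 1000)
head(fitted(trainednet2))    # same constant output as with nnet.default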

Going by the hint on the ?nnet help page, "If the response is not a factor,
it is passed on unchanged to 'nnet.default'", I assume the network should be
doing regression, since my trainresponse variable is a numeric vector and
_not_ a factor.
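
I have double-checked this; the check is simply:

is.factor(trainresponse)    # FALSE
is.numeric(trainresponse)   # TRUE
str(trainresponse)          # a plain numeric vector, no levels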

I'm currently at a loss, and I can't see that the AMORE or neuralnet packages
are any better (moreover, they don't implement the formula/data.frame/predict
interface). I've read the man pages of nnet and predict.nnet a gazillion
times, but I can't really find an answer there. I want to do regression, not
classification.

Thanks for any help.

Georg.
--
Research Assistant
Otto-von-Guericke-Universität Magdeburg
research at georgruss.de
http://research.georgruss.de


