[R] Memory usage problem while using nlm function
kushal.shah at arisglobal.com
Wed Dec 31 12:58:40 CET 2014
I am trying to do nonlinear minimization using the nlm() function, but for a
large amount of data it runs out of memory.
The code I am using:
f <- function(p) {
    # p = c(alpha1, beta1, alpha2, beta2, w): a two-component
    # negative binomial mixture negative log-likelihood
    sum(-log(p[5] * dnbinom(n11, size = p[1], prob = p[2] / (p[2] + E)) +
             (1 - p[5]) * dnbinom(n11, size = p[3], prob = p[4] / (p[4] + E))))
}
p_out <- nlm(f, p = c(alpha1 = 0.2, beta1 = 0.06, alpha2 = 1.4, beta2 = 1.8, w = 0.1))
When the size of the n11_c or E_c vector is too large, it runs out of memory.
Please suggest a solution for this.
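One way to keep peak memory down is to evaluate the negative log-likelihood chunk by chunk, so that dnbinom() only ever allocates temporaries of chunk length instead of the full data length. The sketch below is not the poster's original code: the helper name `make_nll`, the chunk size, and the `p[1..5]` parameter layout are assumptions inferred from the named start values above.

```r
# Sketch: chunked negative log-likelihood for a two-component
# negative binomial mixture, assuming p = c(alpha1, beta1, alpha2, beta2, w).
make_nll <- function(n11, E, chunk = 100000L) {
    function(p) {
        total <- 0
        n <- length(n11)
        for (start in seq(1L, n, by = chunk)) {
            idx <- start:min(start + chunk - 1L, n)
            # mixture density for this chunk only; temporaries are at most
            # `chunk` elements long
            mix <- p[5] * dnbinom(n11[idx], size = p[1],
                                  prob = p[2] / (p[2] + E[idx])) +
                   (1 - p[5]) * dnbinom(n11[idx], size = p[3],
                                        prob = p[4] / (p[4] + E[idx]))
            total <- total - sum(log(mix))
        }
        total
    }
}

# Usage: build the objective once, then pass it to nlm() as before.
# f <- make_nll(n11, E)
# p_out <- nlm(f, p = c(alpha1 = 0.2, beta1 = 0.06,
#                       alpha2 = 1.4, beta2 = 1.8, w = 0.1))
```

With chunk-wise evaluation the largest temporary vector has length `chunk` rather than `length(n11)`. Using `dnbinom(..., log = TRUE)` inside each component and combining the components on the log scale would additionally improve numerical stability when the densities underflow.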
View this message in context: http://r.789695.n4.nabble.com/Memory-usage-problem-while-using-nlm-function-tp4701241.html