[R] nls making R "not responding"

Schatzi adele_thompson at cargill.com
Fri Oct 21 19:31:39 CEST 2011


Here is the code I am running:
library(nls2)

# Growth-curve model: A = asymptote above the baseline b, mu = maximum
# slope, l = lag/onset time. R arithmetic is vectorized, so no loop is needed.
modeltest <- function(A, mu, l, b, thour) {
  b + A / (1 + exp(4 * mu / A * (l - thour) + 2))
}

# "True" parameter values used to simulate a test dataset
A <- 1.3
mu <- 0.22
l <- 15
b <- 0.07

thour <- 1:25
# Model curve plus uniform noise on [-0.125, 0.125]
Yvals <- modeltest(A, mu, l, b, thour) - 0.125 + runif(25) / 4

# Grid of candidate starting values: 4 x 5 x 1 x 3 = 60 combinations
st2 <- expand.grid(A = seq(0.1, 1.6, 0.5), mu = seq(0.01, 0.41, 0.1),
                   l = 1, b = seq(0, 0.6, 0.3))
lower.bound <- list(A = 0.01, mu = 0, l = 0, b = 0)

# Stage 1: brute-force search over the grid for usable starting values
# (invisible(capture.output(...)) just keeps the fit from printing to the console)
try(
  invisible(capture.output(
    mod1 <- nls2(Yvals ~ modeltest(A, mu, l, b, thour),
                 start = st2,
                 lower = lower.bound,
                 algorithm = "brute-force")
  ))
)

# Stage 2: refine with nls/"port", starting from the best grid point
try(
  nmodel <- nls(Yvals ~ modeltest(A, mu, l, b, thour),
                start = coef(mod1),
                lower = lower.bound,
                algorithm = "port")
)
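
One thing I realized while writing this up: because both calls sit inside try(), a failure in the nls2 stage leaves mod1 undefined (or, when looping over treatments, still bound to the previous dataset's fit), so the nls stage can error or silently refine the wrong starting values. Here is a sketch of what I mean by handling that explicitly, with tryCatch returning NULL on failure (fit_one is just a name I made up):

fit_one <- function(Yvals, thour, start, lower.bound,
                    algorithm = "brute-force") {
  # Stage 1: candidate search; NULL if every candidate fails
  mod1 <- tryCatch(
    nls2(Yvals ~ modeltest(A, mu, l, b, thour),
         start = start, algorithm = algorithm),
    error = function(e) NULL
  )
  if (is.null(mod1)) return(NULL)
  # Stage 2: refine; NULL if the port fit does not converge
  tryCatch(
    nls(Yvals ~ modeltest(A, mu, l, b, thour),
        start = coef(mod1),
        lower = lower.bound,
        algorithm = "port"),
    error = function(e) NULL
  )
}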

My problem seems to be with the initial parameter estimates. I am fitting a
couple hundred treatments, so I use nls2 to pick the best starting values
from a grid of candidates and then pass those to nls. If the grid (st2) has
too many combinations, the run takes too long; when I cut it down, I either
get errors or, in some cases, R stops responding completely and I have to
kill it and start over. I do not know why it shows "not responding". Is
there a better way (well, I'm sure there's always a better way) to do this,
so I can run through 200+ datasets with robust enough starting values?
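
One idea I have been considering, if I am reading the nls2 documentation correctly: instead of a full expand.grid, use algorithm = "random-search" with start given as a two-row data frame (lower and upper bounds), so nls2 evaluates a fixed number of random candidate points set by maxiter and the cost per dataset stays bounded regardless of grid resolution. The bounds and point count below are made up for illustration:

# Hypothetical bounds: row 1 = lower, row 2 = upper for each parameter
st.range <- data.frame(A = c(0.1, 1.6), mu = c(0.01, 0.41),
                       l = c(0, 25), b = c(0, 0.6))
mod1 <- nls2(Yvals ~ modeltest(A, mu, l, b, thour),
             start = st.range,
             algorithm = "random-search",
             control = nls.control(maxiter = 200))  # 200 random candidates

Combined with the fit_one wrapper above, the batch run would look something like

# datasets: a list with one element per treatment (names are mine)
results <- lapply(datasets, function(d)
  fit_one(d$Yvals, d$thour, st.range, lower.bound,
          algorithm = "random-search"))

where a NULL entry in results flags a treatment where one of the stages failed. Would that be a reasonable way to keep the runtime predictable?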

Any ideas would be greatly appreciated.

Adele



