[R] fit a nonlinear model using nlm()

Ken Knoblauch knoblauch at lyon.inserm.fr
Tue Jul 17 13:11:02 CEST 2007

William Simpson <william.a.simpson <at> gmail.com> writes:

> I am trying to fit a nonlinear model using nlm().
> The observer is trying to detect a signal corrupted by noise.
> On each trial, the observer gets stim=signal+rnorm().
> In the simulation below I have 500 trials. Each row of stim is a new trial.
> On each trial, if the cross-correlation between the stim and
> the signal is above some criterion level (crit=.5 here), the
> observer says "signal" (resp=1), else he says "no signal"
> (resp=0).
> Thanks very much for any help!
> Bill
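For concreteness, the experiment described in the question can be simulated along these lines (a sketch only: the stimulus length, signal profile, and seed are assumptions, and signal-absent trials are added because hits, misses, false alarms, and correct rejections all require both trial types):

```r
set.seed(1)                       # assumed seed, for reproducibility
n.trials <- 500
n.pix    <- 32                    # assumed stimulus length
signal   <- sin(2 * pi * seq_len(n.pix) / n.pix)   # assumed signal profile
crit     <- 0.5                   # criterion from the question

## assumption: signal on half the trials, so that all four
## stimulus-response categories can occur
present <- rep(c(TRUE, FALSE), length.out = n.trials)
noise   <- matrix(rnorm(n.trials * n.pix), n.trials, n.pix)
stim    <- noise + outer(present, signal)   # each row is one trial

## respond "signal" (1) when the cross-correlation exceeds the criterion
resp <- as.integer(apply(stim, 1, cor, y = signal) > crit)
```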
It sounds like you are doing a classification image experiment.  
You can use tapply() to get means for each x as a function of the 
observer's classifications and then combine them as a function of 
hits, false alarms, misses, and correct rejections using the weights 
1, -1, -1, 1, as in Ahumada's original approach.
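A minimal sketch of that tapply() approach, on simulated data (the signal amplitude, criterion, and seed below are assumptions, chosen so that all four categories occur; the weights follow the ordering stated above, and the sign convention depends on the design):

```r
set.seed(42)                  # assumed seed
n.trials <- 500; n.pix <- 32  # assumed stimulus length
## assumed signal; amplitude and criterion chosen so all four categories occur
signal  <- 0.5 * sin(2 * pi * seq_len(n.pix) / n.pix)
present <- rep(c(TRUE, FALSE), length.out = n.trials)  # signal on half the trials
noise   <- matrix(rnorm(n.trials * n.pix), n.trials, n.pix)
resp    <- as.integer(apply(noise + outer(present, signal), 1, cor,
                            y = signal) > 0.2)

## classify each trial by stimulus and response
cls <- factor(ifelse(present,
                     ifelse(resp == 1, "hit", "miss"),
                     ifelse(resp == 1, "fa",  "cr")),
              levels = c("hit", "fa", "miss", "cr"))

## per-pixel mean of the noise within each stimulus-response category
mns <- apply(noise, 2, function(px) tapply(px, cls, mean))
w   <- c(1, -1, -1, 1)   # weights as stated above
ci  <- drop(w %*% mns)   # classification image: one value per stimulus element
```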
You can do this with lm() if you set it up so that the noise is 
the response, the classifications are a 4-level factor that predicts 
it, and the contrasts are set up as above.
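One way to read the lm() setup (a sketch; the simulated data and the two extra contrast columns, which merely complete an orthogonal basis for the 4-level factor, are assumptions):

```r
set.seed(42)                  # same assumed simulation as above
n.trials <- 500; n.pix <- 32
signal  <- 0.5 * sin(2 * pi * seq_len(n.pix) / n.pix)
present <- rep(c(TRUE, FALSE), length.out = n.trials)
noise   <- matrix(rnorm(n.trials * n.pix), n.trials, n.pix)
resp    <- as.integer(apply(noise + outer(present, signal), 1, cor,
                            y = signal) > 0.2)
cls <- factor(ifelse(present,
                     ifelse(resp == 1, "hit", "miss"),
                     ifelse(resp == 1, "fa",  "cr")),
              levels = c("hit", "fa", "miss", "cr"))

## first contrast column carries the weights; the other two (assumed)
## columns just complete an orthogonal basis for the factor
contrasts(cls) <- cbind(ahum = c(1, -1, -1, 1),
                        c2   = c(1,  1, -1, -1),
                        c3   = c(1, -1,  1, -1))

## per-pixel regression of the noise on the classifications; the "ahum"
## coefficient is proportional to the weighted combination of category means
## (exactly so in a balanced design)
ci.lm <- sapply(seq_len(n.pix),
                function(j) coef(lm(noise[, j] ~ cls))["clsahum"])
```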
I think it would be better to set it up as a glm, however, with a 
binomial family, where the observer's binary responses are the 
response and the noise and the presence/absence of the signal are 
the predictor variables.
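A minimal sketch of that glm() version, assuming the same simulated data; each per-pixel coefficient then plays the role of the classification image at that element:

```r
set.seed(42)                  # same assumed simulation as above
n.trials <- 500; n.pix <- 32
signal  <- 0.5 * sin(2 * pi * seq_len(n.pix) / n.pix)
present <- rep(c(TRUE, FALSE), length.out = n.trials)
noise   <- matrix(rnorm(n.trials * n.pix), n.trials, n.pix)
resp    <- as.integer(apply(noise + outer(present, signal), 1, cor,
                            y = signal) > 0.2)

## binary responses modelled from the noise values and signal presence/absence;
## the matrix columns become X1..X32 in the data frame
d   <- data.frame(resp = resp, present = present, noise)
fit <- glm(resp ~ ., family = binomial, data = d)

## per-pixel coefficients: the glm estimate of the classification image
ci.glm <- coef(fit)[paste0("X", seq_len(n.pix))]
```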

I have example code that does each of these, if you are 
interested. If it is only for simulation, see

   Thomas, James P. and Knoblauch, Kenneth, "Frequency and phase 
   contributions to the detection of temporal luminance modulation", 
   J Opt Soc Am A Opt Image Sci Vis.

for which I can send you the code also.

