[R] intelligent optimizer (with domain restrictions?)

Ravi Varadhan rvaradhan at jhmi.edu
Wed Mar 25 23:27:10 CET 2009


Hi,

Without knowing much about the problem, it is difficult to provide good advice.  Having said that, it seems like you are trying to solve a system of nonlinear equations by matching theoretical moments to their empirical counterparts.  You can do this with a nonlinear equation solver such as dfsane() in the package "BB" or nleqslv() in "nleqslv".
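
For example, here is a minimal sketch.  The moment_resid() function and the empirical targets inside it are hypothetical placeholders, not your actual moment conditions; dfsane() only needs a function that returns the vector of residuals (theoretical minus empirical moments) for a given parameter vector:

library(BB)            # for dfsane(); nleqslv::nleqslv(theta0, moment_resid) is used the same way

## Hypothetical moment residuals: the first element matches a mean, the
## second a standard deviation (parameterized as exp() so it stays positive).
moment_resid <- function(theta) {
  mu    <- theta[1]
  sigma <- exp(theta[2])
  c(mu - 1.3,          # made-up empirical mean
    sigma - 0.7)       # made-up empirical sd
}

theta0 <- c(0, 0)      # starting values on the unconstrained scale
sol <- dfsane(par = theta0, fn = moment_resid)
sol$par                # approximate root of the moment conditions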

It is not clear to me how you end up with a scalar objective function to minimize (do you consider the L2-norm of the residuals?).
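
If you do want a scalar objective, one rough sketch (reusing the made-up moment_resid() from above) is to minimize the sum of squared residuals with a box-constrained optimizer, returning a large finite penalty instead of NA so the optimizer can back off rather than fail:

obj <- function(theta) {
  r <- moment_resid(theta)
  if (any(!is.finite(r))) return(1e10)   # finite penalty instead of NA/Inf
  sum(r^2)                               # squared L2-norm of the residuals
}
fit <- optim(par = c(0, 0), fn = obj, method = "L-BFGS-B",
             lower = c(-10, -5), upper = c(10, 5))   # bounds here are purely illustrative
fit$par

nlminb() also accepts lower and upper bounds directly, if you prefer it to optim().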

Ravi.

____________________________________________________________________

Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology
School of Medicine
Johns Hopkins University

Ph. (410) 502-2619
email: rvaradhan at jhmi.edu


----- Original Message -----
From: ivowel at gmail.com
Date: Wednesday, March 25, 2009 6:16 pm
Subject: [R] intelligent optimizer (with domain restrictions?)
To: r-help <r-help at stat.math.ethz.ch>


> dear R experts: sorry, second question of the day. I want to match some
> moments. I am writing my own code; I have exactly as many moment
> conditions as parameters, and I am leery of having to learn the magic of
> GMM weighting matrices (if I were to introduce more). The process sounds
> easy conceptually (seen it in seminars many times, so how hard could it
> possibly be?... me thinks). This is the first time I am trying it. Some of
> my moments are standard deviations. Easy, me thinks: just optimize
> exp(my.sigma.parameter) instead of my.sigma.parameter. That way nlm() can
> throw negative values into my objective function and I will be good.
> This is about the time to start laughing, of course.
>  
> So, nlm() computes a gradient that is huge at my initial starting value.
> It then decides to take a step to exp(20.59), at which point everything
> in my function goes haywire and it wants to return NA. Now nlm() barfs,
> and I am seriously considering grid-searching. This does not strike me as
> particularly intelligent.
>  
> Are there any intelligent optimizers that understand domains and/or will
> "backstep" gracefully when they encounter an NA? Are there better ways to
> deal with matching second moments?
>  
>  advice appreciated.
>  
>  regards,
>  
>  /iaw
>  
> PS: you probably don't want to know this, but I have a dynamic panel data
> set, and my goal is to test whether a constant autoregressive coefficient
> across units can describe the data. That is, I want to find out whether
> x(i,t) = a + b(i) + c*x(i,t-1) is better replaced by
> x(i,t) = a + b(i) + c(i)*x(i,t-1). Right now I am running N OLS
> time-series regressions of x on lagged x and picking off mean(c), sd(c),
> mean(sigma_i), and sd(sigma_i). If there is a procedure in R that already
> tests for heterogeneous autocorrelation coefficients in a more intelligent
> fashion, please, please point me to it. However, even if such a procedure
> exists, I think I need to figure out how to find a more graceful optimizer
> anyway.
>  