[R] any more direct-search optimization method in R

Patrick Burns pburns at pburns.seanet.com
Tue Feb 28 22:14:06 CET 2006


Given that information, I think a genetic algorithm
should probably do well with your problem.  Standard
derivative-based optimizers are going to get frustrated
and give up.  I can believe that Nelder-Mead could
get confused as well, though I'm not sure that it will.

'genopt' from S Poetry does have box constraints for
the parameters.  I'm not sure what the other genetic
algorithms available in R are like.
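Just to illustrate the idea (this is not genopt itself; the objective
'fn', the bounds and the tuning constants below are made-up placeholders),
a box-constrained genetic-style search can be sketched in base R roughly
like this:

fn <- function(x) sum((x - 1)^2) + rnorm(1, sd = 0.01)  # stand-in noisy objective

ga.search <- function(fn, lower, upper, pop = 50, gens = 100) {
    d <- length(lower)
    ## random initial population inside the box
    P <- matrix(runif(pop * d, lower, upper), nrow = pop, byrow = TRUE)
    for (g in 1:gens) {
        fit <- apply(P, 1, fn)
        parents <- P[order(fit)[1:(pop %/% 2)], , drop = FALSE]  # keep the better half
        ## crossover: average randomly paired parents
        i1 <- sample(nrow(parents), pop - nrow(parents), replace = TRUE)
        i2 <- sample(nrow(parents), pop - nrow(parents), replace = TRUE)
        kids <- (parents[i1, , drop = FALSE] + parents[i2, , drop = FALSE]) / 2
        ## mutation: small perturbation, clipped back inside the box
        kids <- kids + matrix(rnorm(length(kids), sd = 0.05), nrow = nrow(kids))
        kids <- pmin(pmax(kids, rep(lower, each = nrow(kids))),
                     rep(upper, each = nrow(kids)))
        P <- rbind(parents, kids)
    }
    fit <- apply(P, 1, fn)
    list(par = P[which.min(fit), ], value = min(fit))
}

ga.search(fn, lower = c(0, 0), upper = c(5, 5))

Because the whole population is re-evaluated every generation, a lucky low
draw of the noise does not survive on its own, which is part of why this
kind of search copes with a random objective.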

Patrick Burns
patrick at burns-stat.com
+44 (0)20 8525 0696
http://www.burns-stat.com
(home of S Poetry and "A Guide for the Unwilling S User")

Weijie Cai wrote:

>Hi All,
>Thanks for all your replies, especially Spencer Graves's suggestions. You
>are right that I should have given more information about my function. My
>responses to your questions are:
>1. & 2. The function itself is not continuous or smooth. The evaluation at
>each point is a random number with non-constant variance; as it approaches
>the global minimum, the variance becomes very small. The surface plot of my
>function shows some kind of structure, but its form is intractable,
>unfortunately.
>
>3. & 4. Each evaluation of my function is not slow. The results returned by
>constrOptim() are just not very close to the true global minimum (the error
>can be as large as 0.2). Of course I can ignore the nonconvergence message,
>but the precision is really not satisfying. Nelder-Mead uses up its 300
>default iterations every time it runs. I guess the essential reason is the
>randomness of the function surface.
>
>5. Yes, I am sure there is a global minimum. I did a lengthy computation on
>a rough grid, and the minimum found there is very close to the true minimum.
>
>6. Do you mean I should start from a "minimum" found by grid searching?
>That's what I did. I have never tried approximating my function with smooth
>functions, though.
>
>WC
>
>
>  
>
>>From: Spencer Graves <spencer.graves at pdf.com>
>>To: Ingmar Visser <I.Visser at uva.nl>
>>CC: Weijie Cai <wcai11 at hotmail.com>, r-help at stat.math.ethz.ch
>>Subject: Re: [R] any more direct-search optimization method in R
>>Date: Tue, 28 Feb 2006 09:33:35 -0800
>>
>>WC:
>>
>>	  What do you mean by "noisy" in this context?
>>
>>	  1.  You say, "gradient, hessian not available".  Is it continuous with 
>>perhaps discontinuities in the first derivative?
>>
>>	  2.  Or is it something you can compute only to, say, 5 significant 
>>digits, and some numerical optimizers get lost trying to estimate 
>>derivatives from so fine a grid that the gradient and hessian are mostly 
>>noise?
>>
>>	  3.  Also, why do you think "constrOptim" is too slow?  Does it call your 
>>function too many times or does your function take too long to compute each 
>>time it's called?
>>
>>	  4.  What's not satisfactory about the results of "constrOptim"?
>>
>>   5.  Do you know whether it has only one local minimum in the region, 
>>or might it have more?
>>
>>   6.  Regardless of the answers to the above, have you considered using 
>>"expand.grid" to get starting values and narrow the search (perhaps with 
>>system.time or proc.time to find out how much time each function evaluation 
>>requires)?  I haven't tried this, but I would think it would be possible to 
>>fit a spline (either exactly or a smoothing spline) to a set of points, then 
>>optimize the spline, as in the sketch below.
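>>   A rough sketch of that idea (illustrative only: 'f', the grid ranges 
>>and the step size below are made-up placeholders, and loess stands in for 
>>a spline smoother):
>>
>>    g <- expand.grid(x1 = seq(0.1, 5, by = 0.25),
>>                     x2 = seq(0.1, 5, by = 0.25))
>>    print(system.time(vals <- apply(g, 1, f)))  # cost of the grid sweep
>>    start <- as.numeric(g[which.min(vals), ])   # best grid point as a start
>>    ## smooth the noisy grid values and optimize the smooth surface instead
>>    sm <- loess(vals ~ x1 + x2, data = data.frame(g, vals = vals))
>>    fsm <- function(p) predict(sm, data.frame(x1 = p[1], x2 = p[2]))
>>    optim(start, fsm, method = "Nelder-Mead")
>>
>>(predict() on a loess fit returns NA outside the grid, which optim's 
>>Nelder-Mead tolerates as long as the starting value is finite.)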
>>
>>	  hope this helps.
>>	  spencer graves
>>
>>Ingmar Visser wrote:
>>
>>    
>>
>>>If you have only boundary constraints on the parameters, you can use
>>>method "L-BFGS-B" in optim().
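>>>For example, something like this sketch (the objective 'f', the starting
>>>values and the bounds below are placeholders):
>>>
>>>    optim(par = c(1, 1), fn = f, method = "L-BFGS-B",
>>>          lower = c(1e-6, 1e-6), upper = c(Inf, Inf))
>>>
>>>Note, though, that without a supplied gradient L-BFGS-B approximates it by
>>>finite differences and requires finite function values, so a noisy
>>>objective may still give it trouble.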
>>>Hth, ingmar
>>>
>>>
>>>
>>>      
>>>
>>>>From: Weijie Cai <wcai11 at hotmail.com>
>>>>Date: Tue, 28 Feb 2006 11:48:32 -0500
>>>>To: <r-help at stat.math.ethz.ch>
>>>>Subject: [R] any more direct-search optimization method in R
>>>>
>>>>Hello list,
>>>>
>>>>I am dealing with a noisy function (gradient and hessian not available)
>>>>with simple boundary constraints (x_i > 0). I've tried minimizing it with
>>>>constrOptim() using Nelder-Mead, but it is way too slow and the returned
>>>>results are not satisfying. Simulated annealing is hard to tune and, in my
>>>>case, it always crashes R. I wonder if there are any packages or functions
>>>>that can do direct-search optimization?
>>>>
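>>>>(For reference, a constrOptim() call with the x_i > 0 region expressed in
>>>>its ui %*% theta - ci >= 0 form looks roughly like the sketch below; the
>>>>objective 'f' and the starting values are just placeholders:
>>>>
>>>>    constrOptim(theta = c(1, 1), f = f, grad = NULL,
>>>>                ui = diag(2), ci = c(0, 0), method = "Nelder-Mead")
>>>>
>>>>With grad = NULL, constrOptim() falls back on Nelder-Mead for the inner
>>>>optimizer.)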
>>>>A rough search of the literature suggests that multidirectional search
>>>>and the DIRECT algorithm may help. Are there any other suitable algorithms?
>>>>
>>>>Thanks,
>>>>WC
>>>>
>
>______________________________________________
>R-help at stat.math.ethz.ch mailing list
>https://stat.ethz.ch/mailman/listinfo/r-help
>PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html



