[Rd] Max likelihood using GPU

Robert Lowe ral64 at cam.ac.uk
Wed May 18 14:27:15 CEST 2011


Hi Oyvind,

I believe this is possible to implement. There is already ongoing work on using the GPU from R, and it uses the CUDA toolkit, as does the reference you supplied:

http://brainarray.mbni.med.umich.edu/Brainarray/rgpgpu/
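
As a rough illustration of the kind of split those slides describe, here is a minimal sketch in plain R; the Gaussian model and simulated data are only placeholders I chose, and the GPU part is indicated in comments rather than implemented. The minimiser stays on the CPU, while the per-observation log-density evaluation and the sum reduction are the embarrassingly parallel step that a CUDA kernel (called from R via .Call()) would take over:

## Simulated data; stands in for whatever large data set drives the fit.
set.seed(1)
x <- rnorm(1e6, mean = 2, sd = 3)

## Negative log-likelihood. The dnorm() call over all observations and the
## sum() reduction are the parts a GPU kernel would replace; the function
## signature seen by the optimiser stays the same either way.
nll <- function(par, data) {
  mu    <- par[1]
  sigma <- exp(par[2])            # log-parameterised so sigma stays positive
  -sum(dnorm(data, mean = mu, sd = sigma, log = TRUE))
}

## Host-side minimiser, playing the role MINUIT plays in the slides.
fit <- optim(c(mean = 0, logsd = 0), nll, data = x, method = "BFGS")
c(mu = unname(fit$par[1]), sigma = exp(unname(fit$par[2])))

Each evaluation of nll() is independent across observations, so the speed-up comes from computing the per-event densities and the reduction on the device, while R only ever sees the scalar value returned to optim().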

Thanks,
Rob


On 18 May 2011, at 10:07, oyvfos wrote:

> Dear all,
> Probably many of you experience long computation times when estimating large
> number of parameters using maximum likelihood  with functions that reguire
> numerical methods such as integration or root-finding. Maximum likelihood is
> an example of paralellization that could sucessfully utilize GPU. The
> general algorithm is described here:
> http://openlab-mu-internal.web.cern.ch/openlab-mu-internal/03_Documents/4_Presentations/Slides/2010-list/CHEP-Maximum-likelihood-fits-on-GPUs.pdf.
> Is it possible to implement this algorithm in R ? 
> Kind regards, Oyvind Foshaug
> 