[R] PDF with computationally expensive normalizing constant

Robin Hankin r.hankin at noc.soton.ac.uk
Mon Feb 11 12:57:16 CET 2008


Hi

I am writing some functionality for a multivariate PDF.

One problem is that evaluating the normalizing constant (NC) is
massively computationally intensive [one recent example
took 4 hours, and bigger examples would take much, much longer],
and it would be good to allow for this somehow in the
design of the package.

For example, the likelihood function doesn't need the NC,
but (e.g.) the moment generating function does.

So a user wanting a maximum-likelihood estimate shouldn't have
to evaluate the NC, but a user wanting a
mean does.  Some simple forms of the PDF have an
easily evaluated analytical expression for the NC.
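
To make the asymmetry concrete, a hypothetical sketch
(unnormalized() and start are placeholders, not real functions),
assuming, as above, that the NC does not depend on the quantity
being optimized over: the NC then only rescales the density, so
the optimizer never needs it, whereas a mean is an integral that
must be divided by the NC.

    ## the NC merely shifts the log-density by a constant, so the
    ## location of the maximum is unchanged without it:
    fit <- optim(start, function(p) -log(unnormalized(p)))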

And once the NC is evaluated, it would be
good to store it somehow.

I thought perhaps I could define an S4 class with a slot for
the parameters and a slot for the NC; if
the NC is unknown, this slot would hold NA.
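
A minimal sketch of such a class (the class and slot names here
are hypothetical):

    setClass("mvpdf",
             representation(params = "numeric", NC = "numeric"),
             prototype(NC = NA_real_))    # NC starts out unknown

    a <- new("mvpdf", params = c(1, 2, 3))
    a@NC    # NA: not yet computed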

Then a user could execute something like

a <- CalculateNormalizingConstant(a)

and after this, object "a" would have the numerically
computed NC in place.
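
A hypothetical implementation of that updater, where
expensiveIntegral() stands in for the long numerical computation:

    CalculateNormalizingConstant <- function(a) {
        if (!is.na(a@NC)) return(a)            # already known: reuse it
        a@NC <- expensiveIntegral(a@params)    # the expensive step
        a                                      # return the updated copy
    }

Because R has copy semantics, the function cannot modify "a" in
place, so the user must reassign the result, exactly as in the
call above.  Methods that need the NC (the mean, the moment
generating function, and so on) could call this internally, and
the simple forms with an analytical NC could fill the slot at
construction time.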



Is this a Good Idea?

Are there any PDFs implemented in R in which this is an issue?






--
Robin Hankin
Uncertainty Analyst and Neutral Theorist,
National Oceanography Centre, Southampton
European Way, Southampton SO14 3ZH, UK
  tel  023-8059-7743


