[R] Faster Multivariate Normal

Doran, Harold HDoran at air.org
Tue Jun 7 15:06:10 CEST 2016


I am computing a complex four-dimensional integral using Gauss-Legendre quadrature and am looking to optimize one piece of code that is currently very expensive. The integral is evaluated for each of K individuals in my data, but this expensive piece depends only on the quadrature nodes, so once computed it can be stored and recycled across all K individuals. With that piece stored, the integral can be computed in R in about 1.3 minutes per individual.
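
For context, the per-dimension Gauss-Legendre nodes and weights can be generated with, e.g., statmod::gauss.quad() (an assumption on my part; the post does not say which routine is used), rescaled from [-1, 1] to the desired limits:

### Sketch: per-dimension Gauss-Legendre nodes/weights via the statmod
### package (an assumption; any Gauss-Legendre routine would do)
library(statmod)
q <- 40                          # nodes per dimension
a <- -4; b <- 4                  # hypothetical integration limits
gl <- gauss.quad(q, kind = "legendre")
nodes   <- (b - a) / 2 * gl$nodes + (a + b) / 2   # map from [-1, 1] to [a, b]
weights <- (b - a) / 2 * gl$weights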

Because the integral has multiple dimensions, the number of nodes grows exponentially: with q nodes per dimension I require q^4 in total, and experimentation shows I need no fewer than 40 per dimension. I need to compute and store the density of the multivariate normal at all q^4 nodes. This is very expensive, and I wonder whether there is a way to make the computation faster.
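
To make the scale concrete, a quick back-of-the-envelope check (8 bytes per value is R's standard double storage):

### Scale check: total node count and storage for the density values
q  <- 40           # nodes per dimension
dm <- 4            # dimensions
q^dm               # 2,560,000 evaluation points
q^dm * 8 / 2^20    # roughly 19.5 MB to hold the densities as doubles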

Below is a reproducible toy example (not using Legendre nodes) to illustrate the issue. The final line is the one I am asking about: can its computational time be improved?

Thank you
Harold

library(mvtnorm)
### Create parameters for MVN
mu  <- c(0, 0, 0, 0)
cov <- matrix(0.2, nrow = 4, ncol = 4)
diag(cov) <- 1
sigma <- cov   # already a matrix, so as.matrix() is not needed

### Create nodes and expand to 4 dimensions for quadrature
### (the toy example uses 10 nodes per dimension rather than 40)
dm  <- length(mu)
gh  <- seq(-4, 4, length.out = 10)
n   <- length(gh)
idx <- as.matrix(expand.grid(rep(list(1:n), dm)))

### Compute density at every node; this is the expensive step
adjFactor <- sapply(seq_len(nrow(idx)),
                    function(i) dmvnorm(gh[idx[i, ]], mean = mu, sigma = sigma))
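
One possible speed-up, offered as a sketch rather than a definitive fix: dmvnorm() is vectorized over the rows of its first argument, so the whole grid of points can be built once and the density computed in a single call instead of nrow(idx) separate ones:

### Vectorized alternative: dmvnorm() accepts a matrix with one
### evaluation point per row. gh[idx] indexes gh by the whole index
### matrix (column-major), so row i of pts equals gh[idx[i, ]].
pts <- matrix(gh[idx], ncol = dm)
adjFactor2 <- dmvnorm(pts, mean = mu, sigma = sigma)
all.equal(adjFactor, adjFactor2)   # should be TRUE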

