[R] glmmPQL() and memory limitations

Elliott Moreton moreton at vonneumann.cog.jhu.edu
Mon Aug 18 15:51:09 CEST 2003


Hi, all,

When running glmmPQL(), I keep getting errors like

	Error: cannot allocate vector of size 61965 Kb
	Execution halted

This is R-1.7.1.  The data set consists of about 11,000 binary responses 
from 16 subjects.  The model is

	fixed  = SonResp ~ (ordered(Stop) + ordered(Son)) * StopResp,
	random = ~ 1 + (ordered(Stop) + ordered(Son)) * StopResp | Subj,
	family = binomial(link = logit)

SonResp and StopResp are binary; Stop and Son are ordered factors with six 
levels each.  
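For concreteness, the full call looks roughly like this (a minimal sketch; 
the data frame name "dat" is just a stand-in for my real object):

	library(MASS)    # provides glmmPQL(), which iteratively calls lme() from nlme

	fit <- glmmPQL(fixed  = SonResp ~ (ordered(Stop) + ordered(Son)) * StopResp,
	               random = ~ 1 + (ordered(Stop) + ordered(Son)) * StopResp | Subj,
	               family = binomial(link = logit),
	               data   = dat)    # 'dat' stands in for the real data frame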

The machine I'm running this on is my university's scientific server, a 
Beowulf Linux cluster.  The node this job would run on has two 1.4 GHz 
CPUs, 2 gigabytes of RAM, and an 18-gigabyte hard disk, plus 130 gigabytes 
of scratch file space; it runs Red Hat Linux 7.2 with XFS.  

Can anyone tell me whether this is (a) a problem with the model (no
machine could fit it in the lifetime of the universe), (b) a problem with
how I formulated the model (there's a way to get the same end result
without overflowing memory; see the stripped-down sketch below), (c) a 
problem with glmmPQL() (that could be fixed by using some other package), 
(d) a problem with the machine I'm running it on (I need more real or 
virtual memory), or (e) something else?  
(Naturally, I've contacted the system administrators to ask them the same 
thing, but I don't know how much they know about R.)
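
To illustrate what I mean by (b): the random part above asks for a 22 x 22 
covariance matrix of subject-level effects (intercept + 5 + 5 + 1 + 5 + 5 
columns), whereas keeping only a random intercept per subject asks for a 
single variance, so something like the following should need far less 
memory.  Whether it answers the same question is exactly what I'm not sure 
about ("dat" is again a stand-in for the real data frame):

	fit.small <- glmmPQL(fixed  = SonResp ~ (ordered(Stop) + ordered(Son)) * StopResp,
	                     random = ~ 1 | Subj,    # random intercept only, not the full 22 x 22 structure
	                     family = binomial(link = logit),
	                     data   = dat)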

Many thanks in advance,
Elliott Moreton
