[R] Memory issues..

JFRI (Jesper Frickman) jfri at novozymes.com
Wed Nov 12 17:29:17 CET 2003


I have just tried listing limsdata from the workspace and it is indeed
gone from .GlobalEnv. I also tried passing the environment to the
as.formula function, but it still doesn't work.

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460


-----Original Message-----
From: Thomas W Blackwell [mailto:tblackw at umich.edu] 
Sent: Wednesday, November 12, 2003 10:43 AM
To: JFRI (Jesper Frickman)
Cc: rodrigo.abt at sii.cl; jmacdon at umich.edu; r-help at stat.math.ethz.ch
Subject: RE: [R] Memory issues..


Jesper  -  (off-list)

Jim MacDonald reports seeing different memory-management behavior
between Windows and Linux operating systems on the same, dual-boot
machine.  Unfortunately, this is happening at the operating system
level, so the R code cannot do anything about it.  I have cc'ed Jim on
this email, hoping that he will give more details to the entire list.
What operating systems (and versions of R) do you think Rodrigo and
Jesper are using?

Specifically for Jesper's  AnalyzeAssay() function:  There is some
manipulation you can do using  formula()  or  as.formula()  to designate
a local environment as the place where the values of a formula's terms
are looked up.  (I've never done this, so I can't give you an
example of working code, only references to the help pages for "formula"
and "environment".  It's often very instructive to literally type in the
sequence of statements given as examples at the bottom of each help
page.)  I think this will allow you to avoid assigning to the global
workspace.
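
A minimal, self-contained sketch of that idea (toy data and object names,
not Jesper's actual code): a formula remembers the environment it was
created in, and model-fitting functions look up the formula's terms there
when no data argument is supplied.

    make_formula <- function() {
        x <- rnorm(20)                  # local data, never assigned to .GlobalEnv
        y <- 2 * x + rnorm(20)
        as.formula("y ~ x", env = environment())   # tie the formula to this frame
    }
    form <- make_formula()
    fit <- lm(form)      # lm() finds x and y through environment(form)
    summary(fit)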

Are you sure that the call to  rm()  below is actually removing the copy
of limsdata that's in .GlobalEnv, rather than a local copy?  I would
expect you to have to specify  envir = .GlobalEnv  (or  pos = 1)  in
order to get the behavior you want.
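
A tiny self-contained check of that local-versus-global distinction (the
object name below is only a stand-in for the real data):

    limsdata <- data.frame(x = 1:5)        # pretend this is the <<- copy
    f <- function() {
        rm(limsdata, envir = .GlobalEnv)   # removes the copy in .GlobalEnv
        # rm(limsdata) here would instead look only in f()'s own frame
    }
    f()
    exists("limsdata")                     # FALSE: the global copy is gone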

-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe
> R has just used up the memory on something else. I think there is a
> fair amount of memory leakage, as I get similar problems with my
> program.
>
> I use R 1.8.0. My program goes as follows:
>
> 1. Use RODBC to get a data.frame containing the assays to analyze
>    (17 assays are found).
> 2. Define an AnalyzeAssay(assay, suffix) function to do the following:
>    a) Use RODBC to get the data.
>    b) Store the dataset "limsdata" in the workspace using the <<-
>       operator, to avoid the error I get from qqnorm.lme when I call it
>       with a grouping formula like ~ resid(.) | ORDCURV:
>       Error in eval(expr, envir, enclos) : Object "limsdata" not found
>    c) Call lme to analyze the data.
>    d) Produce some diagnostic plots. Record them by setting record=TRUE
>       on the trellis.device.
>    e) Save the plots to a win.metafile using replayPlot(...).
>    f) Save text output to a file using sink(...).
>
> 3. Call the function for each assay using the code:
>
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
> 	writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
> 		assays$PROFNO[i], "...", sep=""))
> 	flush.console()
> 	AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
>
> 	# Clean up memory
> 	rm(limsdata)
> 	gc()
> }
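
A small stand-alone illustration of watching gc()'s report between
iterations, to see whether memory really is released after each rm()
(the matrix below is just a stand-in for one assay's data):

    for (i in 1:3) {
        big <- matrix(rnorm(1e6), ncol = 100)   # roughly 8 Mb of doubles
        rm(big)
        print(gc())    # the "used" / "(Mb)" columns should drop back each time
    }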
>
> As you can see, I try to remove the dataset stored in the workspace and
> then call gc() to clean up memory as I go.
>
> Nevertheless, when I come to assay 11 out of 17, it stops with a 
> memory allocation error. I have to quit R, and start again with assay 
> 11, then it stops again with assay 15 and finally 17. The last assays 
> have much more data than the first ones, but all assays can be 
> completed as long as I keep restarting...
>
> Maybe restarting the job can help you get it done?
>
> Cheers,
> Jesper
>
> -----Original Message-----
> From: Rodrigo Abt [mailto:rodrigo.abt at sii.cl]
> Sent: Monday, November 10, 2003 11:02 AM
> To: r-help at stat.math.ethz.ch
> Subject: [R] Memory issues..
>
>
> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R.
> My sample size is about 2965, with 3 factors:
>
> year (5 levels), ssize (4 levels), condition (2 levels).
>
> When I issue the following command:
>
> > lme(var~year*ssize*condition, random=~ssize+condition|subject,
> >     data=smp, method="ML")
>
> I got the following error:
>
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
>         Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
>
> I'm currently using a Win2000 machine with 128 Mb of RAM and a 1.2 GHz
> processor. My version of R is 1.7.1.
>
> Thanks in advance,
>
> Rodrigo Abt.
> Department of Economic and Tributary Studies,
> SII, Chile.
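
A sketch of how the 120Mb ceiling in that warning can be raised on Windows
builds of R of that era (assuming the  --max-mem-size  startup flag and,
if present in that version, the Windows-only  memory.limit()  function;
raising the limit only helps up to the physical RAM actually installed):

    # At startup, for example:
    #   Rgui.exe --max-mem-size=256M
    # or, from within a running Windows session:
    memory.limit()              # report the current limit (in Mb)
    memory.limit(size = 256)    # request a higher limit, if the OS allows it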
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list 
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
>



