[R] increasing memory

Janet Rosenbaum jerosenb at hcs.harvard.edu
Wed May 5 05:39:44 CEST 2004


 
> If it actually crashes there is a bug, but I suspect that it stops with an
> error message -- please do read the posting guide and tell us exactly what
> happens.

Sorry, I hadn't realized that on this mailing list "crash" means giving an
error message.

To me, "crash" means that the computer freezes entirely, or if I'm
lucky it just runs for several hours without doing anything, and the 
process can't even be killed with  -9, and the computer can't be
shutdown, but has to be powercycled. 

For instance, I left it doing a read.table on a text-format file from this
data (a few hundred megabytes), and eight hours later it was still "going".
I watched the process with "top" for a while, and the computer had plenty
of free memory the whole time -- over 100 MB -- while R was using almost
no CPU.
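
For what it's worth, here is a minimal sketch of the kind of call I could
try instead; the colClasses vector and the nrows value are assumptions,
since I would first have to count the columns and rows in the file.

  # Assumed: ~10 numeric columns and roughly 500,000 rows.  Giving
  # read.table colClasses and nrows lets it preallocate storage instead
  # of guessing types and repeatedly growing its buffers.
  w <- read.table("pedagogue.csv", header = TRUE, sep = ",",
                  colClasses = rep("numeric", 10),  # assumed column types
                  nrows = 500000,                   # assumed row count
                  comment.char = "")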

I have tried all sorts of ways of reading in the data.  It would be best
if I could read the xport file, since it has all the original variable
labels, which don't carry over to the text file, but read.xport actually
freezes the computer.
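
One thing I could do before committing to a full read is just inspect the
xport file; this is only a sketch, and whether lookup.xport copes any
better with a file this size is an open question.

  library(foreign)
  # lookup.xport() scans only the member and variable information, so it
  # should be much lighter than read.xport() on the same file.
  info <- lookup.xport("demagogue.xpt")
  str(info)   # names, types, and lengths of the data sets in the file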

As I said, I am running R 1.8.1, which claims to be the most recent
version (when I type is.RAqua.updated()), on an iBook G3/800 with 620 MB
of RAM (the maximum), running Mac OS X 10.3.3.

The command really doesn't matter much.  These are totally normal files,
and I can load the normal-sized files with the exact same commands:
> w <- read.table("pedagogue.csv", header = TRUE, sep = ",")
> library(foreign)
> w <- read.xport("demagogue.xpt")

The xpt files are up to 400 MB, and the csv files are about 100 MB.
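
If it would help narrow things down, a chunked read of the csv is another
thing I could try; this is only a sketch, and the chunk size of 50,000
rows is an arbitrary assumption.

  # Read the csv in pieces over an open connection so that no single
  # read.table call has to hold the whole 100 MB file at once.
  con <- file("pedagogue.csv", open = "r")
  header <- read.table(con, header = FALSE, sep = ",", nrows = 1,
                       colClasses = "character")   # column names
  chunks <- list()
  repeat {
    piece <- try(read.table(con, header = FALSE, sep = ",", nrows = 50000),
                 silent = TRUE)                    # errors out at end of file
    if (inherits(piece, "try-error")) break
    chunks[[length(chunks) + 1]] <- piece
  }
  close(con)
  w <- do.call("rbind", chunks)
  names(w) <- unlist(header)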

Janet
-- 
Janet Rosenbaum					 jerosenb at fas.harvard.edu
Harvard Injury Control Research Center,   Harvard School of Public Health



