[R] Running out of memory when importing SPSS files
dobomode at gmail.com
Thu Feb 19 05:39:02 CET 2009
I found the culprit. I had a number of variables in the SPSS file that
were declared as long strings (255 characters wide). This seems to
force R into allocating the full 255 bytes for every value of those
variables, which eventually exhausted my memory.
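For anyone hitting the same problem, one workaround is to import with foreign::read.spss and then immediately drop or shrink the wide string columns. This is only a sketch: the file name is a placeholder, and it assumes the string variables are not actually needed in the analysis (or that they repeat few distinct values, so factors help).

```r
library(foreign)

## Import the .sav file as a data frame ("mydata.sav" is a placeholder;
## this still requires the data to fit in memory once).
dat <- read.spss("mydata.sav", to.data.frame = TRUE)

## Character columns are the ones read in from the wide string variables.
is.str <- sapply(dat, is.character)

## Either drop them outright...
dat.small <- dat[, !is.str, drop = FALSE]

## ...or convert them to factors, which stores each distinct string
## only once and keeps an integer code per row.
dat[is.str] <- lapply(dat[is.str], factor)
```

If the strings are mostly duplicates, the factor conversion alone can cut memory use substantially; otherwise dropping the columns is the safer bet.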
On Feb 18, 5:34 pm, Uwe Ligges <lig... at statistik.tu-dortmund.de> wrote:
> > Hello R-help,
> > I am trying to import a large dataset from SPSS into R. The SPSS file
> > is in .SAV format and is about 1GB in size. I use read.spss to import
> > the file and get an error saying that I have run out of memory. I am
> > on a MAC OS X 10.5 system with 4GB of RAM. Monitoring the R process
> > tells me that R runs out of memory when reaching about 3GB of RAM so I
> > suppose the remaining 1GB is used up by the OS.
> > Why would a 1GB SPSS file take up more than 3GB of memory in R?
> Because SPSS stores data in a compressed way?
> > Is it
> > perhaps because R is converting each SPSS column to a less memory-
> > efficient data type? In general, what is the best strategy to load
> > large datasets in R?
> Use a 64-bit version of R and have a sufficient amount of RAM in your system.
> Uwe Ligges
> > Thanks!
> > P.S.
> > I exported the SPSS .SAV file to .CSV and tried importing the comma
> > delimited file. Same result: the import was much slower but
> > eventually I ran out of memory again...
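On the CSV route: reading the file in chunks, with column types declared up front, keeps read.csv from holding the whole file as untyped text and lets you skip columns entirely. A sketch of that approach, assuming three columns with placeholder names and types ("mydata.csv" is also a placeholder):

```r
## Declared types for the columns in the file; "NULL" tells read.csv
## to skip that column entirely (names and types are placeholders).
types <- c(age = "numeric", group = "factor", comment = "NULL")

con <- file("mydata.csv", open = "r")
## The first call consumes the header line along with the first chunk.
dat <- read.csv(con, nrows = 10000, colClasses = types)
repeat {
  chunk <- tryCatch(
    read.csv(con, header = FALSE, nrows = 10000,
             col.names = names(types), colClasses = types),
    error = function(e) NULL)  # read.csv errors once the input is exhausted
  if (is.null(chunk)) break
  ## ...process or aggregate each chunk here instead of keeping it all...
}
close(con)
```

Because the connection stays open between calls, each read.csv picks up where the previous one stopped. If you only need summaries, aggregating per chunk avoids ever materialising the full dataset.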
> > ______________________________________________
> > R-h... at r-project.org mailing list
> > PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> > and provide commented, minimal, self-contained, reproducible code.