[R] Memory error

Andrew Perrin clists at perrin.socsci.unc.edu
Thu Mar 8 15:37:40 CET 2007


Greetings-

Running R 2.4.0 under Debian Linux, I am getting a memory error trying to 
read a very large file:

> library(foreign)
> oldgrades.df <- read.spss('Individual grades with AI (Nov 7 2006).sav',to.data.frame=TRUE)
Error: cannot allocate vector of size 10826 Kb
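For completeness, the next thing I was planning to try — skipping the
data-frame conversion, so read.spss returns a list of vectors instead,
which I understand may lower the peak memory needed during the read
(I have not confirmed that) — would be something like:

```r
library(foreign)

# Untested workaround: keep the result as a list rather than
# converting to a data frame (to.data.frame = FALSE is in fact
# the function's default).
oldgrades <- read.spss('Individual grades with AI (Nov 7 2006).sav',
                       to.data.frame = FALSE)
```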


This file is, granted, quite large:

aperrin at perrin:/data0/grading$ ls -l
total 630304
-r-xr-xr-x 1 aperrin aperrin 271210015 2007-03-06 15:54 Individual grades with AI (Mar 2 2007).sav
-r-xr-xr-x 1 aperrin aperrin 353209140 2007-03-06 15:57 Individual grades with AI (Nov 7 2006).sav


...but there ought to be plenty of resources. The machine is a dual-Xeon 
2.8 GHz with 6 GB of RAM and enormous swap. It's doing almost nothing else 
when I try the load, and at the moment it returned the error, this was the 
status of top:

Mem:   6750980k total,  4668388k used,  2082592k free,   141820k buffers
Swap: 19535032k total,        8k used, 19535024k free,   749244k cached

   PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
  6168 aperrin   25   0 3015m 2.9g 2880 R  100 45.6   7:52.93 R



Background info:
aperrin at perrin:~$ uname -a
Linux perrin 2.6.18 #1 SMP Tue Feb 6 14:20:44 EST 2007 i686 GNU/Linux

aperrin at perrin:~$ R --version
R version 2.4.0 Patched (2006-11-25 r39997)
Copyright (C) 2006 The R Foundation for Statistical Computing
ISBN 3-900051-07-0



Any thoughts? I would be happy to compile R locally if that would help.

Andy


----------------------------------------------------------------------
Andrew J Perrin - andrew_perrin (at) unc.edu - http://perrin.socsci.unc.edu
Assistant Professor of Sociology; Book Review Editor, _Social Forces_
University of North Carolina - CB#3210, Chapel Hill, NC 27599-3210 USA
New Book: http://www.press.uchicago.edu/cgi-bin/hfs.cgi/00/178592.ctl
