[R] problem in reading large files

Duncan Murdoch murdoch at stats.uwo.ca
Sat Aug 12 04:08:08 CEST 2006


On 8/11/2006 8:50 PM, T Mu wrote:
> I was trying to read a large .csv file (80 columns, 400,000 rows, about
> 200 MB). I used scan(), R 2.3.1 on Windows XP. My computer is an AMD
> 2000+ with 512 MB of RAM.

You should get R-patched; some bugs in low-memory handling were fixed 
recently:

From CHANGES:

R could crash when very low on memory. (PR#8981)

You should also get more physical memory.  512 MB is not much for 
handling 200 MB of data.  You can fairly easily benefit from increasing 
it to 2 GB, and will benefit (with some work) from even more, up to 
4 GB.
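
Import itself can also be made less memory-hungry by declaring the 
column types up front, so R does not have to guess them and make extra 
passes over the data.  A minimal sketch, assuming a file "bigfile.csv" 
with a header row and 80 numeric columns (the file name and types here 
are assumptions):

  ## read.csv: colClasses and nrows let R allocate storage once
  dat <- read.csv("bigfile.csv", nrows = 400000,
                  colClasses = rep("numeric", 80))

  ## scan: a list of 80 numeric templates in 'what' does the same;
  ## skip = 1 steps over the header line
  cols <- scan("bigfile.csv", sep = ",", skip = 1,
               what = rep(list(numeric(0)), 80))
  names(cols) <- paste("V", 1:80, sep = "")
  dat <- as.data.frame(cols)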

Duncan Murdoch

> 
> It sometimes freezes my PC, and sometimes just shuts down R quietly.
> 
> Is there a way (option, function) to better handle large files?
> 
> SAS can seemingly deal with it with no problem, but I just persuaded my
> professor to switch to R, so this is quite disappointing.
> 
> Please help, thank you.
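
On the quoted question of handling files that will not fit comfortably 
in memory at all, one common workaround is to read the file in blocks 
through a connection and process each block as it arrives, keeping only 
the results.  A sketch, with the chunk size, column types and the 
processing step all placeholders:

  con <- file("bigfile.csv", open = "r")
  header <- readLines(con, n = 1)      # set the header line aside
  repeat {
      block <- tryCatch(read.csv(con, header = FALSE, nrows = 50000,
                                 colClasses = rep("numeric", 80)),
                        error = function(e) NULL)  # input exhausted
      if (is.null(block)) break
      ## ... summarise or store 'block' here instead of keeping it all ...
      if (nrow(block) < 50000) break   # last, short block
  }
  close(con)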


