[R] memory tops out at 1.84gb on OS X 10.4 machine w/ 5GB ram

David Ruau druau at ukaachen.de
Thu Dec 15 10:40:04 CET 2005


Hi,
I don't know why this happens, but here is a possible workaround:
load the file sequentially. Split the text file into 2 or 3 pieces,
read each piece, and re-assemble the vectors/lists in R afterwards
(see the sketch below). I once used a similar technique to write a
huge matrix to a text file.
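
Something along these lines should do it (just a rough sketch: the
file name and chunk size below are made up, and you may need extra
scan() arguments such as sep= to match your data):

  path  <- "bigfile.txt"   # the large text file (hypothetical name)
  chunk <- 500000          # lines to read per pass; pick what fits in RAM

  pieces <- list()
  skip <- 0
  repeat {
      x <- scan(path, what = character(), skip = skip,
                nlines = chunk, quiet = TRUE)
      if (length(x) == 0) break          # nothing left to read
      pieces[[length(pieces) + 1]] <- x
      skip <- skip + chunk
  }
  result <- unlist(pieces)               # re-assemble one character vector

Note that each pass re-reads the skipped lines from the start of the
file, so physically splitting the file first (e.g. with the Unix
split command) and reading the pieces one by one may be faster. For
the writing case I mentioned, write.table() with append = TRUE on
blocks of rows is the same idea.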

David

On Dec 14, 2005, at 21:47, Ken Termiso wrote:

> Hi all,
>
> Sorry if this is a dumb question, but I am on 10.4 with R 2.2, and when
> loading a big text file (~500MB) with scan(file, what=character()) I am
> getting malloc errors saying I am out of memory... I have 5GB on this
> machine, and Activity Monitor tells me R is only up to ~1.84GB both
> times this has happened (running from the terminal)...
>
> I am wondering why this is happening when I still have >2GB of free
> memory waiting to be used...?
>
> Any advice would be much obliged,
> Ken
>
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! 
> http://www.R-project.org/posting-guide.html
>



