[R] Too large a data set to be handled by R?
thjwong at gmail.com
Wed May 20 05:59:09 CEST 2009
Dear R users,
I have been extracting data dynamically from the raw files, but that
strategy takes a very long time. To save time, I am planning to
generate a data set of size 1500 x 20000, with each data point a
9-digit decimal number.
I know R's vectors are limited to 2^31 - 1 elements, and my data set
will not exceed that limit. But my laptop has only 2 GB of RAM and
runs 32-bit Windows (XP or Vista).
I have run into R memory problems before. Please let me know your
opinion based on your experience.
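As a rough back-of-envelope check (this assumes the values are stored as ordinary numeric doubles, 8 bytes each), the matrix itself should fit comfortably in 2 GB:

```r
## Estimated memory footprint of the planned 1500 x 20000 numeric matrix
n_rows <- 1500
n_cols <- 20000
bytes  <- n_rows * n_cols * 8    # 8 bytes per double
mb     <- bytes / 1024^2         # ~228.9 MB for the data alone
print(mb)
```

Note that R frequently copies objects during manipulation, so peak usage can be a small multiple of this figure; even so, roughly 230 MB of raw data leaves headroom within the address space available to 32-bit Windows R.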
Thanks a lot!