[R] cannot allocate vector of size

jim holtman jholtman at gmail.com
Tue Nov 24 03:07:49 CET 2015


I am not sure how much the 2GB file might expand when you read it into R.  I
would suggest that you take a portion, e.g., 500MB, read it in, and see how
large the resulting object is in R.  Continue and process this smaller
subset to see how memory utilization changes.  This will give you an idea
of how much memory you might need.
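Something along these lines will tell you what one chunk costs in memory; it
is only a sketch, assuming a delimited text file, and the file name
("trajectory.csv") and the row count are placeholders:

## read only a portion of the file and measure the resulting object
chunk <- read.csv("trajectory.csv", nrows = 100000)
print(object.size(chunk), units = "Mb")

## report current memory use after a garbage collection
gc()

Multiplying the object size by (total rows in the file / rows read) gives a
rough estimate of how big the full data set would be in memory.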

If you fail reading in a large file, start cutting it in half to see what you
can get in and process.  Do you need all the columns that are in the data,
or can you remove some?  Can you sample the data and perform the
calculation on that subset to get a reasonable answer?  Can you put the data
in a database (e.g., SQLite) and then pull in selected records more easily
(see the sketch below)?  There are several other alternatives you might want
to consider, but at least find out what you can process on your available
system, and then you will know what some of the options are.  Get access to
a server with 32, 64, or 128GB of RAM to see whether you could process the
data the way you want on a larger system.
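For the SQLite route, something like the following might work; it is only a
sketch, assuming the DBI and RSQLite packages are installed, and the file,
table, and column names ("trajectory.csv", "traj", frame, x, y, z) are
placeholders for whatever your data actually contains:

library(DBI)
library(RSQLite)

con <- dbConnect(SQLite(), "trajectories.sqlite")

## load the big file into the database once (this could also be done in chunks)
dbWriteTable(con, "traj", read.csv("trajectory.csv"), overwrite = TRUE)

## afterwards pull in only the rows and columns you actually need
traj_sub <- dbGetQuery(con, "SELECT frame, x, y, z FROM traj WHERE frame <= 1000")

dbDisconnect(con)

That way only the selected subset ever has to fit in RAM at one time.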


Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.

On Mon, Nov 23, 2015 at 2:26 PM, Tamsila Parveen <tamsilap at yahoo.com> wrote:

> I already tried increasing virtual memory on Windows, but it didn't work.
> Actually I want to analyze MD simulation trajectories in R and am trying
> to generate cross-correlation and PCA plots, but R won't accept a file of
> 2GB, even though my system has 4GB of RAM.
>
>
>
> On Monday, 23 November 2015, 4:53, jim holtman <jholtman at gmail.com> wrote:
>
>
> My general rule of thumb is that I should have 3-4 times as much RAM as
> the largest object that I am working with.  So hopefully you have at least
> 4 GB of RAM on your system.  Also, exactly what processing (packages,
> functions, algorithms, etc.) are you doing?  Some functions may create
> multiple copies, or they may create temporary objects bigger than the
> original, so help us out and provide more information.  You might be able
> to add virtual memory, but this may slow down your processing quite a bit
> with paging.  If you do go this direction, then learn how to use the
> performance monitoring tools on your system to see what is happening.
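>
> As a small illustration of how copies show up (the matrix here is just an
> example), tracemem() in base R reports whenever an object is duplicated,
> and gc() shows the peak memory used so far:
>
> m <- matrix(rnorm(1e6), ncol = 10)
> tracemem(m)      # prints a message whenever R copies this object
> m2 <- m          # no copy yet: m2 shares memory with m
> m2[1, 1] <- 0    # modifying m2 forces a copy, which tracemem reports
> gc()             # the "max used" column shows peak memory so far
>
> On Windows, memory.size(max = TRUE) and memory.limit() also report the
> peak memory obtained from the OS and the current limit in MB.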
>
>
> Jim Holtman
> Data Munger Guru
>
> What is the problem that you are trying to solve?
> Tell me what you want to do, not how you want to do it.
>
> On Sun, Nov 22, 2015 at 10:08 AM, Tamsila Parveen via R-help <
> r-help at r-project.org> wrote:
>
> Hello,  Is there anyone who can help me resolve a memory issue in R?  When
> I want to analyze the data in a 1GB file, R returns the error: cannot
> allocate vector of size 1.8 Gb.  I tried on Linux as well as on Windows
> with a 64-bit system, using 64-bit R 3.2.2.  If anyone knows, please guide
> me on how to resolve this issue.
>
> ______________________________________________
> R-help at r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>
