[R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

John jwd sending from surewest.net
Thu Sep 3 03:32:42 CEST 2020


On Wed, 2 Sep 2020 16:31:53 -0500
David Jones <david.tn.jones using gmail.com> wrote:

> Thank you Uwe, John, and Bert - this is very helpful context.
> 
> If it helps inform the discussion, to address John and Bert's
> questions - I actually had less memory free when I originally ran the
> analyses and saved the workspace than when I read the data back in
> later on (I rebooted in an attempt to free all possible memory before
> re-reading the workspace).
> 
I assumed that, though I shouldn't have.  Nice to know.  Were you
working from a terminal or through a GUI like RStudio?  You will need
to provide a really clear description of the initial and later
conditions.  Your step of rebooting and then loading is exactly what I
would have done; I would also have killed any network connection
temporarily to see whether something outside of R was causing the
problem.  You should also let any potential helper know what OS you
are using and what hardware configuration you have.  Since you
rebooted you are probably not working across a network, but ...
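
For what it's worth, a saved workspace is compressed (save() uses gzip
by default), so a 2 GB .RData file can easily expand to far more than
that once the objects are restored in memory.  Something along these
lines (just a rough sketch; "analysis.RData" is a placeholder for your
actual file) would give a helper most of the needed detail in one go:

    sessionInfo()                       # R version, OS, loaded packages
    file.size("analysis.RData") / 2^20  # compressed size on disk, in MB
    gc(reset = TRUE)                    # memory in use before loading
    loaded <- load("analysis.RData")    # names of the restored objects
    gc()                                # memory in use after loading
    # in-memory size of each restored object, largest first
    sort(sapply(mget(loaded), object.size), decreasing = TRUE)

Running that from a plain terminal R session would also keep any GUI
overhead out of the comparison.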

JWDougherty


