[R] Error: cannot allocate vector of size 5.2 Gb

Pascal Oettli kridox at ymail.com
Fri Mar 21 01:02:14 CET 2014


Hello,

That is not the right way to read a NetCDF file (judging by the
extension) in R. Please have a look at the "ncdf4" package. The
"raster" package is also able to read this kind of file.
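A minimal sketch with "ncdf4" might look like this. Note that the variable name "DAT" and the dimension layout are assumptions, since we cannot see inside your file; list the real variable names with names(nc$var) first. Reading a subset via start/count is what keeps you under your 4 GB of RAM:

```r
library(ncdf4)

## Open the file; this reads only metadata, not the data itself
nc <- nc_open("DAT.dat.nc")

## Inspect which variables the file actually contains
print(names(nc$var))

## Hypothetical example: read only the first time step of a variable
## named "DAT" (replace with a name reported above). count = -1 means
## "all values along that dimension".
## v <- ncvar_get(nc, "DAT", start = c(1, 1, 1), count = c(-1, -1, 1))

nc_close(nc)
```

Alternatively, raster::brick("DAT.dat.nc") gives you a RasterBrick that also reads data from disk lazily rather than loading everything into memory at once.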

Regards,
Pascal

On Fri, Mar 21, 2014 at 1:25 AM, eliza botto <eliza_botto at hotmail.com> wrote:
> Dear R family,
> I am trying to read a really large dataset in R (~2 GB). It's in binary format. When I tried to read it using the following command
> readBin("DAT.dat.nc", numeric(), n=9e8, size=4, signed=TRUE, endian='little')
> I got the following error
> Error: cannot allocate vector of size 5.2 Gb
> I have 4 GB of RAM. I even tried to allocate more space with "memory.limit(size=9000000000000)", but to no avail.
> What do I do? Buy new RAM or act smart?
> Thank you very much in advance
> Eliza
>
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.



-- 
Pascal Oettli
Project Scientist
JAMSTEC
Yokohama, Japan
