[R] memory problem on Suse

Ambrosi Alessandro ambrosi.alessandro at hsr.it
Wed Dec 16 14:02:54 CET 2009


Dear all and dear Marc, it seems you hit the target.
I checked as you suggested, and... it is a 32 bit version! 
Now I'm fixing it. Thank you very much.
Alessandro

________________________________________
From: Marc Schwartz [marc_schwartz at me.com]
Sent: 11 December 2009 17:02
To: Ambrosi Alessandro
Cc: r-help at r-project.org
Subject: Re: [R] memory problem on Suse

On Dec 11, 2009, at 6:24 AM, Ambrosi Alessandro wrote:

>
> Dear all, I am running into some problems with memory allocation. I
> know it is an old issue, I'm sorry.
> I looked for a solution in the FAQs, the manuals and the mailing list,
> but without finding a working answer.
> I really hope you can help me.
> For instance, if I try to read microarray data I get:
>
>> mab=ReadAffy(cdfname="hgu133plus2cdf")
> Error: cannot allocate vector of size 858.0 Mb
>>
>
> I get similar errors with smaller objects, smaller data sets or
> other procedures
> ("Error: cannot allocate vector of size 123.0 Mb").
> I'm running R under Suse 11.1 Linux, on a machine with two Xeon
> processors (8 cores) and 32 GB RAM.
> I suppose I have enough resources to manage these objects and data
> files....
>
> Any suggestions or hints will be really appreciated!
> Many thanks in advance.
> Alessandro

Well, you are running into a situation where R cannot find a contiguous
chunk of RAM of the size referenced to allocate to the vector.
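
As a quick aside, a minimal sketch of how to look at memory use from
within an R session (the object 'x' below is purely illustrative, not
from the original exchange):

   gc()                                 # report current memory use and run the garbage collector
   x <- rnorm(1e6)                      # a made-up example object
   print(object.size(x), units = "Mb")  # size of that one object in Mb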

Presuming that you are running a 64 bit version of SUSE (what does
'uname -a' show in a system console?), you should also check that you
are running a 64 bit version of R. What does:

   .Machine$sizeof.pointer

show?

If it returns 4, then you are running a 32 bit version of R, which
cannot take advantage of your 64 bit platform. You should install a 64
bit version of R.
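
Putting those checks together, a minimal sketch run from within R (the
values in the comments are what a 64 bit setup would typically show):

   8 * .Machine$sizeof.pointer    # 64 on a 64 bit build of R, 32 on a 32 bit build
   R.version$arch                 # e.g. "x86_64" for a 64 bit build of R
   Sys.info()[["machine"]]        # e.g. "x86_64" if the kernel itself is 64 bit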

HTH,

Marc Schwartz


