[R] Big matrix memory problem

Spencer Graves spencer.graves at pdf.com
Sat May 14 01:57:40 CEST 2005


	  S-Plus 7 advertises facilities for large data sets 
(http://www.insightful.com/products/splus/default.asp#largedata).  Their 
web site says this is done with a "New Pipeline Architecture" that 
"streams large data sets through available RAM instead of reading the 
entire data set into memory at once."  It also "includes a new data type 
for dealing with very large data objects".  If you want more detail than 
this, I suggest you post to the S-News list 
<s-news at lists.biostat.wustl.edu>; I haven't used S-Plus 7 myself.
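
	  For a sense of scale:  256 matrices of 1000x1000 doubles take 
about 256 * 8 MB, or roughly 2 GB, so they cannot all fit in 512 MB of 
RAM (or even under the 1024 MB cap) at once.  If whatever you want to 
compute can be accumulated one matrix at a time, you can stream them 
through R yourself.  A minimal sketch, assuming the matrices live in 
hypothetical whitespace-delimited files matrix_001.dat through 
matrix_256.dat, one matrix per file:

n.matrices <- 256
## each 1000x1000 matrix of doubles needs 1000 * 1000 * 8 bytes = 8 MB
total.gb <- n.matrices * 1000 * 1000 * 8 / 1024^3   # about 1.9 GB in all

files <- sprintf("matrix_%03d.dat", 1:n.matrices)   # hypothetical names
running.sum <- matrix(0, nrow = 1000, ncol = 1000)
for (f in files) {
  ## read one matrix, fold it into the accumulator, then discard it
  m <- matrix(scan(f, quiet = TRUE), nrow = 1000, ncol = 1000, byrow = TRUE)
  running.sum <- running.sum + m
  rm(m)
}

This keeps only one 8 MB matrix (plus the accumulator) in memory at a 
time; replace the running sum with whatever per-matrix computation you 
actually need.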

	  hope this helps.
	  spencer graves

Gabor Grothendieck wrote:
> On 5/13/05, s104100026 <n.d.fitzgerald at mars.ucc.ie> wrote:
> 
>>Hi All,
>>
>>I want to read 256 1000x1000 matrices into R.  I understand that it is
>>unlikely that I can do this, but in the hope that somebody can help me I am
>>mailing this list.
>>
>>I have tried increasing my memory limit (I understand that it is capped at
>>the minimum of 1024 MB and the computer's RAM, in my case 512 MB).
>>
>>Does anyone think this is possible in R?  Could it be tried in S-Plus, for
>>example?
>>
> 
> 
> If they are sparse, you could try the SparseM package.
> 
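	  To illustrate Gabor's suggestion:  if most entries are zero, 
SparseM's compressed sparse row class stores only the non-zero values, 
so the roughly 2 GB figure above shrinks to something proportional to 
the number of non-zeros.  A minimal sketch, assuming a toy matrix with 
0.5% non-zero entries:

library(SparseM)

set.seed(1)
m <- matrix(0, nrow = 1000, ncol = 1000)
m[sample(length(m), 5000)] <- rnorm(5000)   # 5000 non-zero entries (0.5%)

m.csr <- as.matrix.csr(m)   # compressed sparse row representation
print(object.size(m))       # about 8 MB for the dense copy
print(object.size(m.csr))   # far smaller, proportional to the non-zeros

Whether this helps depends entirely on whether the real matrices are in 
fact sparse.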
