[R] Coarsening the Resolution of a Dataset

jim holtman jholtman at gmail.com
Fri Aug 1 11:33:19 CEST 2008


If you can reduce the size of your data by averaging, then you could
read in a subset of the rows, average the 6x6 blocks, and then write
them out for a second phase of processing.  The 2160x4320 object would
take up about 75MB if numeric (2160 * 4320 * 8 bytes), which is
probably 50% of your available memory if you are running on Windows
with 512MB.  I have 1GB, and if I have nothing else running, I have
about 650MB free after the OS is loaded and such.  On a 512MB machine,
that might leave about 140MB for R to use.  So if you can scale the
data down by averaging the 6x6 subsets, you will probably be much
better off in doing your analysis, assuming you don't lose too much
accuracy.  Do you need it all in memory at the same time?  Again, a
database might help if you can process subsets of the data.
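As a sketch of the block-averaging step (the function and object names here are illustrative, not from the original thread), each 6x6 block mean can be computed with base R's rowsum(), applied once over rows and once over columns, without any explicit loops:

```r
# Coarsen a matrix by averaging each fact x fact block.
# A 2160 x 4320 grid with fact = 6 would become 360 x 720.
coarsen <- function(m, fact = 6) {
  nr <- nrow(m) / fact
  nc <- ncol(m) / fact
  row.grp <- rep(seq_len(nr), each = fact)   # which coarse row each fine row belongs to
  col.grp <- rep(seq_len(nc), each = fact)   # likewise for columns
  out <- rowsum(m, row.grp)                  # sum every 'fact' consecutive rows
  out <- t(rowsum(t(out), col.grp))          # then every 'fact' consecutive columns
  out / (fact * fact)                        # block sums -> block means
}

# Small stand-in for the real 2160 x 4320 grid:
dat   <- matrix(rnorm(36 * 72), nrow = 36)
small <- coarsen(dat)                        # 6 x 12 result
```

For the full dataset this could be run on chunks of rows (a multiple of 6 at a time) read with scan() or read.table(), writing each coarsened chunk out before reading the next, so the whole matrix never has to be in memory at once.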

On Fri, Aug 1, 2008 at 5:10 AM, Steve Murray <smurray444 at hotmail.com> wrote:
>
> Hi Jim,
>
> Thanks for your advice. The problem is that I can't lose any of the data - it's a global dataset, where the left-most column = 180 degrees west, and the right-most is 180 degrees east. The top row is the North Pole and the bottom row is the South Pole.
>
> I've got 512MB RAM on the machine I'm using - which has been enough to deal with such datasets before...?
>
> I'm wondering, is there an alternative means of achieving this? Perhaps orientated via the desired output of the 'coarsened' dataset - my calculations suggest that the dataset would need to change from the current 2160 x 4320 dimensions to 360 x 720. Is there any way of doing this based on averages of blocks of rows/columns, for example?
>
> Many thanks again,
>
> Steve
>
>



-- 
Jim Holtman
Cincinnati, OH
+1 513 646 9390

What is the problem you are trying to solve?


