[R] loop is going to take 26 hours - needs to be quicker!

Duncan Murdoch murdoch at stats.uwo.ca
Thu Dec 14 14:17:24 CET 2006


On 12/14/2006 7:56 AM, Jenny Barnes wrote:
> Dear R-help,
> 
> I have a loop, which is set to take about 26 hours to run at the rate it's going 
> - this is ridiculous and I really need your help to find a more efficient way of 
> loading up my array gpcc.array:
> 
> #My data is stored in a table format with all the data in one long column, 
> #running through every longitude, for every latitude, for every year. The 
> #original data is stored as gpcc.data2, where dim(gpcc.data2) = [476928,5] and 
> #the 5th column is the data:
> 
> #make the array in the format I need [longitude,latitude,years]
> 
> gpcc.array <- array(NA, c(144,72,46)) 
> 
> n <- 0
> for(k in 1:46){
>   for(j in 1:72){
>     for(i in 1:144){
>       n <- n + 1
>       gpcc.array[i,j,k] <- gpcc.data2[n,5]
>       print(j)
>     }
>   }
> }
> 
> So it runs through all the longitudes for every latitude for every year - which is the 
> order the data runs down the column in gpcc.data2 - so n increases by 1 each 
> time and each data point is pulled off....
> 
> It needs to be a lot quicker, I'd appreciate any ideas!

I think the loop above is equivalent to

gpcc.array <- array(gpcc.data2[,5], c(144, 72, 46))

which would certainly be a lot quicker.  You should check that the 
values are loaded in the right order (probably on a smaller example!). 
If not, you should change the order of indices when you create the 
array, and use the aperm() function to get them the way you want afterwards.
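
For example, here is a small sketch of the kind of check I mean.  The 
dimension sizes and the fake data below are made up purely for 
illustration (they are not your real gpcc.data2), but the ordering is 
the one you describe: longitude varying fastest, then latitude, then year.

## 4 longitudes x 3 latitudes x 2 "years", one long data column
nlon <- 4; nlat <- 3; nyr <- 2
fake <- data.frame(lon   = rep(1:nlon, times = nlat * nyr),
                   lat   = rep(rep(1:nlat, each = nlon), times = nyr),
                   year  = rep(1:nyr, each = nlon * nlat),
                   junk  = NA,
                   value = rnorm(nlon * nlat * nyr))

## array() fills with the first dimension varying fastest, which matches
## that storage order, so the 5th column can go straight in:
a <- array(fake[, 5], c(nlon, nlat, nyr))

## spot-check one entry against the original table
a[2, 3, 1] == fake$value[fake$lon == 2 & fake$lat == 3 & fake$year == 1]

## If the column had instead been stored with latitude varying fastest,
## you would fill the array with the dimensions in storage order and
## then permute them with aperm():
fake2 <- fake[order(fake$year, fake$lon, fake$lat), ]  # lat now fastest
b  <- array(fake2[, 5], c(nlat, nlon, nyr))  # dims in storage order
b2 <- aperm(b, c(2, 1, 3))                   # reorder to [lon, lat, year]
identical(a, b2)                             # should be TRUE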

Duncan Murdoch
