[R] Improving efficiency - better table()?

Simon Cullen cullens at tcd.ie
Tue Jul 6 18:11:38 CEST 2004

On Tue, 6 Jul 2004 09:02:26 -0400, Liaw, Andy <andy_liaw at merck.com> wrote:

> Since you didn't provide an example of what z.trun and breaks may look
> like, most people can only guess.  Before asking how code can be made
> more efficient, it might be more helpful to find out where in the code
> the time is being spent.  Try:
> Rprof()
> obs <- table(cut2(z.trun, cuts=breaks))
> Rprof(NULL)
> summaryRprof()

Thanks, Andy. That helped to clear up some of my confusion. I have now
eliminated the calls to cut2() and table() and replaced them with hist(),
as suggested by Roger Peng.
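For reference, the replacement looks roughly like this. The original z.trun and breaks were not posted, so the objects below are illustrative stand-ins:

```r
# Stand-ins for the poster's z.trun and breaks (not the original data)
z.trun <- rnorm(1000)
breaks <- seq(min(z.trun), max(z.trun), length.out = 11)

# table(cut2(z.trun, cuts = breaks)) builds a factor and then tabulates it;
# hist() with plot = FALSE bins directly and just returns the counts
obs <- hist(z.trun, breaks = breaks, plot = FALSE)$counts
```

With the default include.lowest = TRUE, every observation falls in exactly one cell, so sum(obs) equals length(z.trun).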

However, I had changed much more code than I had initially realised, and it
seems that the other changes are having a larger effect. I've attached the
output of an experiment (a power test with 1000 iterations; code included),
and it appears that the bottleneck is computing the expected number of
observations in each cell. Since the density I am working with is not
standard, I have to integrate it numerically to get these.
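One way to get the expected counts is to integrate the density over each cell once, outside the simulation loop. This is only a sketch: dens() below stands in for the non-standard density (a normal density is used purely for illustration), and the breaks are hypothetical:

```r
# dens() is a placeholder for the poster's non-standard density
dens <- function(x) dnorm(x)

breaks <- seq(-4, 4, length.out = 11)  # hypothetical cell boundaries
n <- 1000                              # total number of observations

# Integrate the density over each cell; integrate() does adaptive
# quadrature, so no closed-form CDF is needed
cell.probs <- vapply(
  seq_len(length(breaks) - 1L),
  function(i) integrate(dens, breaks[i], breaks[i + 1L])$value,
  numeric(1)
)
expected <- n * cell.probs
```

Since the expected counts depend only on the breaks and the density, they can be computed once before the 1000 iterations rather than inside them.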

I know that using a for() loop is slow, but the problem didn't seem to lend
itself to vectorisation (I thought). Any help would be appreciated.
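If a CDF for the distribution is available (or is computed once by cumulative integration), the per-cell loop collapses to a single vectorised diff() over the breakpoints. Here pnorm() stands in for the actual CDF:

```r
# pnorm() is a placeholder for the CDF of the poster's distribution
breaks <- seq(-4, 4, length.out = 11)  # hypothetical cell boundaries
n <- 1000

# P(cell i) = F(b[i+1]) - F(b[i]); diff() gives all cells at once
expected <- n * diff(pnorm(breaks))
```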


Simon Cullen
Room 3030
Dept. Of Economics
Trinity College Dublin

Ph. (608)3477
Email cullens at tcd.ie
