[R] ?summaryRprof running at 100% cpu for one hour ...

Mike Marchywka marchywka at hotmail.com
Tue Nov 23 00:17:30 CET 2010

----------------------------------------
> Date: Mon, 22 Nov 2010 19:59:04 -0300
> Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
> From: kjetilbrinchmannhalvorsen at gmail.com
> To: marchywka at hotmail.com
> CC: ligges at statistik.tu-dortmund.de; r-help at r-project.org
>
> see below.
>
> On Mon, Nov 22, 2010 at 12:57 PM, Mike Marchywka wrote:
> >
> >
> >
> >
> >
> >
> >
> >
> > ----------------------------------------
> >> Date: Mon, 22 Nov 2010 12:41:06 -0300
> >> Subject: Re: [R] ?summaryRprof running at 100% cpu for one hour ...
> >> From: kjetilbrinchmannhalvorsen at gmail.com
> >> To: marchywka at hotmail.com
> >> CC: ligges at statistik.tu-dortmund.de; r-help at r-project.org
> >>
> >> see below.
> >>
> >> On Mon, Nov 22, 2010 at 12:13 PM, Mike Marchywka wrote:
> >> >
> >> >
> >> Thanks. Will try. Really, I tried yesterday to run R under gdb within
> >> emacs, but it didn't work out. What I did (in emacs 23) was type
> >> Ctrl-u M-x R
> >> and then enter the option
> >> --debugger=gdb
> >>
[[elided Hotmail spam]]
> >>
> >> Kjetil
> >
> > I rarely use gdb, but it did seem to work with R; I executed gdb from a
> > cygwin window and IIRC Ctrl-C worked fine, as it broke into the debugger.
> > I guess you could try that: start gdb and attach, or invoke R from gdb.
> >
> >
>
> OK, thanks. I started R with
> R --debugger=gdb
> in a shell, outside emacs. Then it works.
>
> I did some unsystematic sampling with Ctrl-C. Most of the time it was stuck
> in memory.c, apparently doing garbage collection.
> Other files that came up were unique.c and duplicate.c.
>

You may want to try the R-devel list for better help now, but
presumably you can get debug symbols somewhere and a readable
stack trace. I guess churning in memory management would be
consistent with high CPU usage, since as far as the OS is
concerned the process is still runnable. In Java you see this
kind of thing when lots of temporary objects get created. If it
is gc, making lots of garbage and then needing a big contiguous
area could slow things down a lot.
Once you are pretty sure you have stopped it in a hotspot, you can
try stepping in and out of things and see if anything looks odd
(see the rough gdb session below).
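
Something like this is what I have in mind for the sampling loop;
it is just a sketch assuming a standard gdb, not output from your
run:

    $ R --debugger=gdb
    (gdb) run            # starts R; reproduce the slow call at the R prompt
    ^C                   # Ctrl-C while it is busy drops you back into gdb
    (gdb) bt             # print the full stack trace
    (gdb) frame 2        # inspect a particular frame, e.g. one in memory.c
    (gdb) finish         # run until the current function returns (step out)
    (gdb) continue       # let R keep going, then Ctrl-C again to resample

Repeating the Ctrl-C / bt cycle a handful of times is a poor man's
profiler: whichever frames keep showing up are where the time goes.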

One other exploratory thing to try, which may or may not work
for your problem in R, is to get a snapshot of the process memory
and then run a utility like "strings" over it to see if there is
any indication of what is going on. If objects are annotated at
all, something may jump out, but it is hard to know in advance
(something along the lines of the commands below).
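
A minimal sketch of that, assuming a Linux box with gcore and
strings available; the pid here is made up for illustration:

    $ gcore 12345        # 12345 = pid of the busy R process; writes core.12345
    $ strings core.12345 | sort | uniq -c | sort -rn | head -20
                         # most frequent strings in the image; repeated symbol
                         # or object names may hint at what is piling up

You can also issue gcore from inside the gdb session you already
have attached ((gdb) gcore) if R is stopped there.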


> kjetil
>
>

