[Rd] strange apparently data-dependent crash with large data (PR#6955)

Duncan Murdoch dmurdoch at pair.com
Mon Jun 7 20:14:57 CEST 2004


On Mon,  7 Jun 2004 18:59:27 +0200 (CEST), tplate at blackmesacapital.com
wrote:

>I'm consistently seeing R crash with a particular large data set.  What's 
>strange is that although the crash seems related to running out of memory, 
>I'm unable to construct a pseudo-random data set of the same size that also 
>causes the crash.  Further adding to the strangeness is that the crash only 
>happens if the dataset goes through a save()/load() cycle -- without that, 
>the command in question just gives an out-of-memory error, but does not crash.

This kind of error is very difficult to debug.  What's likely
happening is that in one case you run out of memory at a place where
the allocation is properly checked, and in the other you are hitting
some flaky code that assumes every memory allocation succeeds.
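
For illustration only (this is a minimal C sketch, not R's actual
allocation code), the two cases look something like this:

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>

  /* Correct: a failed allocation is detected and reported, so
     running out of memory gives a clean error, not a crash. */
  double *alloc_checked(size_t n)
  {
      double *p = malloc(n * sizeof(double));
      if (p == NULL) {
          fprintf(stderr, "cannot allocate %zu doubles\n", n);
          exit(EXIT_FAILURE);
      }
      return p;
  }

  /* Flaky: assumes malloc always succeeds.  When memory is
     exhausted, malloc returns NULL and memset writes through a
     NULL pointer -- a crash some distance from the real bug. */
  double *alloc_unchecked(size_t n)
  {
      double *p = malloc(n * sizeof(double));
      memset(p, 0, n * sizeof(double));  /* crashes if p == NULL */
      return p;
  }

  int main(void)
  {
      size_t n = (size_t)1 << 42;  /* a request likely to fail */
      alloc_checked(n);            /* clean out-of-memory error */
      /* alloc_unchecked(n) would likely crash instead. */
      return 0;
  }

Which of the two paths you hit depends on exactly where the
allocation fails, which is why something like a save()/load() cycle
can change the outcome.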

You could install DrMinGW (which produces a stack dump when R
crashes), but that's not necessarily informative:  often the crash
occurs some distance from the buggy code that caused it.

The other problem with this kind of error is that it may well
disappear if you run under a debugger, since the debugger will make
you run out of memory at a different spot, and for the same reason it
may not appear at all on a different machine.  For example, I ran
your examples and they all failed because R ran out of memory, but
none crashed.

Duncan Murdoch


