[Rd] Missing objects using dump.frames for post-mortem debugging of crashed batch jobs. Bug or gap in documentation?

nospam at altfeld-im.de nospam at altfeld-im.de
Sun Nov 13 13:11:38 CET 2016

Dear R friends,

to allow post-mortem debugging in my Rscript-based batch jobs I use

   tryCatch( <R expression>,
          error = function(e)
            dump.frames(to.file = TRUE))

to write the called frames into a dump file.

This is similar to the method recommended in the "Writing R Extensions"
manual, section 4.2 "Debugging R code" (page 96):


> options(error = quote({dump.frames(to.file=TRUE); q()}))

When I later load the dump in a new R session to examine the error, I use

    load(file = "last.dump.rda")

My problem is that the global objects in the workspace are NOT contained
in the dump since "dump.frames" does not save the workspace.

This makes debugging difficult.
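For concreteness, examining such a dump in a fresh session typically looks like this (a sketch; `debugger()` is in the utils package):

```r
## In a new R session, in the working directory of the crashed job:
load(file = "last.dump.rda")   # restores the object 'last.dump'
utils::debugger(last.dump)     # browse the dumped frames interactively
## Inside the browser you can inspect the local variables of each frame,
## but global objects referenced by the failing code are missing, because
## dump.frames(to.file = TRUE) saved only the call-stack environments,
## not the global workspace.
```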

For more details, see the Stack Overflow question and answer at:

I think the reason for the problem is this:

If you use dump.frames(to.file = FALSE) in an interactive session,
debugging works as expected because it creates a global variable called
"last.dump" and the workspace is still loaded.

In the batch-job scenario, however, the workspace is NOT saved in the dump
and is therefore lost if you debug the dump in a new session.

Options to solve the issue:

1. Improve the documentation in the R help page for "dump.frames" and
   the Writing R Extensions manual to propose another code snippet for
   batch-job scenarios:

      save.image(file = "last.dump.rda")

2. Change the semantics of "dump.frames(to.file = TRUE)" to include
   the workspace in the dump.
   This would change the semantics implied by the function name,
   but it would make the behavior consistent for both values of the
   "to.file" parameter.
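A minimal sketch of the handler proposed in option 1 (the job body `my_batch_job()` is a hypothetical placeholder): calling dump.frames() without to.file = TRUE creates "last.dump" in the workspace, and save.image() then writes the workspace, including last.dump, to a single file:

```r
## Hypothetical batch job; my_batch_job() stands in for the real work.
tryCatch(
  my_batch_job(),
  error = function(e) {
    dump.frames()                       # creates 'last.dump' in the workspace
    save.image(file = "last.dump.rda")  # saves workspace incl. last.dump
  }
)
```

Loading "last.dump.rda" in a new session then restores both the dumped frames and the global objects, so debugger(last.dump) can resolve them.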
