[Rd] Interpreting R memory profiling statistics from Rprof() and gc()

Joy joyousjoyyy at gmail.com
Thu May 18 18:54:35 CEST 2017


Sorry, this might be a really basic question, but I'm trying to interpret
the results from memory profiling, and I have a few questions (marked by
*Q#*).

From the summaryRprof() documentation, it seems that the four columns of
statistics that are reported when setting memory.profiling=TRUE are
- vector memory in small blocks on the R heap
- vector memory in large blocks (from malloc)
- memory in nodes on the R heap
- number of calls to the internal function duplicate in the time interval
(*Q1:* Are the units of the first 3 stats in bytes?)
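
For concreteness, this is roughly how I'm collecting those statistics (a
minimal sketch; the interval and the column names vsize.small, vsize.large,
nodes and duplications are just what I see on my machine):

Rprof("prof.out", memory.profiling = TRUE, interval = 0.01)
x <- lapply(1:500, function(i) sort(runif(2e5)))  # some allocating workload
Rprof(NULL)

## one row per sampling interval; on my machine the columns are named
## vsize.small, vsize.large, nodes and duplications
mem <- summaryRprof("prof.out", memory = "tseries")
head(mem)
summaryRprof("prof.out", memory = "stats")  # the same stats aggregated by call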

and from the gc() documentation, the two rows represent
- "Ncells" (cons cells), usually 28 bytes each on 32-bit systems and 56
bytes on 64-bit systems
- "Vcells" (vector cells, 8 bytes each)
(*Q2:* How are Ncells and Vcells related to the small heap / large heap /
memory in nodes?)
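
To make Q2 concrete, this is the kind of arithmetic I've been doing with
gc()'s output, assuming the per-cell sizes quoted above apply on my 64-bit
machine:

g <- gc()
g
## crude conversion of the "used" cell counts to bytes, using the
## 56-byte node and 8-byte vector cell sizes from the documentation
g["Ncells", "used"] * 56 + g["Vcells", "used"] * 8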

And I guess the question that led to these others is - *Q3:* I'd like to
plot the total amount of memory used over time, and I don't think
Rprofmem() gives me what I need because, as I understand it, Rprofmem()
records the amount of memory allocated with each call, but that doesn't
tell me the total amount of memory R is using. Or am I mistaken?
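
To show what I'm after, this is the sort of plot I've been trying to make
from the Rprof() output in the sketch above - it only makes sense if (per
Q1) those columns really are byte counts:

mem <- summaryRprof("prof.out", memory = "tseries")
## if vsize.small, vsize.large and nodes are all in bytes, their sum per
## sample would be (roughly) the memory in use at that point in time
total <- rowSums(mem[, c("vsize.small", "vsize.large", "nodes")])
plot(seq_along(total) * 0.01, total / 2^20, type = "l",   # 0.01 s interval
     xlab = "time (s)", ylab = "memory (MB)")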

Thanks in advance!

Joy
