[R] Memory problem

jim holtman jholtman at gmail.com
Wed Apr 6 15:57:19 CEST 2016


You say it is "getting stored"; is this in memory or on disk?  How are you
processing the results of the 1,000 simulations?

So some more insight into the actual process would be useful.  For example:
how are the simulations being done, are the results kept in memory or
written out to a file, and what are you doing with the results at the end?


Jim Holtman
Data Munger Guru

What is the problem that you are trying to solve?
Tell me what you want to do, not how you want to do it.

On Wed, Apr 6, 2016 at 8:44 AM, Amelia Marsh <amelia_marsh08 at yahoo.com>
wrote:

> Dear Sir,
>
> Thanks for the guidance. Will check. And yes, at the end of each
> simulation, a large result is getting stored.
>
> Regards
>
> Amelia
>
>
> On Wednesday, 6 April 2016 5:48 PM, jim holtman <jholtman at gmail.com>
> wrote:
>
>
> It is hard to tell from the information that you have provided.  Do you
> have a list of the sizes of all the objects that you have in memory?  Are
> you releasing large objects at the end of each simulation run?  Are you
> using 'gc' to garbage collect any memory after deallocating objects?
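That per-iteration cleanup might look like the following sketch (the simulation function, its output, and the file layout are made up for illustration):

```r
# Hypothetical stand-in for the real simulation step.
run_one_simulation <- function(i) data.frame(id = 1:10, value = rnorm(10))

n_sims <- 3
out_dir <- tempdir()

for (i in seq_len(n_sims)) {
  sim_result <- run_one_simulation(i)
  # Persist the result to disk instead of accumulating it in memory.
  write.csv(sim_result, file.path(out_dir, sprintf("sim_%04d.csv", i)),
            row.names = FALSE)
  rm(sim_result)   # drop the reference to the large object
  gc()             # garbage-collect so the memory is actually reclaimed
}
```

With 1000 iterations, the peak memory footprint stays at roughly one simulation's worth of data rather than all of them.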
> Collect some additional information with a simple function like the one below:
>
> f_mem_stats <- function(memo = '') cat(memo, proc.time(), memory.size(), '\n')
>
>
> > f_mem_stats(2)
> 2 2.85 11.59 85444.93 NA NA 39.08
>
> This prints out whatever you pass in as a parameter, e.g., the iteration
> number, and then the amount of CPU time and memory used so far.  I use
> this all the time to keep track of resource consumption in long-running
> scripts.
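Note that memory.size() is Windows-only; a rough cross-platform variant of the same idea (an assumption, not part of Jim's original function) can lean on gc() instead:

```r
# Cross-platform variant: report elapsed CPU/wall time and memory in use.
# gc() returns a matrix whose second column is megabytes currently used
# (Ncells and Vcells rows); summing it gives total Mb in use.
f_mem_stats2 <- function(memo = '') {
  used_mb <- sum(gc()[, 2])
  cat(memo, proc.time()[1:3], used_mb, 'Mb\n')
}

f_mem_stats2('iter 1')
```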
>
>
> Jim Holtman
> Data Munger Guru
>
> What is the problem that you are trying to solve?
> Tell me what you want to do, not how you want to do it.
>
> On Wed, Apr 6, 2016 at 7:39 AM, Amelia Marsh via R-help <
> r-help at r-project.org> wrote:
>
> Dear R Forum,
>
> I have about 2000+ FX forward transactions and I am trying to run 1000
> simulations. If I use a smaller number of simulations, I am able to get
> the desired results. However, when I try to use more than 1000 simulations,
> I get the following error.
>
> > sorted2 <- ddply(sorted, .(currency_from_exch, id), mutate,
> change_in_mtm_bc = mtm_bc - mtm_bc[1])
>
> Error: cannot allocate vector of size 15.6 Mb
>
>
> In addition: Warning messages:
> 1: Reached total allocation of 3583Mb: see help(memory.size)
> 2: Reached total allocation of 3583Mb: see help(memory.size)
> 3: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 4: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 5: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 6: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 7: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
> 8: In output[[var]][rng] <- df[[var]] :
> Reached total allocation of 3583Mb: see help(memory.size)
>
>
> When I checked -
>
> > memory.size()
> [1] 846.83
> > memory.limit()
> [1] 3583
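As a point of reference, the grouped difference in the failing ddply call can also be computed with base R's ave(), which avoids building a full second copy of the data the way ddply/mutate does; the sketch below uses a toy stand-in for the real (unshared) data set:

```r
# Toy stand-in for the real data: same column names as in the ddply call.
sorted <- data.frame(
  currency_from_exch = rep(c("EUR", "USD"), each = 4),
  id     = rep(1:2, times = 4),
  mtm_bc = rnorm(8)
)

# ave() replicates each group's first mtm_bc across the group's rows,
# so subtracting it reproduces mtm_bc - mtm_bc[1] per group in place.
first_bc <- ave(sorted$mtm_bc, sorted$currency_from_exch, sorted$id,
                FUN = function(x) x[1])
sorted$change_in_mtm_bc <- sorted$mtm_bc - first_bc
```

The first row of each (currency_from_exch, id) group ends up with a change of exactly 0, matching the ddply/mutate semantics.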
>
>
> The code is a bit lengthy and unfortunately can't be shared.
>
> Kindly guide how this memory problem can be tackled? I am using R x64 3.2.0.
>
> Regards
>
> Amelia
>
> ______________________________________________
> R-help at r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
>



