[R] R --nsize 2M runs havoc (under linux)

Martin Maechler maechler at stat.math.ethz.ch
Wed Oct 6 18:03:14 CEST 1999


>>>>> On 06 Oct 1999 15:05:43 +0200, Peter Dalgaard BSA <p.dalgaard at biostat.ku.dk> said:

    PD> Joerg Kindermann <Joerg.Kindermann at gmd.de> writes:
    >> Dear All,
    >> 
    >> I am running R version 0.65.0 under
    >> 
    >> a) SuSE Linux 6.1 and 6.2, compiler gcc 2.95, CPUs
    >> Pentium Pro 200 (128 MB) and Pentium II 450 (128 MB)
    >> b) Solaris 5.7, compiler gcc 2.95, CPU Sun SPARC, 4000 MB
    >> 
    >> When I set --nsize to more than 1M, R's internal storage management
    >> runs amok. gc() reports the requested sizes, but the overall
    >> process size is much too big: running R with --vsize 10M --nsize 3M,
    >> for example, results in a process size of 63.276 MB! Using such
    >> an R process sooner or later leads to a segmentation fault,
    >> usually inside R's storage allocation routine. I cannot
    >> reproduce this strange behavior under Solaris, however.

I can reproduce that R starts swapping enormously when
--nsize is too large, even on Solaris.
However, I don't find this astonishing for a program that needs
about all of the freely available "real" memory on the machine;
see below.

[In the meantime, I've read Martyn Plummer's excellent answer.
 I am still sending this off -- there are interesting things at the end!]

    PD> Er, I get 63 thousand and something too, but that is in
    PD> *Kilo*bytes...  Are you sure you didn't misread the output of
    PD> 'top'??

63 thousand kilobytes is about the same as the reported
63.276 MB -- assuming Joerg used the dot "." as a decimal point --
so the two numbers agree and nothing was misread.
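
A one-line unit check of that claim, in R itself:

    63276 / 1000    # = 63.276 : "63 thousand KB" with 1 MB = 1000 KB
    63276 / 1024    # ~ 61.8   : with 1 MB = 1024 KB; ~63 MB either way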

Now what do we expect?
    help(Memory)  
starts with

  >> Memory Available for Data Storage
  >> 
  >>      R --vsize v --nsize n
  >> 
  >> Arguments:
  >> 
  >>        v: Use `v' bytes of heap memory
  >> 
  >>        n: Use `n' cons cells.

and tells you later that each Ncell needs 16 bytes (which is what
?gc says too); i.e., --vsize 10M --nsize 3M
should need about 10 MB + 3M * 16 bytes = 58 MB, plus the memory of
the "R engine" itself.
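
As a back-of-the-envelope check, the same calculation in R (the helper
function and its 16-bytes-per-cell default are my own sketch, taken
from ?Memory, not anything built into R):

    ## expected data-storage footprint in MB, ignoring the fixed
    ## overhead of the R binary itself
    expected.MB <- function(vsize.MB, nsize.Mcells, bytes.per.cell = 16)
        vsize.MB + nsize.Mcells * bytes.per.cell

    expected.MB(10, 3)    # 58  (= 10 + 3 * 16)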

However, here is the result of an experiment
        [SunOS 5.5.1 Generic_103640-08 sun4u sparc SUNW,Ultra-2]

                                  SZ     RSS   (both in KB)
  --vsize 10M --nsize 4M       96936   84656
  --vsize 10M --nsize 3M       76456   64192
  --vsize  9M --nsize 3M       75432   64192
  --vsize  9M --nsize 2M       54952   43712
  --vsize  9M --nsize 1M       34472   23232
  --vsize  9M --nsize 1024K    34472   23232
  --vsize  9M --nsize  800K    29992   18752
  --vsize  9M --nsize  700K    27992   16752
  --vsize  8M --nsize  700K    26968   16752
  --vsize  5M --nsize  700K    23896   16752
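
The conclusions below can be read off the table by differencing rows
that change only one parameter; a small R verification (the data are
just the SZ column above, the variable names are mine):

    sz.KB <- c(34472, 54952, 75432)    # --vsize 9M; nsize = 1M, 2M, 3M
    diff(sz.KB) * 1024 / (1024^2)      # -> 20 20 : bytes per cons cell

    ## varying vsize instead: --nsize 3M at vsize 9M and 10M
    (76456 - 75432) * 1024 / (1024^2)  # -> 1 : one byte of SZ per
                                       #        byte of vsize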

Which indicates:
1) --vsize uses byte units, as ?Memory says;
2) --nsize is in `Ncell' units, which now seem to take
   20 bytes each instead of the documented 16 --

   which *is* 25% more -- why??

-- 
Martin Maechler <maechler at stat.math.ethz.ch>	http://stat.ethz.ch/~maechler/
Seminar fuer Statistik, ETH-Zentrum  LEO D10	Leonhardstr. 27
ETH (Federal Inst. Technology)	8092 Zurich	SWITZERLAND
phone: x-41-1-632-3408		fax: ...-1228			<><


