exact test for large tables. Was: [R] unexpected R crash - again

kjetil halvorsen kjetilh at umsanet.edu.bo
Mon Aug 28 17:37:21 CEST 2000


Hola!

But this is already implemented in R; see ?chisq.test
(after library(ctest)).  It uses Patefield's (1981) algorithm,
which simulates the contingency table conditional on its
margins, and is very fast.
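
For example, a sketch in current R (where chisq.test's
simulate.p.value argument draws tables via r2dtable, an
implementation of Patefield's algorithm).  The table is the one from
Andy's post below; the all-zero row is dropped first, since it carries
no information and would make the chi-squared statistic undefined:

```r
# The 19x2 table from the original post.
T <- matrix(c(2, 2, 4, 8, 6, 0, 1, 1, 7, 8, 1, 3, 1, 3, 7, 4, 2, 2, 2,
              1, 1, 0, 0, 0, 0, 0, 1, 1, 2, 0, 1, 1, 0, 2, 1, 0, 0, 0),
            ncol = 2)
T <- T[rowSums(T) > 0, ]  # drop the all-zero row

# simulate.p.value = TRUE replaces the asymptotic reference distribution
# with B tables drawn conditional on the observed margins.
set.seed(1)
res <- chisq.test(T, simulate.p.value = TRUE, B = 10000)
res$p.value

# The same sampler is exposed directly as r2dtable():
sims <- r2dtable(5, rowSums(T), colSums(T))
```

Each element of sims is a random table with exactly the margins of T,
so no enumeration of the sample space is needed.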

Kjetil Halvorsen.

Yudi Pawitan wrote:
> 
> If you have an actual large table to analyse, rather
> than trying to solve the space problem, you may want
> to consider a Monte Carlo implementation
> of the exact test.  This is very easy to implement in R.  See,
> for example, Lange's Numerical Analysis for Statisticians,
> Section 21.7.
> 
> -Yudi-
> 
> At 04:10 PM 8/25/00 -0500, Andy Jaworski <apjaworski at mmm.com> wrote:
> >Sorry, but I lost this thread, so I am sending this as a new message.
> >
> >This is really a follow-up to a post from a couple days ago saying that
> >fisher.test from the ctest library crashed on the following data set:
> >> T
> >      [,1] [,2]
> > [1,]    2    1
> > [2,]    2    1
> > [3,]    4    0
> > [4,]    8    0
> > [5,]    6    0
> > [6,]    0    0
> > [7,]    1    0
> > [8,]    1    1
> > [9,]    7    1
> >[10,]    8    2
> >[11,]    1    0
> >[12,]    3    1
> >[13,]    1    1
> >[14,]    3    0
> >[15,]    7    2
> >[16,]    4    1
> >[17,]    2    0
> >[18,]    2    0
> >[19,]    2    0
> >
> >Then Professor Ripley responded that the problem was not related to the
> >Windows port (as the original author implied) but to the amount of memory R
> >was started with.
> >
> >I just ran some tests on this, and I have to report that I am getting
> >crashes every time I try this problem.
> >(1) On my WinNT machine I got up to
> >         Rgui.exe --vsize=220M --nsize=4000k
> >     which produces
> >> gc()
> >           free    total  (Mb)
> >Ncells  3857498  4000000  76.3
> >Vcells 28790374 28835840 220.0
> >
> >and it still crashes with the "referenced memory cannot be read" message.  I
> >also noticed that the error message comes immediately, no matter how large
> >the memory allocation is.  This machine is a Pentium Pro 200MHz with 192Mb
> >of memory.
> >
> >(2) On my Linux box I can only go to about --vsize=120M --nsize=1000k.  This
> >is a Pentium MMX 266MHz with 64Mb of memory.  The machine crashes after
> >"thinking" for about 3 seconds: it produces a segmentation violation and
> >dumps core.
> >
> >It seems to me that, unless I have some unlucky machines, the problem is not
> >just that a too-small memory allocation is handled ungracefully.
> >
> >Any comments will be appreciated,
> >
> >Andy
> >
> >
> >-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-.-
> >r-help mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
> >Send "info", "help", or "[un]subscribe"
> >(in the "body", not the subject !)  To: r-help-request at stat.math.ethz.ch
> >_._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._._
> >
> Yudi Pawitan     yudi at stat.ucc.ie
> Department of Statistics UCC
> Cork, Ireland
> Ph   353-21-490 2906
> Fax 353-21-427 1040