[R] Gaussian frailty leads to segmentation fault
christian.lederer at imse.med.tu-muenchen.de
Wed Jul 28 18:38:01 CEST 2004
Dear R gurus,
for a simulation concerning study effects and historical controls
in survival analysis, I would like to experiment with a Gaussian
frailty model.
The simulated scenario consists of a randomized trial
(treatment and placebo) and historical controls (only placebo).
So the simulated data frames consist of four columns:
$time, $cens, $study, $treat.
$time and $cens are the usual survival data.
For the binary treatment indicator we have
$treat == 0 or 1 if $study == 1,
$treat == 1 if $study > 1.
Typical parameters for my simulations are:
sample sizes (per arm): between 100 and 200
number of historical studies: between 7 and 15
hazard ratio treatment/placebo: between 0.7 and 1
variance of the study effect: between 0 and 0.3
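To make the scenario above concrete, here is a minimal sketch of the data-generating process in Python (the post itself uses R; this is only an illustration). The baseline exponential hazard, the censoring time, and all parameter names are assumptions, not from the post; the column coding follows the post, where historical studies (study > 1) carry treat == 1.

```python
import math
import random

def simulate_trial(n_per_arm=150, n_hist_studies=10, hazard_ratio=0.85,
                   study_var=0.2, baseline_hazard=0.1, censor_time=20.0,
                   seed=1):
    """Simulate one data set with columns (time, cens, study, treat).

    Coding follows the post: study 1 is the randomized trial with
    treat in {0, 1}; historical (placebo-only) studies have treat == 1,
    so treat == 1 is read here as the placebo indicator -- an assumption.
    The exponential baseline hazard and censoring scheme are assumptions.
    """
    rng = random.Random(seed)
    rows = []

    def draw(study, treat, frailty):
        # Proportional hazards: placebo rate = h0 * exp(b),
        # treated rate = h0 * HR * exp(b), with study frailty b.
        treated = (treat == 0)
        rate = baseline_hazard * math.exp(frailty) * \
            (hazard_ratio if treated else 1.0)
        t = rng.expovariate(rate)
        cens = 1 if t <= censor_time else 0   # 1 = event observed
        rows.append((min(t, censor_time), cens, study, treat))

    # Randomized trial (study 1): treatment and placebo arms
    b = rng.gauss(0.0, math.sqrt(study_var))
    for _ in range(n_per_arm):
        draw(1, 0, b)   # treatment arm
        draw(1, 1, b)   # placebo arm

    # Historical controls (study > 1): treat coded 1, as in the post
    for s in range(2, n_hist_studies + 2):
        b = rng.gauss(0.0, math.sqrt(study_var))
        for _ in range(n_per_arm):
            draw(s, 1, b)

    return rows
```

Each study draws one Gaussian frailty value shared by all of its patients, which is what the frailty(study, distribution="gaussian") term later tries to recover.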
Depending on the sample sizes, the following call sometimes leads to
a segmentation fault:
coxph(Surv(time, cens) ~ as.factor(treat) + frailty(study, distribution = "gaussian"), ...)
I noticed that this segmentation fault occurs most frequently if the
number of randomized treatment patients is higher than the number of
randomized placebo patients and the number of historical studies is large.
There seems to be no problem if there are at least as many randomized
placebo patients as treated patients. Unfortunately, this is not the
situation I want to investigate (historical controls should be used
to decrease the number of treated patients).
Is there a way to circumvent this problem?
Is it allowed to attach gzipped sample data sets on this mailing list?