[R] lmer and mixed effects logistic regression

Rick Bilonick rab45+ at pitt.edu
Thu Jun 22 01:01:50 CEST 2006


On Wed, 2006-06-21 at 08:35 -0700, Spencer Graves wrote:
> 	  You could think of 'lmer(..., family=binomial)' as doing a separate 
> "glm" fit for each subject, with some shrinkage provided by the assumed 
> distribution of the random effect parameters for each subject.  Since 
> your data are constant within subject, the intercept in your model 
> without the subject's random effect distribution will be estimated at 
> +/-Inf.  Since this occurs for all subjects, the maximum likelihood 
> estimate of the subject variance is Inf, which is what I wrote in an 
> earlier contribution to this thread.
> 
> 	  What kind of answer do you get from SAS NLMIXED?  If it does NOT tell 
> you that there is something strange about the estimation problem you've 
> given it, I would call that a serious infelicity in the code.  If it is 
> documented behavior, some might argue that it doesn't deserve the "B" 
> word ("Bug").  The warning messages issued by 'lmer' in this case are 
> something I think users would want, even if they are cryptic.
> 
> 	  Hope this helps.
> 	  Spencer Graves	
> 
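Just to make the point about the +/-Inf intercepts concrete for myself, here
is a toy sketch (not the example data below): when the response is constant
within a subject, a subject-level logistic intercept has no finite MLE.

## Toy illustration (not example.df): a response that never varies within a
## subject drives the per-subject intercept toward +/-Inf.
one.subject <- data.frame(y = c(1, 1, 1))
fit1 <- glm(y ~ 1, data = one.subject, family = binomial)
coef(fit1)  # very large intercept; glm typically warns that fitted
            # probabilities are numerically 0 or 1
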
I did send in an example with a data set that duplicates the problem.
Changing the control parameters allowed lmer to produce what seem like
reasonable estimates. Even for the case with essentially duplicate
pairs, lmer and NLMIXED produce similar estimates (with finite
intercepts); lmer's coefficient estimates are, as far as I can tell, the
same as glm's, but the standard errors are larger.

The problem I really want estimates for is different from this
one-explanatory-factor example. The model I will estimate has several
explanatory factors, including factors that differ within each subject
(although the responses within each subject are the same). BTW, as far
as I know the responses could differ within a subject, but that seems to
be very rare.


Possibly the example I thought I sent never made it to the list. The
example is below.

Rick B.

###########################################################################
# Example of lmer error message


I made an example data set that exhibits the error. There is a dump of
the data frame at the end.
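
If you want to reproduce the structure without pasting the dump, a rough
simulation along these lines (NOT the actual data) should behave similarly:
one or two rows per subject, a covariate x, and a response that never varies
within a subject.

## Simulated data with the same structure as example.df (structure only,
## not the real values).
set.seed(1)
n.id <- 177
reps <- sample(1:2, n.id, replace = TRUE)            # 1 or 2 rows per id
x.id <- rnorm(n.id)                                  # one x value per id
y.id <- rbinom(n.id, 1, plogis(-0.5 + 0.8 * x.id))   # one response per id
sim.df <- data.frame(id = factor(rep(seq_len(n.id), reps)),
                     x  = rep(x.id, reps),
                     y  = factor(rep(y.id, reps)))   # constant within id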

First, I updated all my packages:

> sessionInfo()
Version 2.3.1 (2006-06-01)
i686-redhat-linux-gnu

attached base packages:
[1] "methods"   "stats"     "graphics"  "grDevices" "utils"
"datasets"
[7] "base"

other attached packages:
     chron       lme4     Matrix    lattice
   "2.3-3"  "0.995-2" "0.995-11"   "0.13-8"

But I still get the error.

For comparison, here is what glm gives:


> summary(glm(y~x,data=example.df,family=binomial))

Call:
glm(formula = y ~ x, family = binomial, data = example.df)

Deviance Residuals:
    Min       1Q   Median       3Q      Max
-1.6747  -0.9087  -0.6125   1.1447   2.0017

Coefficients:
            Estimate Std. Error z value Pr(>|z|)
(Intercept)  -0.4786     0.1227  -3.901 9.59e-05 ***
x             0.7951     0.1311   6.067 1.31e-09 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

(Dispersion parameter for binomial family taken to be 1)

    Null deviance: 436.63  on 324  degrees of freedom
Residual deviance: 394.15  on 323  degrees of freedom
AIC: 398.15

Number of Fisher Scoring iterations: 4


Running lmer without any tweaks:

> (lmer(y~(1|id)+x,data=example.df,family=binomial))
Error in lmer(y ~ (1 | id) + x, data = example.df, family = binomial) :
        Leading minor of order 2 in downdated X'X is not positive
definite
In addition: Warning message:
nlminb returned message singular convergence (7)
 in: LMEopt(x = mer, value = cv)

Running lmer with control=list(msVerbose=TRUE):

> (lmer(y~(1|id)+x,data=example.df,family=binomial,
+   control=list(msVerbose=TRUE)))
  0     -545.002:  44801.6
  1     -545.002:  44801.6
  2     -545.002:  44801.6
  3     -545.003:  44801.9
  4     -545.014:  44805.2
  5     -545.123:  44838.3
  6     -546.208:  45168.3
  7     -556.572:  48444.8
  8     -628.932:  78993.4
  9     -699.716:  127441.
 10     -771.102:  206437.
 11     -842.258:  333880.
 12     -913.501:  540319.
 13     -984.712:  874202.
 14     -1055.93: 1.41452e+06
 15     -1127.15: 2.28873e+06
 16     -1198.37: 3.70326e+06
 17     -1269.59: 5.99199e+06
 18     -1340.81: 9.69524e+06
 19     -1412.03: 1.56872e+07
 20     -1483.25: 2.53825e+07
 21     -1554.47: 4.10697e+07
 22     -1625.69: 6.64522e+07
 23     -1696.91: 1.07522e+08
 24     -1768.13: 1.73974e+08
 25     -1839.35: 2.81496e+08
 26     -1910.57: 4.55470e+08
 27     -1981.78: 7.36966e+08
 28     -2053.00: 1.19244e+09
 29     -2124.22: 1.92940e+09
 30     -2195.44: 3.12184e+09
 31     -2266.66: 5.05124e+09
 32     -2337.88: 8.17308e+09
 33     -2409.10: 1.32243e+10
 34     -2480.32: 2.13974e+10
 35     -2551.54: 3.46217e+10
 36     -2622.76: 5.60190e+10
 37     -2693.98: 9.06405e+10
 38     -2765.20: 1.46659e+11
 39     -2836.42: 2.37299e+11
 40     -2907.64: 3.83962e+11
 41     -2978.85: 6.21253e+11
 42     -3050.07: 1.00521e+12
 43     -3121.28: 1.62645e+12
 44     -3192.47: 2.63147e+12
 45     -3263.70: 4.25757e+12
 46     -3334.89: 6.88953e+12
 47     -3406.11: 1.11441e+13
 48     -3477.22: 1.80392e+13
 49     -3548.36: 2.91492e+13
 50     -3619.76: 4.72269e+13
 51     -3690.52: 7.63668e+13
 52     -3761.36: 1.23295e+14
 53     -3832.63: 1.99577e+14
 54     -3900.88: 3.22856e+14
 55     -3968.08: 4.97009e+14
  0     -4067.06: 1.67844e+15
  1     -4067.06: 1.67844e+15
  0     -4265.60: 5.77607e+15
  1     -4265.60: 5.77607e+15
  0     -4474.52: 1.96098e+16
  1     -4474.52: 1.96098e+16
  0     -4723.57: 6.68597e+16
  1     -4723.57: 6.68597e+16
  0     -4985.37: 2.20089e+17
  1     -4985.37: 2.20089e+17
  0     -5268.68: 7.69417e+17
  1     -5268.68: 7.69417e+17
  0     -5536.64: 2.48775e+18
  1     -5536.64: 2.48775e+18
  0     -5853.10: 8.45248e+18
  1     -5853.10: 8.45248e+18
  0     -6197.46: 3.00106e+19
  1     -6197.46: 3.00106e+19
  0     -6400.09: 8.72855e+19
  1     -6400.09: 8.72855e+19
  0     -6769.87: 3.19354e+20
  1     -6769.87: 3.19354e+20
  0     -7085.60: 1.14993e+21
  1     -7085.60: 1.14993e+21
  0     -7414.58: 4.43964e+21
  1     -7414.58: 4.43964e+21
  0     -7665.61: 1.61085e+22
  1     -7665.61: 1.61085e+22
Error in lmer(y ~ (1 | id) + x, data = example.df, family = binomial,  :
        Leading minor of order 2 in downdated X'X is not positive
definite
In addition: Warning message:
nlminb returned message singular convergence (7)
 in: LMEopt(x = mer, value = cv)


Running lmer with method="Laplace" and
control=list(usePQL=FALSE,msVerbose=TRUE):

> (lmer(y~(1|id)+x,data=example.df,family=binomial,method="Laplace",
+   control=list(usePQL=FALSE,msVerbose=TRUE)))
  0      347.321: -0.478643 0.795145  1.45231
  1      334.637: -0.775380  1.49795  2.09885
  2      326.045: -0.631955 0.917513  2.90042
  3      307.930: -0.627581  1.85085  4.66928
  4      304.717: -1.06671  1.40101  5.11069
  5      299.588: -1.05336  1.85102  5.73305
  6      297.157: -0.682292  1.60623  6.35949
  7      282.629: -1.33421  1.86152  10.2167
  8      270.279: -1.44945  2.72297  14.8450
  9      263.248: -1.61188  3.21518  19.5257
 10      254.336: -1.89092  4.01520  29.0932
 11      248.253: -2.13096  4.72573  39.9024
 12      243.359: -2.39747  5.49392  53.8331
 13      239.255: -2.66754  6.31763  71.9027
 14      235.865: -2.91894  7.17523  94.3541
 15      232.831: -3.14279  8.11396  123.501
 16      230.229: -3.32800  9.12440  159.978
 17      227.957: -3.45824  10.1876  205.312
 18      225.987: -3.50977  11.2006  258.137
 19      223.822: -3.42383  12.2016  327.929
 20      222.281: -3.29714  12.9668  393.939
 21      218.687: -2.35417  15.1107  657.987
 22      217.978: -2.00284  15.3087  724.381
 23      216.828: -1.03243  15.3436  883.159
 24      216.641: -0.727910  15.0860  924.584
 25      216.561: -0.634457  14.8052  935.901
 26      216.477: -0.670831  14.4966  934.259
 27      216.335: -0.882568  14.1066  925.552
 28      216.153: -1.24388  13.9061  926.647
 29      215.914: -1.70066  14.0769  966.092
 30      215.643: -2.07605  14.7379  1073.14
 31      215.365: -2.25220  15.8379  1261.63
 32      215.169: -2.20650  16.9633  1485.79
 33      215.065: -2.05998  17.7714  1685.40
 34      214.993: -1.85386  18.2239  1859.43
 35      214.948: -1.69235  18.3198  1985.48
 36      214.933: -1.65586  18.2629  2051.34
 37      214.933: -1.65578  18.2629  2051.34
 38      214.933: -1.65579  18.2629  2051.34
 39      214.933: -1.65586  18.2629  2051.34
 40      214.933: -1.65654  18.2625  2051.34
 41      214.932: -1.66423  18.2585  2051.33
 42      214.931: -1.70783  18.2351  2051.33
 43      214.931: -1.73215  18.2201  2051.43
 44      214.931: -1.74205  18.2078  2051.65
 45      214.930: -1.73708  18.2686  2076.43
 46      214.929: -1.73209  18.3805  2120.39
 47      214.929: -1.73283  18.3612  2112.76
 48      214.929: -1.73334  18.3600  2112.79
 49      214.929: -1.73332  18.3600  2112.79
 50      214.929: -1.73332  18.3600  2112.79
 51      214.929: -1.73332  18.3600  2112.79
 52      214.929: -1.73332  18.3600  2112.79
 53      214.929: -1.73332  18.3600  2112.79
 54      214.929: -1.73332  18.3600  2112.79
Generalized linear mixed model fit using Laplace
Formula: y ~ (1 | id) + x
          Data: example.df
 Family: binomial(logit link)
      AIC      BIC    logLik deviance
 220.9293 232.2807 -107.4646 214.9293
Random effects:
 Groups Name        Variance Std.Dev.
 id     (Intercept) 2112.8   45.965
number of obs: 325, groups: id, 177

Estimated scale (compare to 1)  0.06664838

Fixed effects:
            Estimate Std. Error  z value Pr(>|z|)
(Intercept)  -1.7333     5.7142 -0.30333  0.76164
x            18.3600     7.3318  2.50416  0.01227 *
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Correlation of Fixed Effects:
  (Intr)
x -0.382

Note that the results for x don't agree at all with the glm output. The
estimated scale is very small and the estimated SD for id is very
large.
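
One way to see why the id variance is being pushed so hard is to check
directly that the response never varies within a subject, so the subject
intercepts can separate the data perfectly:

## Count the distinct responses per subject; if every subject shows exactly
## one distinct value, the per-subject intercepts can fit the data exactly.
n.distinct <- tapply(example.df$y, example.df$id,
                     function(v) length(unique(v)))
table(n.distinct)  # for this example, every subject should show 1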


Now changing method="Laplace" to method="ML":

> (lmer(y~(1|id)+x,data=example.df,family=binomial,method="ML",
+   control=list(usePQL=FALSE,msVerbose=TRUE)))
Generalized linear mixed model fit using PQL
Formula: y ~ (1 | id) + x
          Data: example.df
 Family: binomial(logit link)
      AIC      BIC    logLik deviance
 353.3209 364.6724 -173.6604 347.3209
Random effects:
 Groups Name        Variance Std.Dev.
 id     (Intercept) 1.4523   1.2051
number of obs: 325, groups: id, 177

Estimated scale (compare to 1)  0.2372670

Fixed effects:
            Estimate Std. Error z value  Pr(>|z|)
(Intercept) -0.47864    0.16114 -2.9703  0.002975 **
x            0.79514    0.16872  4.7128 2.444e-06 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Correlation of Fixed Effects:
  (Intr)
x -0.129

The estimated coefficients agree with glm to 4 decimal places. The SE's
are about 30% larger than for glm. The SD for id is much smaller and the
estimated scale is larger than in the Laplace fit.
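
For a direct side-by-side comparison of the two sets of estimates and
standard errors, something like the following should work (assuming fixef()
and vcov() methods are available for the lmer fit in this lme4 version):

## Compare the glm fit and the lmer (ML, PQL off) fit.
fit.glm  <- glm(y ~ x, data = example.df, family = binomial)
fit.lmer <- lmer(y ~ (1 | id) + x, data = example.df, family = binomial,
                 method = "ML", control = list(usePQL = FALSE))
cbind(glm = coef(fit.glm), lmer = fixef(fit.lmer))
cbind(glm.se  = sqrt(diag(vcov(fit.glm))),
      lmer.se = sqrt(diag(vcov(fit.lmer))))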

If I turn PQL back on, I get the same error message as before.


I used method="ML" with PQL turned off on the original data set, and the
results are ROUGHLY similar to what SAS NLMIXED gives, but the
coefficient for x is about 20% lower than NLMIXED's. I haven't had a
chance to run NLMIXED on the example data frame yet.

Finally, with my thanks for the help and apologies for the length of
this post, here is the dump of the data frame:


example.df <-
structure(list(id = structure(as.integer(c(1, 1, 2, 2, 3, 3,
4, 4, 5, 5, 6, 6, 7, 7, 8, 9, 9, 10, 10, 11, 11, 12, 12, 13,
14, 14, 15, 15, 16, 16, 17, 17, 18, 18, 19, 19, 20, 20, 21, 21,
22, 22, 23, 23, 24, 24, 25, 25, 26, 26, 27, 28, 29, 29, 30, 30,
31, 31, 32, 32, 33, 33, 34, 34, 35, 35, 36, 36, 37, 37, 38, 38,
39, 39, 40, 40, 41, 42, 42, 43, 43, 44, 45, 45, 46, 46, 47, 47,
48, 48, 49, 49, 50, 50, 51, 51, 52, 52, 53, 53, 54, 54, 55, 55,
56, 56, 57, 57, 58, 58, 59, 59, 60, 61, 61, 62, 62, 63, 63, 64,
64, 65, 65, 66, 66, 67, 67, 68, 69, 69, 70, 70, 71, 71, 72, 72,
73, 73, 74, 75, 75, 76, 76, 77, 77, 78, 78, 79, 79, 80, 81, 81,
82, 82, 83, 83, 84, 85, 85, 86, 86, 87, 87, 88, 88, 89, 89, 90,
90, 91, 91, 92, 92, 93, 94, 95, 95, 96, 97, 97, 98, 98, 99, 99,
100, 101, 101, 102, 102, 103, 103, 104, 104, 105, 105, 106, 106,
107, 107, 108, 108, 109, 109, 110, 111, 111, 112, 112, 113, 113,
114, 114, 115, 116, 116, 117, 118, 118, 119, 120, 120, 121, 121,
122, 123, 123, 124, 124, 125, 125, 126, 126, 127, 127, 128, 128,
129, 129, 130, 131, 131, 132, 133, 133, 134, 134, 135, 136, 136,
137, 138, 138, 139, 139, 140, 140, 141, 141, 142, 142, 143, 143,
144, 144, 145, 145, 146, 146, 147, 148, 148, 149, 149, 150, 150,
151, 151, 152, 152, 153, 153, 154, 154, 155, 155, 156, 157, 157,
158, 159, 160, 161, 161, 162, 162, 163, 163, 164, 164, 165, 165,
166, 166, 167, 167, 168, 168, 169, 169, 170, 170, 171, 171, 172,
172, 173, 173, 174, 174, 175, 175, 176, 176, 177, 177)), .Label = c("1",
"2", "3", "4", "5", "6", "7", "8", "9", "10", "11", "12", "13",
"14", "15", "16", "17", "18", "19", "20", "21", "22", "23", "24",
"25", "26", "27", "28", "29", "30", "31", "32", "33", "34", "35",
"36", "37", "38", "39", "40", "41", "42", "43", "44", "45", "46",
"47", "48", "49", "50", "51", "52", "53", "54", "55", "56", "57",
"58", "59", "60", "61", "62", "63", "64", "65", "66", "67", "68",
"69", "70", "71", "72", "73", "74", "75", "76", "77", "78", "79",
"80", "81", "82", "83", "84", "85", "86", "87", "88", "89", "90",
"91", "92", "93", "94", "95", "96", "97", "98", "99", "100",
"101", "102", "103", "104", "105", "106", "107", "108", "109",
"110", "111", "112", "113", "114", "115", "116", "117", "118",
"119", "120", "121", "122", "123", "124", "125", "126", "127",
"128", "129", "130", "131", "132", "133", "134", "135", "136",
"137", "138", "139", "140", "141", "142", "143", "144", "145",
"146", "147", "148", "149", "150", "151", "152", "153", "154",
"155", "156", "157", "158", "159", "160", "161", "162", "163",
"164", "165", "166", "167", "168", "169", "170", "171", "172",
"173", "174", "175", "176", "177"), class = "factor"), y =
structure(as.integer(c(1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 2,
2, 2, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 2, 2, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 1, 1, 2, 2, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 1,
1, 1, 2, 2, 1, 1, 2, 1, 1, 1, 1, 2, 2, 1, 1, 1, 1, 2, 2, 2, 2,
2, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 1, 1, 1, 2, 2, 2, 2, 2, 2, 1,
1, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2,
1, 1, 1, 1, 2, 2, 2, 2, 2, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1,
1, 2, 2, 2, 2, 1, 1, 2, 2, 1, 1, 1, 1, 2, 2, 2, 1, 1, 1, 1, 2,
2, 2, 1, 1, 2, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 2, 2,
2, 2, 1, 1, 1, 1, 2, 2, 2, 2, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1, 2,
2, 1, 1, 2, 2, 1, 1, 2, 2, 2, 2, 1, 1, 2, 2, 2, 1, 1, 1, 1, 1,
1, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 1, 1, 2, 2, 2, 1, 1, 2, 2,
2, 2, 2, 2, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1, 2, 2, 1, 1, 1, 1, 1,
1, 1, 1, 2, 2, 1, 1, 1, 1)), .Label = c("0", "1"), class = "factor"),
    x = c(0.896492660264945, 0.896492660264945, 1.59446707642661,
    1.59446707642661, -1.05008338359102, -1.05008338359102,
1.09348658068790,
    1.09348658068790, 1.12507994528403, 1.12507994528403,
0.276572438987850,
    0.276572438987850, 0.434273771509725, 0.434273771509725,
    2.09093423436586, -0.622643744937437, -0.622643744937437,
    -0.58706802345943, -0.58706802345943, 0.124446406100572,
    0.124446406100572, 0.126570329770903, 0.126570329770903,
    0.181261364281855, 1.64039692579746, 1.64039692579746,
-0.555474658863302,
    -0.555474658863302, 0.47542479262234, 0.47542479262234,
-0.258656325934905,
    -0.258656325934905, 1.64995458231394, 1.64995458231394,
0.696047363877697,
    0.696047363877697, -1.46716889435175, -1.46716889435175,
    1.03375122745992, 1.03375122745992, 0.790827457666109,
0.790827457666109,
    -1.53194856629677, -1.53194856629677, -1.69389774615931,
    -1.69389774615931, -0.811141970679076, -0.811141970679076,
    -0.582289195201196, -0.582289195201196, 0.00789609469130197,
    0.573390771916238, -1.45628378554133, -1.45628378554133,
    -1.16079290490689, -1.16079290490689, -0.832646697841153,
    -0.832646697841153, -0.241930427031072, -0.241930427031072,
    -1.29353813430241, -1.29353813430241, -0.663794766050042,
    -0.663794766050042, -0.961940551272396, -0.961940551272396,
    1.59499805734419, 1.59499805734419, 0.47144243574048,
0.47144243574048,
    0.952245656611064, 0.952245656611064, -0.304586175305754,
    -0.304586175305754, 0.71463169599307, 0.71463169599307,
-1.32141463247548,
    -0.0983000888251169, -0.0983000888251169, 0.440114561603134,
    0.440114561603134, -1.75761545626916, -0.409985887445808,
    -0.409985887445808, -0.847514163533447, -0.847514163533447,
    -1.09229636653880, -1.09229636653880, -1.00415353422017,
    -1.00415353422017, -1.63681729751924, -1.63681729751924,
    -1.74354446195324, -1.74354446195324, -1.65460515825824,
    -1.65460515825824, -1.30760912861834, -1.30760912861834,
    -1.38300841891499, -1.38300841891499, -0.628750025489628,
    -0.628750025489628, 0.323564250193864, 0.323564250193864,
    -0.524412275184748, -0.524412275184748, -0.486181649118838,
    -0.486181649118838, 0.142234266839580, 0.142234266839580,
    -1.74965074250543, -1.74965074250543, -0.299010875671146,
    2.01049062535218, 2.01049062535218, -1.18229763206896,
-1.18229763206896,
    0.83304044061388, 0.83304044061388, -1.44539867673089,
-1.44539867673089,
    -0.391136064871645, -0.391136064871645, -0.118477363693237,
    -0.118477363693237, -1.73531425773072, -1.73531425773072,
    0.748083493800747, -1.70717226909887, -1.70717226909887,
    -0.210602552893726, -0.210602552893726, 0.681976369561778,
    0.681976369561778, -0.0138741229295563, -0.0138741229295563,
    -0.532111498489687, -0.532111498489687, -0.585740571165474,
    -0.202106858212414, -0.202106858212414, -0.121663249198723,
    -0.121663249198723, -0.328214826138161, -0.328214826138161,
    -0.94468367145097, -0.94468367145097, -1.4807089077501,
-1.4807089077501,
    1.09083167609999, 1.15215997208072, 1.15215997208072,
1.55411252669037,
    1.55411252669037, 0.0299318027709619, 0.0299318027709619,
    -0.0913973368965427, -1.40716805066498, -1.40716805066498,
    -0.246178274371723, -0.246178274371723, 0.473035378493218,
    0.473035378493218, 0.221084933100514, 0.221084933100514,
    -0.297152442459607, -0.297152442459607, 0.487106372809148,
    0.487106372809148, -0.434676500113372, -0.434676500113372,
    -1.18760744124478, -1.18760744124478, 0.937643681377551,
    1.05737987829232, -0.0879459609322652, -0.0879459609322652,
    0.310289727254308, -0.9587546657669, -0.9587546657669,
1.61889219863538,
    1.61889219863538, 0.983573530748409, 0.983573530748409,
0.229580627781825,
    -0.394587440835922, -0.394587440835922, 1.27163067853669,
    1.27163067853669, 1.40649983160254, 1.40649983160254,
-0.275116734379947,
    -0.275116734379947, 1.86526734439348, 1.86526734439348,
1.72668132490455,
    1.72668132490455, 0.929147986696238, 0.929147986696238,
-0.738397584970334,
    -0.738397584970334, 1.38260569031136, 1.38260569031136,
0.869412633468261,
    0.426574548204786, 0.426574548204786, 0.906846788157797,
    0.906846788157797, 0.697109325712863, 0.697109325712863,
    1.11578777922635, 1.11578777922635, 1.36242841544324,
1.20101021649827,
    1.20101021649827, 1.37676490021795, 0.76480939270458,
0.76480939270458,
    1.86393989209952, 0.543124859614057, 0.543124859614057,
-0.379985465602419,
    -0.379985465602419, 1.04224692214123, 1.85411674512425,
1.85411674512425,
    -0.251753574006341, -0.251753574006341, 0.813394146663342,
    0.813394146663342, 0.405335311501501, 0.405335311501501,
    0.590913142196445, 0.590913142196445, -0.263435154193149,
    -0.263435154193149, -1.73690720048346, -1.73690720048346,
    1.55092664118487, 1.10649561316866, 1.10649561316866,
0.454716536836637,
    -0.675741836695642, -0.675741836695642, 0.91959033017976,
    0.91959033017976, -0.256532402264574, -0.383967822484279,
    -0.383967822484279, -0.7036183348687, -1.07955282451682,
    -1.07955282451682, -0.640431605676436, -0.640431605676436,
    -1.48389479325559, -1.48389479325559, 1.72747779628092,
1.72747779628092,
    -0.959816627602066, -0.959816627602066, 0.562771153564595,
    0.562771153564595, 0.830651026484758, 0.830651026484758,
    0.126039348853320, 0.126039348853320, -0.753265050662627,
    -0.753265050662627, 0.570735867328324, 1.56101527861893,
    1.56101527861893, -0.701228920739579, -0.701228920739579,
    0.272059101188397, 0.272059101188397, -0.570607615014388,
    -0.570607615014388, -1.05539319276684, -1.05539319276684,
    -1.09442029020913, -1.09442029020913, -1.68035773276096,
    -1.68035773276096, -0.523350313349583, -0.523350313349583,
    -0.142106014525635, -1.24256396621453, -1.24256396621453,
    0.440380052061925, -0.138389148102566, 0.354626633872418,
    0.294094809268057, 0.294094809268057, 1.84349712677261,
1.84349712677261,
    -1.02857865642895, -1.02857865642895, -0.0266176649515298,
    -0.0266176649515298, 0.699233249383193, 0.699233249383193,
    0.950387223399524, 0.950387223399524, 0.350113296072966,
    0.350113296072966, 0.440114561603134, 0.440114561603134,
    -0.487774591871586, -0.487774591871586, 1.14074388235270,
    1.14074388235270, 0.797199228677091, 0.797199228677091,
-0.831053755088405,
    -0.831053755088405, 0.477283225833879, 0.477283225833879,
    1.13384113042414, 1.13384113042414, 0.607108060182704,
0.607108060182704,
    0.191084511257124, 0.191084511257124, -1.54814348428303,
    -1.54814348428303)), .Names = c("id", "y", "x"), row.names = c("4",
"101", "5", "102", "6", "103", "7", "104", "1", "98", "8", "105",
"9", "106", "198", "199", "263", "10", "107", "11", "108", "200",
"264", "197", "12", "109", "201", "265", "202", "266", "203",
"267", "204", "268", "205", "269", "2", "99", "3", "100", "206",
"270", "16", "113", "17", "114", "18", "115", "19", "116", "117",
"20", "21", "118", "22", "119", "23", "120", "24", "121", "25",
"122", "26", "123", "27", "124", "28", "125", "29", "126", "30",
"127", "31", "128", "207", "271", "32", "33", "129", "208", "272",
"130", "34", "131", "35", "132", "36", "133", "37", "134", "38",
"135", "39", "136", "13", "110", "40", "137", "41", "138", "42",
"139", "209", "273", "43", "140", "44", "141", "210", "274",
"45", "142", "211", "46", "143", "14", "111", "212", "275", "47",
"144", "48", "145", "213", "276", "214", "277", "215", "49",
"146", "50", "147", "216", "278", "217", "279", "218", "280",
"51", "52", "148", "219", "281", "220", "282", "221", "283",
"53", "149", "222", "223", "284", "54", "150", "55", "151", "152",
"56", "153", "57", "154", "58", "155", "224", "285", "225", "286",
"226", "287", "15", "112", "59", "156", "289", "290", "228",
"291", "229", "60", "157", "230", "292", "231", "293", "294",
"232", "295", "233", "296", "61", "158", "234", "297", "235",
"298", "62", "159", "236", "299", "63", "160", "64", "161", "300",
"237", "301", "65", "162", "66", "163", "238", "302", "303",
"67", "164", "304", "68", "165", "239", "240", "305", "241",
"306", "307", "242", "308", "69", "166", "70", "167", "227",
"288", "243", "309", "73", "170", "74", "171", "247", "248",
"312", "249", "75", "172", "250", "313", "244", "76", "173",
"174", "77", "175", "251", "314", "78", "176", "252", "315",
"79", "177", "253", "316", "254", "317", "80", "178", "255",
"318", "319", "81", "179", "82", "180", "83", "181", "84", "182",
"85", "183", "86", "184", "71", "168", "256", "320", "185", "87",
"186", "257", "258", "321", "88", "187", "259", "322", "260",
"323", "245", "310", "261", "324", "89", "188", "90", "189",
"91", "190", "92", "191", "246", "311", "93", "192", "94", "193",
"95", "194", "96", "195", "262", "325", "97", "196", "72", "169"
), class = "data.frame")
