[R] R: Re: R: Re: R: Re: Differences in output of lme() when introducing interactions

John Kane jrkrideau at inbox.com
Tue Jul 21 13:39:31 CEST 2015


Have you been asking statistics-related questions on StackExchange?

I must say that when I was at school I had the luxury of a very strong (free) stats consulting service. I was the envy of several friends at other universities, and I suspect we (many departments of the university) turned out better work as a result.

John Kane
Kingston ON Canada


> -----Original Message-----
> From: angelo.arcadi at virgilio.it
> Sent: Tue, 21 Jul 2015 12:12:58 +0200 (CEST)
> To: lists at dewey.myzen.co.uk, bgunter.4567 at gmail.com
> Subject: [R] R: Re: R: Re: R: Re: Differences in output of lme() when
> introducing interactions
> 
> Dear Michael,
> thanks a lot. I am studying marginality and I came across this post:
> 
> http://www.ats.ucla.edu/stat/r/faq/type3.htm
> 
> Do you think the procedure described there is the right one to solve my
> problem?
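> 
> For what it is worth, my current understanding is that the procedure on that
> page amounts to something like the sketch below (untested on my side; it uses
> the Gravel_ds data and the full model from my earlier message further down,
> and the sum-to-zero contrasts only matter if there are factors among the
> predictors):
> 
> library(nlme)
> 
> ## sum-to-zero contrasts, which the UCLA page sets before asking for
> ## type III style tests
> op <- options(contrasts = c("contr.sum", "contr.poly"))
> 
> fit <- lme(Mode ~ Weight * Height * Shoe_Size * BMI,
>            data = Gravel_ds, random = ~1 | Subject)
> 
> ## "marginal" tests each term after all the others, rather than sequentially
> anova(fit, type = "marginal")
> 
> options(op)  ## restore the previous contrast settings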
> 
> Would you have any other online resources to suggest, especially ones
> dealing with R?
> 
> My department does not have a statistician, so I have to find a solution
> on my own.
> 
> Thanks in advance
> 
> Angelo
> 
> 
> 
> 
> ----Original message----
> From: lists at dewey.myzen.co.uk
> Date: 21 Jul 2015 11.58
> To: "angelo.arcadi at virgilio.it"<angelo.arcadi at virgilio.it>,
> <bgunter.4567 at gmail.com>
> Cc: <r-help at r-project.org>
> Subject: Re: R: Re: [R] R: Re: Differences in output of lme() when
> introducing interactions
> 
> Dear Angelo
> 
> I suggest you do an online search for marginality, which may help to
> explain the relationship between main effects and interactions. As I
> said in my original email, this is a complicated subject which we are
> not going to retype for you.
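> 
> As a very small illustration of what marginality is about (simulated data,
> the names x1, x2 and y are purely made up), note how the line labelled x1
> changes meaning once the interaction is in the model:
> 
> set.seed(1)
> x1 <- rnorm(200, mean = 50, sd = 5)
> x2 <- rnorm(200, mean = 170, sd = 8)
> y  <- 2 * x1 + x2 + 0.5 * x1 * x2 + rnorm(200, sd = 10)
> 
> ## the "x1" row is the slope of x1 when x2 = 0, far outside the data
> coef(summary(lm(y ~ x1 * x2)))["x1", ]
> 
> ## after centring x2 it is the slope of x1 at the average x2:
> ## a very different number from the same data
> coef(summary(lm(y ~ x1 * I(x2 - mean(x2)))))["x1", ]
> 
> So the "main effect" of x1 is not one fixed quantity once x1:x2 is in the
> model, which is why its estimate and its test move around when interactions
> are added.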
> 
> If you are doing this as a student, I suggest you sue your university for
> failing to train you appropriately, and if it is part of your employment,
> I suggest you find a better employer.
> 
> On 21/07/2015 10:04, angelo.arcadi at virgilio.it wrote:
>> Dear Bert,
>> thank you for your feedback. Can you please provide some references
>> online so I can improve "my ignorance"?
>> Anyway, please note that it is not true that I do not know statistics and
>> regression at all, and I am strongly convinced that my question may be of
>> interest to someone else in the future.
>> 
>> This is what forums are for, isn't it? This is why people help each
>> other, isn't it?
>> 
>> Moreover, don't you think that I would not have asked this R forum if I
>> had the possibility of asking or paying a statistician?
>> Don't you think I have already done my best to study and learn before
>> posting this message? Trust me, I have read various online tutorials on
>> lme and lmer, and I am confident that I have got the basic concepts.
>> Still, I have not found the answer to my problem, so if you know it, can
>> you please give me some suggestions that can help me?
>> 
>> I do not have a book to learn from, and unfortunately I have to analyze
>> the results soon. Any help? Any to-the-point online reference that can
>> help me solve this problem?
>> 
>> Thank you in advance
>> 
>> Best regards
>> 
>> Angelo
>> 
>> 
>>     ----Original message----
>>     From: bgunter.4567 at gmail.com
>>     Date: 21 Jul 2015 3.45
>>     To: "angelo.arcadi at virgilio.it"<angelo.arcadi at virgilio.it>
>>     Cc: <lists at dewey.myzen.co.uk>, <r-help at r-project.org>
>>     Subject: Re: [R] R: Re: Differences in output of lme() when introducing
>>     interactions
>> 
>>     I believe Michael's point is that you need to STOP asking such
>>     questions and START either learning some statistics or working with
>>     someone who already knows some. You should not be doing such analyses
>>     on your own given your present state of statistical ignorance.
>> 
>>     Cheers,
>>     Bert
>> 
>> 
>>     Bert Gunter
>> 
>>     "Data is not information. Information is not knowledge. And knowledge
>>     is certainly not wisdom."
>>         -- Clifford Stoll
>> 
>> 
>>     On Mon, Jul 20, 2015 at 5:45 PM, angelo.arcadi at virgilio.it
>>     <angelo.arcadi at virgilio.it> wrote:
>>      > Dear Michael,
>>      > thanks for your answer. Although it answers my initial question,
>>      > unfortunately it does not help me find the solution to my problem.
>>      >
>>      > Could you please tell me which of the two models' analyses I should
>>      > trust, then? My goal is to know whether participants' choices of the
>>      > dependent variable are linearly related to their own weight, height,
>>      > shoe size and the combination of those effects. Would the analysis
>>      > of model 2 be more correct than that of model 1? Which of the two
>>      > analyses should I trust, given my goal? What is your recommendation?
>>      >
>>      >
>>      > Thanks in advance
>>      >
>>      > Angelo
>>      >
>>      >
>>      >
>>      >
>>      >
>>      > ----Original message----
>>      > From: lists at dewey.myzen.co.uk
>>      > Date: 20 Jul 2015 17.56
>>      > To: "angelo.arcadi at virgilio.it"<angelo.arcadi at virgilio.it>,
>>      > <r-help at r-project.org>
>>      > Subject: Re: [R] Differences in output of lme() when introducing
>>      > interactions
>>      >
>>      > In-line
>>      >
>>      > On 20/07/2015 15:10, angelo.arcadi at virgilio.it wrote:
>>      >> Dear List Members,
>>      >>
>>      >>
>>      >>
>>      >> I am searching for correlations between a dependent variable and a
>>      >> factor or a combination of factors in a repeated measures design, so
>>      >> I use the lme() function in R. However, I am getting very different
>>      >> results depending on whether I add several factors to the lme
>>      >> formula or only one is present. If a factor is found to be
>>      >> significant, shouldn't it remain significant also when more factors
>>      >> are introduced in the model?
>>      >>
>>      >
>>      > The short answer is 'No'.
>>      >
>>      > The long answer is contained in any good book on statistics, which
>>      > you really need to have by your side, as the long answer is too long
>>      > to include in an email.
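>>      >
>>      > A small illustration of the flavour of the long answer (simulated
>>      > data, the names w, h and y are made up): a predictor that is clearly
>>      > significant on its own can stop being so once a highly correlated
>>      > predictor and their interaction enter the model, because its
>>      > coefficient then answers a different, much less well determined
>>      > question.
>>      >
>>      > set.seed(42)
>>      > w <- rnorm(100, 70, 10)      # a weight-like covariate
>>      > h <- w + rnorm(100, 0, 2)    # a second covariate, highly correlated with w
>>      > y <- w + h + rnorm(100, 0, 20)
>>      >
>>      > cor(w, h)                    # the two predictors are nearly interchangeable
>>      >
>>      > coef(summary(lm(y ~ w)))     # w alone: large t value, tiny p value
>>      > coef(summary(lm(y ~ w * h))) # with h and w:h added, the w row is the
>>      >                              # slope of w at h = 0, and its standard
>>      >                              # error is inflated by the collinearity,
>>      >                              # so its p value is typically no longer small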
>>      >
>>      >>
>>      >> I give an example of the outputs I get using the two models. In the
>>      >> first model I use one single factor:
>>      >>
>>      >> library(nlme)
>>      >> summary(lme(Mode ~ Weight, data = Gravel_ds, random = ~1 | Subject))
>>      >> Linear mixed-effects model fit by REML
>>      >>   Data: Gravel_ds
>>      >>        AIC      BIC   logLik
>>      >>    2119.28 2130.154 -1055.64
>>      >>
>>      >> Random effects:
>>      >>   Formula: ~1 | Subject
>>      >>          (Intercept) Residual
>>      >> StdDev:    1952.495 2496.424
>>      >>
>>      >> Fixed effects: Mode ~ Weight
>>      >>                  Value Std.Error DF   t-value p-value
>>      >> (Intercept) 10308.966 2319.0711 95  4.445299   0.000
>>      >> Weight        -99.036   32.3094 17 -3.065233   0.007
>>      >>   Correlation:
>>      >>         (Intr)
>>      >> Weight -0.976
>>      >>
>>      >> Standardized Within-Group Residuals:
>>      >>          Min          Q1         Med          Q3         Max
>>      >> -1.74326719 -0.41379593 -0.06508451  0.39578734  2.27406649
>>      >>
>>      >> Number of Observations: 114
>>      >> Number of Groups: 19
>>      >>
>>      >>
>>      >> As you can see, the p-value for the factor Weight is significant.
>>      >> This is the second model, in which I add various factors to search
>>      >> for their correlations:
>>      >>
>>      >> library(nlme)
>>      >> summary(lme(Mode ~ Weight*Height*Shoe_Size*BMI, data = Gravel_ds, random = ~1 | Subject))
>>      >> Linear mixed-effects model fit by REML
>>      >>   Data: Gravel_ds
>>      >>         AIC      BIC    logLik
>>      >>    1975.165 2021.694 -969.5825
>>      >>
>>      >> Random effects:
>>      >>   Formula: ~1 | Subject
>>      >>          (Intercept) Residual
>>      >> StdDev:    1.127993 2494.826
>>      >>
>>      >> Fixed effects: Mode ~ Weight * Height * Shoe_Size * BMI
>>      >>                                  Value Std.Error DF    t-value p-value
>>      >> (Intercept)                   5115955  10546313 95  0.4850941  0.6287
>>      >> Weight                      -13651237   6939242  3 -1.9672518  0.1438
>>      >> Height                         -18678     53202  3 -0.3510740  0.7487
>>      >> Shoe_Size                       93427    213737  3  0.4371115  0.6916
>>      >> BMI                         -13011088   7148969  3 -1.8199949  0.1663
>>      >> Weight:Height                   28128     14191  3  1.9820883  0.1418
>>      >> Weight:Shoe_Size               351453    186304  3  1.8864467  0.1557
>>      >> Height:Shoe_Size                 -783      1073  3 -0.7298797  0.5183
>>      >> Weight:BMI                      19475     11425  3  1.7045450  0.1868
>>      >> Height:BMI                     226512    118364  3  1.9136867  0.1516
>>      >> Shoe_Size:BMI                  329377    190294  3  1.7308827  0.1819
>>      >> Weight:Height:Shoe_Size          -706       371  3 -1.9014817  0.1534
>>      >> Weight:Height:BMI                -109        63  3 -1.7258742  0.1828
>>      >> Weight:Shoe_Size:BMI             -273       201  3 -1.3596421  0.2671
>>      >> Height:Shoe_Size:BMI            -5858      3200  3 -1.8306771  0.1646
>>      >> Weight:Height:Shoe_Size:BMI         2         1  3  1.3891782  0.2589
>>      >>   Correlation:
>>      >>                              (Intr) Weight Height Sho_Sz BMI    Wght:H Wg:S_S Hg:S_S Wg:BMI Hg:BMI S_S:BM Wg:H:S_S W:H:BM W:S_S: H:S_S:
>>      >> Weight                      -0.895
>>      >> Height                      -0.996  0.869
>>      >> Shoe_Size                   -0.930  0.694  0.933
>>      >> BMI                         -0.911  0.998  0.887  0.720
>>      >> Weight:Height                0.894 -1.000 -0.867 -0.692 -0.997
>>      >> Weight:Shoe_Size             0.898 -0.997 -0.873 -0.700 -0.999  0.995
>>      >> Height:Shoe_Size             0.890 -0.612 -0.904 -0.991 -0.641  0.609  0.619
>>      >> Weight:BMI                   0.911 -0.976 -0.887 -0.715 -0.972  0.980  0.965  0.637
>>      >> Height:BMI                   0.900 -1.000 -0.875 -0.703 -0.999  0.999  0.999  0.622  0.973
>>      >> Shoe_Size:BMI                0.912 -0.992 -0.889 -0.726 -0.997  0.988  0.998  0.649  0.958  0.995
>>      >> Weight:Height:Shoe_Size     -0.901  0.999  0.876  0.704  1.000 -0.997 -1.000 -0.623 -0.971 -1.000 -0.997
>>      >> Weight:Height:BMI           -0.908  0.978  0.886  0.704  0.974 -0.982 -0.968 -0.627 -0.999 -0.975 -0.961  0.973
>>      >> Weight:Shoe_Size:BMI        -0.949  0.941  0.928  0.818  0.940 -0.946 -0.927 -0.751 -0.980 -0.938 -0.924  0.935    0.974
>>      >> Height:Shoe_Size:BMI        -0.901  0.995  0.878  0.707  0.998 -0.992 -1.000 -0.627 -0.960 -0.997 -0.999  0.999    0.964  0.923
>>      >> Weight:Height:Shoe_Size:BMI  0.952 -0.948 -0.933 -0.812 -0.947  0.953  0.935  0.747  0.985  0.946  0.932 -0.943   -0.980 -0.999 -0.931
>>      >>
>>      >> Standardized Within-Group Residuals:
>>      >>          Min          Q1         Med          Q3         Max
>>      >> -2.03523736 -0.47889716 -0.02149143  0.41118126  2.20012158
>>      >>
>>      >> Number of Observations: 114
>>      >> Number of Groups: 19
>>      >>
>>      >>
>>      >> This time the p-value associated with Weight is not significant
>>      >> anymore. Why? Which analysis should I trust?
>>      >>
>>      >>
>>      >> In addition, while in the first output the field "Value" (which
>>      >> should give me the slope) is -99.036, in the second output it is
>>      >> -13651237. Why are they so different? The one in the first output
>>      >> is the one that seems definitely more reasonable to me.
>>      >> I would be very grateful if someone could give me an answer.
>>      >>
>>      >>
>>      >> Thanks in advance
>>      >>
>>      >>
>>      >> Angelo
>>      >>
>>      >>
>>      >
>>      > --
>>      > Michael
>>      > http://www.dewey.myzen.co.uk/home.html
>>      >
>>      >
>>      >
>>      >
>> 
>> 
> 
> --
> Michael
> http://www.dewey.myzen.co.uk/home.html
> 
> 
> 
> 
> 
> ______________________________________________
> R-help at r-project.org mailing list -- To UNSUBSCRIBE and more, see
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide
> http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.


