[R] Plot different regression models on one graph

Peter Ehlers ehlers at ucalgary.ca
Sat Feb 13 21:34:40 CET 2010


Rhonda:

As David points out, a cubic fit is rather speculative. I think that
one needs to distinguish two situations: 1) theoretical justification
for a cubic model is available, or 2) you're dredging the data for
the "best" fit. Your case is the second. That puts you in the realm
of EDA (exploratory data analysis). You're free to fit any model you
wish, but you should assess the value of the model. One convenient
way is to use the influence.measures() function, which will show
that points 9 and 10 in your data have a strong influence on your
cubic fit (as, of course, your eyes would tell you). A good thing
to do at this point is to fit 3 more cubic models:
1) omitting point 9, 2) omitting point 10, 3) omitting both.

Try it and plot the resulting fits. You may be surprised.
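
Something along these lines might help (an untested sketch; I'm
assuming the x and y vectors exactly as you posted them below):

d <- data.frame(x = x, y = y)          # x, y as posted below

cubic.all   <- lm(y ~ poly(x, 3), data = d)
cubic.no9   <- lm(y ~ poly(x, 3), data = d[-9, ])
cubic.no10  <- lm(y ~ poly(x, 3), data = d[-10, ])
cubic.no910 <- lm(y ~ poly(x, 3), data = d[-c(9, 10), ])

influence.measures(cubic.all)          # rows 9 and 10 should stand out

## predict over a fine grid so the curves draw smoothly
xx <- data.frame(x = seq(0, 86, length.out = 200))
plot(d$x, d$y, xlab = "x", ylab = "y")
lines(xx$x, predict(cubic.all,   xx), lwd = 2)
lines(xx$x, predict(cubic.no9,   xx), lty = 2)
lines(xx$x, predict(cubic.no10,  xx), lty = 3)
lines(xx$x, predict(cubic.no910, xx), lty = 4)
legend("topleft", c("all points", "without 9", "without 10",
       "without 9 and 10"), lwd = c(2, 1, 1, 1), lty = 1:4)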

Conclusion: without more data, you can't justify anything
beyond a straight-line fit.

(And this hasn't even addressed the relative abundance of x=0 data.)
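
And, for your actual plotting question: David's trick of calling
predict() over a fine grid of x values (rather than over your observed
x) works for any fitted model that has a predict() method, including an
nls() or glm() fit for your exponential/logistic models, so you can
overlay as many curves as you like. A sketch with just the linear and
cubic fits:

linear.lm <- lm(y ~ x)                 # x, y as posted below
cubic.lm  <- lm(y ~ poly(x, 3))

xx <- data.frame(x = 0:86)             # grid spanning the observed x range
plot(x, y)
lines(xx$x, predict(linear.lm, xx), lty = 1)
lines(xx$x, predict(cubic.lm,  xx), lty = 2, lwd = 2)
legend("topleft", c("linear", "cubic"), lty = 1:2, lwd = c(1, 2))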

  -Peter Ehlers

David Winsemius wrote:
> 
> On Feb 13, 2010, at 1:35 PM, Rhonda Reidy wrote:
> 
>> The following variables have the following significant relationships 
>> (x is the explanatory variable): linear, cubic, exponential, logistic. 
>> The linear relationship plots without any trouble.
>>
>> Cubic is the 'best' model, but it is not plotting as a smooth curve 
>> using the following code:
>>
>> cubic.lm <- lm(y ~ poly(x, 3))
> 
> Try:
> 
> lines(0:80, predict(cubic.lm, data.frame(x = 0:80)), lwd = 2)
> 
> But I really must say the superiority of that relationship over a linear 
> one is far from convincing to my eyes. Seems to be caused by one data 
> point. I hope the quotes around "best" mean that you have the same qualms.
> 
> 
>> lines(x, predict(cubic.lm), lwd = 2)
>>
>> How do I plot the data and the estimated curves for all of these 
>> regression models in the same plot?
>>
>> x <- c(62.5, 68.5, 0, 52, 0, 52, 0, 52, 23.5, 86, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0)
>>
>> y <- c(0.054, 0.055, 0.017, 0.021, 0.020, 0.028, 0.032, 0.073, 0.076, 0.087,
>>        0.042, 0.042, 0.041, 0.045, 0.021, 0.018, 0.017, 0.018, 0.028, 0.022)
>>
>>
>> Thanks in advance.
>>
>> Rhonda Reidy
>>

-- 
Peter Ehlers
University of Calgary


