# [R] GLM weights for the Poisson family

Rolf Turner r.turner at auckland.ac.nz
Wed Feb 5 02:41:18 CET 2014

```
You should direct your inquiry to R-help, not to me personally.  I am
taking the liberty of cc-ing my reply back to the list.

I really haven't the time at the moment to think the issue through
thoroughly, but off the top of my head:  If you are going to use
weighted log likelihoods then any comparison of models that you engage
in should involve the *same* weights; otherwise you are doing the good
old apples-with-oranges thing.

So yes, the weights will change the log-likelihood and the AIC.  And so
they should.  And if you use AIC to compare models which are
meaningfully comparable (id est, have the same weights) this is not a
problem.

As I say, this is off the top of my head.  Others older (???) and wiser
than I may correct me.

cheers,

Rolf Turner

On 05/02/14 11:56, IamRandom wrote:
> I am trying to do weighted Poisson regression.  I have count data.
>
> Simple example:
> set.seed(50)
> x=seq(0,1,length=100)
> y=numeric(100)
> y[seq(1,100,by=2)]=round(exp(1.5*x[seq(1,100,by=2)]+rnorm(50,0,.1)),0)
> y[seq(2,100,by=2)]=round(exp(1.5*x[seq(2,100,by=2)]+rnorm(50,0,1)),0)
> weigh1=numeric(100)
> weigh1[seq(1,100,by=2)]=rep(5,50)
> weigh1[seq(2,100,by=2)]=rep(1,50)
>
>
> The -2*loglikelihood of both of these regressions is the same with lm,
> which makes sense. The scaling of the weights does not affect the
> log-likelihood.
>  > -2*logLik( lm(y~x, weights=weigh1))[1]
>  > -2*logLik( lm(y~x, weights=weigh1/3))[1]
>
> The -2*loglikelihood of these two regressions is different with glm:
>  > -2*logLik(glm(y~x, family="poisson", weights=weigh1))[1]
>  > -2*logLik(glm(y~x, family="poisson", weights=weigh1/3))[1]
>
> This means that the AIC and other model comparison techniques with this
> weighted Poisson regression are dependent on the scaling of the
> weights.  So I assume I misunderstand what the "weights" are doing in
> the glm function.
>
> -Tracy
>
>
>
> On 2/4/2014 12:56 PM, Rolf Turner wrote:
>>
>> On 04/02/14 20:12, IamRandom wrote:
>>
>>> I am running a simple example of GLM.  If I include weights when
>>> family="poisson" then the weights are calculated iteratively and
>>> $weights and $prior.weights return different values.  The $prior.weights
>>> are what I supplied and $weights are the "posterior" weights of the
>>> IWLS.  If I include weights with family="gaussian" then the weights are
>>> static, and $weights and $prior.weights return the same values; it seems
>>> to ignore the IWLS procedure.  I really want family="poisson"
>>> to behave like family="gaussian" and use the static weights.
>>> Thoughts?
>>
>> As far as I understand things, your desideratum makes no sense. The
>> prior weights and the just-plain-weights are very different creatures.
>> The reason they wind up being the same for the gaussian family is that
>> for the gaussian family the likelihood is maximized by least squares;
>> there is no need for iteration or for re-weighting.
>>
>> The poisson family cannot behave like the gaussian family because for
>> the poisson family (or any family *other* than gaussian) iteration is
>> necessary in order to maximize the likelihood.
>>
>> You might get some insight into what's going on if you were to read
>> Annette Dobson's book "An Introduction to Generalized Linear Models"
>> (Chapman and Hall, 1990).
>>
>> cheers,
>>
>> Rolf Turner
>>

```
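The behaviour discussed in the thread is easy to reproduce. Below is a minimal sketch (assuming a current R session; the seed, coefficients, and weight values are illustrative, not taken from the thread) showing that rescaling the weights leaves `logLik` from `lm` unchanged but rescales `logLik` from a Poisson `glm`, and that `$prior.weights` and `$weights` differ for the Poisson fit:

```r
# Weighted least squares vs. weighted Poisson GLM: effect of rescaling weights.
set.seed(50)
x <- seq(0, 1, length.out = 100)
y <- rpois(100, exp(1.5 * x))   # illustrative count data
w <- rep(c(5, 1), 50)           # alternating weights, as in the thread's example

# lm(): the overall scale of the weights is absorbed into the estimated
# error variance, so logLik is invariant to rescaling.
logLik(lm(y ~ x, weights = w))
logLik(lm(y ~ x, weights = w / 3))   # identical value

# glm(): prior weights multiply each observation's contribution to the
# Poisson log-likelihood, so rescaling them rescales logLik (and the AIC).
logLik(glm(y ~ x, family = poisson, weights = w))
logLik(glm(y ~ x, family = poisson, weights = w / 3))   # differs

# $prior.weights are the weights as supplied; $weights are the IWLS working
# weights at convergence (prior weight times fitted mean for the
# log-link Poisson), hence the two differ for the Poisson fit.
fit <- glm(y ~ x, family = poisson, weights = w)
head(cbind(prior = fit$prior.weights, working = fit$weights))
```

This is why the advice in the thread holds: for `lm` only relative weights matter, whereas for `glm` the weight scale enters the log-likelihood directly, so AIC comparisons are only meaningful between models fitted with the *same* weights.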