[R] Data handling/optimum glm method.

abigailclifton at me.com
Thu Mar 29 13:12:08 CEST 2012

Hi there,

I am trying to fit a generalised linear model to some loan application and default data. The eventual aim is to estimate the probability that an applicant will default.

However, R seems to crash or run out of memory when I run glm() on anything larger than a 5-way saturated model for my data.

My first question: is the best way to fit a generalised linear model in R to fit the saturated model and keep only the significant terms, or to start from the null model and work up to the optimum one?
I am importing a CSV file with 3500 rows and 27 columns (a 3500x27 matrix).
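For the second strategy (starting from the null model and building up), a minimal sketch in R is below. The data frame and variable names are invented for illustration, standing in for the poster's 3500x27 CSV; the point is the step() call, which adds terms by AIC instead of fitting the full saturated model:

```r
## Hedged sketch (illustrative data, not the poster's): forward selection
## from the null model with step(), rather than fitting the saturated model.
set.seed(1)
n  <- 500
df <- data.frame(income = rnorm(n), age = rnorm(n), debt = rnorm(n))
## Simulate a binary default outcome depending on debt and income.
df$default <- rbinom(n, 1, plogis(-1 + 0.8 * df$debt - 0.5 * df$income))

null_fit <- glm(default ~ 1, data = df, family = binomial)

## Step up from the null model by AIC; the upper scope here allows
## main effects and 2-way interactions, far smaller than a 5-way
## saturated model.
best_fit <- step(null_fit,
                 scope = list(lower = ~ 1,
                              upper = ~ (income + age + debt)^2),
                 direction = "forward", trace = 0)
summary(best_fit)
```

This keeps the largest model matrix ever constructed to the upper scope of the search, which is usually what makes the difference between fitting and crashing.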

My second question: is there any way to increase the memory available so that R can cope with larger models?
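Before raising limits, it can help to measure how large the model matrix actually gets. A small sketch, with invented factor data, using only base-R functions (object.size(), gc(), and the Windows-only memory.limit()):

```r
## Hedged sketch (illustrative data): gauge model-matrix size before glm().
## Each extra interaction order multiplies the number of columns, which is
## typically what exhausts memory with a high-order saturated model.
d <- data.frame(a = factor(sample(3, 100, replace = TRUE)),
                b = factor(sample(3, 100, replace = TRUE)),
                c = factor(sample(3, 100, replace = TRUE)))
print(object.size(model.matrix(~ .^2, data = d)), units = "Kb")  # 2-way terms
gc()                            # report memory in use / trigger collection
# memory.limit(size = 4000)     # Windows only: request ~4 GB address space
```

On non-Windows platforms memory is bounded by the operating system, so reducing the model formula (or using a 64-bit build of R) generally helps more than any in-session setting.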

I can send my code if it would help to answer the question.

Kind regards,

