[R] gbm question

David Katz dkatz at tibco.com
Tue Nov 22 00:35:25 CET 2016


R-Help,

Please help me understand why these models and predictions are different:


library(gbm)

set.seed(32321)
N <- 1000
X1 <- runif(N)
X2 <- 2 * runif(N)
X3 <- ordered(sample(letters[1:4], N, replace = TRUE), levels = letters[4:1])
X4 <- factor(sample(letters[1:6], N, replace = TRUE))
X5 <- factor(sample(letters[1:3], N, replace = TRUE))
X6 <- 3 * runif(N)
mu <- c(-1, 0, 1, 2)[as.numeric(X3)]

SNR <- 10  # signal-to-noise ratio
Y <- X1^1.5 + 2 * (X2^0.5) + mu
sigma <- sqrt(var(Y) / SNR)
Y <- Y + rnorm(N, 0, sigma)

# introduce some missing values
X1[sample(1:N, size = 500)] <- NA
X4[sample(1:N, size = 300)] <- NA

data <- data.frame(Y = Y, X1 = X1, X2 = X2, X3 = X3, X4 = X4, X5 = X5, X6 = X6)
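
# As a quick sanity check (just an untested base-R sketch, nothing gbm-specific):
# the gbm.fit() call further down receives data[, -1] as x, so its columns should
# be exactly the formula's predictors X1..X6, in the same order.
stopifnot(identical(names(data)[-1], paste0("X", 1:6)))
str(data)  # confirm the factor levels of X3-X5 and the NA counts in X1 and X4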

set.seed(32321)
gbm.formula <-
    gbm(Y ~ X1 + X2 + X3 + X4 + X5 + X6,  # formula
        data = data,                 # data set
        distribution = "gaussian",   # see the help for other choices
        n.trees = 1000,              # number of trees
        shrinkage = 0.05,            # shrinkage or learning rate;
                                     # 0.001 to 0.1 usually work
        interaction.depth = 3,       # 1: additive model, 2: two-way interactions, etc.
        bag.fraction = 0.5,          # subsampling fraction; 0.5 is probably best
        train.fraction = 1,          # fraction of data for training;
                                     # first train.fraction*N rows used for training
        n.minobsinnode = 10,         # minimum total weight needed in each node
        keep.data = TRUE,            # keep a copy of the dataset with the object
        verbose = FALSE)             # don't print out progress



set.seed(32321)
gbm.Fit <-
    gbm.fit(x = data[, -1], y = Y,   # predictors as a data frame, response as a vector
        distribution = "gaussian",   # see the help for other choices
        n.trees = 1000,              # number of trees
        shrinkage = 0.05,            # shrinkage or learning rate;
                                     # 0.001 to 0.1 usually work
        interaction.depth = 3,       # 1: additive model, 2: two-way interactions, etc.
        bag.fraction = 0.5,          # subsampling fraction; 0.5 is probably best
        nTrain = length(Y),          # number of rows used for training;
                                     # first nTrain rows used for training
        n.minobsinnode = 10,         # minimum total weight needed in each node
        keep.data = TRUE,            # keep a copy of the dataset with the object
        verbose = FALSE)             # don't print out progress





all.equal(predict(gbm.formula, n.trees = 100), predict(gbm.Fit, n.trees = 100))
# [1] "Mean relative difference: 0.3585409"

# all.equal(gbm.formula, gbm.Fit)   # the fitted objects themselves are not equal either

(The code above is based on the gbm package examples.)
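
One check that might narrow down where the difference comes from, as an untested
sketch using the same calls as above but with bag.fraction = 1 so the random
subsampling is removed: if the two interfaces still disagree without bagging,
the difference lies in how they prepare the data rather than in how they consume
the random number stream.

set.seed(32321)
formula.nobag <- gbm(Y ~ X1 + X2 + X3 + X4 + X5 + X6, data = data,
                     distribution = "gaussian", n.trees = 1000,
                     shrinkage = 0.05, interaction.depth = 3,
                     bag.fraction = 1, train.fraction = 1,
                     n.minobsinnode = 10, keep.data = TRUE, verbose = FALSE)

set.seed(32321)
fit.nobag <- gbm.fit(x = data[, -1], y = Y,
                     distribution = "gaussian", n.trees = 1000,
                     shrinkage = 0.05, interaction.depth = 3,
                     bag.fraction = 1, nTrain = length(Y),
                     n.minobsinnode = 10, keep.data = TRUE, verbose = FALSE)

all.equal(predict(formula.nobag, n.trees = 100),
          predict(fit.nobag, n.trees = 100))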

Thanks

*David Katz* | IAG, TIBCO Spotfire



