[R] Getting choropleth map intervals correct

LCOG1 jroll at lcog.org
Wed Mar 24 01:16:16 CET 2010


Hello all, 
I'm working on mapping some probabilities in R onto a geographic unit called a
TAZ.  The code below should reproduce my problem, but you will need to point
the shape-file path at your own directory.  I've never posted data like this
before, so hopefully it works.  ResProbs is just meant to be a value between
0 and 1; sorry if the way I generate it is more complicated than it needs to be.


library(maptools)   # readShapeSpatial() lives here

TazFile <- "*directory*/TAZ.shp"   # point this at wherever you saved the shape file
TazShape <- readShapeSpatial(TazFile)
TazShape <- TazShape[order(TazShape$TAZ_NUM), ]

## Fake data: one probability per TAZ, forced into 0-1
ResTaz <- 25:666
ResProbs <- rnorm(642, 0:1)
ResProbs[ResProbs > 1] <- .5
ResProbs[ResProbs < -1] <- .2   # mind the space: "ResProbs<-1" parses as assignment
ResProbs <- abs(ResProbs)

## Attach the probabilities to the polygons by TAZ number
ResProbs.. <- data.frame(Taz = ResTaz, SFsubM = ResProbs)
TazShape$SFsubM <- ResProbs..$SFsubM[match(TazShape$TAZ_NUM, ResProbs..$Taz)]

## Breaks at each .1, then colour by bin -- this is where it falls over
brks <- cut(TazShape$SFsubM, breaks = c(seq(0, 1, by = 0.1), Inf),
            right = TRUE, include.lowest = TRUE)
cols <- grey((length(brks):2) / length(brks))
plot(TazShape, col = cols[findInterval(TazShape$SFsubM, brks,
                                       all.inside = TRUE)])

I get the error:
Error in findInterval(TazShape$SFsubM, brks, all.inside = TRUE) : 
  'vec' must be sorted non-decreasingly

The code I adapted this from created "brks" with quantile(), returning:

   0%        10%        20%        30%        40%        50%        60% 
0.00000000 0.03858501 0.07693546 0.11647164 0.14702968 0.18308665 0.22484961 
       70%        80%        90%       100% 
0.26566555 0.31217598 0.39463130 0.73439360 

Quantile breaks are not what I want, though; I want the data put into fixed
"bins" of equal width.  I tried to remedy that with the cut() call above for
"brks", which now returns intervals instead.
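
For what it's worth, here is what that cut() call actually gives me
(abbreviated): a factor of interval labels rather than the numeric cut
points, which I'm guessing is what findInterval() is objecting to with the
'vec' message.

class(brks)
## [1] "factor"
levels(brks)
## [1] "[0,0.1]" "(0.1,0.2]" "(0.2,0.3]" ... "(0.9,1]" "(1,Inf]"  (abbreviated)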

So basically what I'm after is a map that illustrates the probability for
each geographic unit in my shape file, with class breaks at each .1, using
grey or any other colour for that matter.  Thanks
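
P.S. Is something along these lines closer to the right idea?  Just an
untested sketch of my best guess at a fix: pass findInterval() the numeric
break points themselves instead of the factor that cut() returns, and build
one grey per bin.

brks <- seq(0, 1, by = 0.1)                     # numeric break points at each .1
nbin <- length(brks) - 1                        # 10 bins
cols <- grey(seq(0.9, 0.1, length.out = nbin))  # light grey = low, dark = high
bin  <- findInterval(TazShape$SFsubM, brks, all.inside = TRUE)  # bin index 1..10
plot(TazShape, col = cols[bin])
legend("topleft", fill = cols,
       legend = paste(head(brks, -1), tail(brks, -1), sep = " - "))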

Shape file attached: http://n4.nabble.com/file/n1679914/TAZ.shp (TAZ.shp)


