[R] books about MCMC to use MCMC R packages?

Christophe Pouzat christophe.pouzat at univ-paris5.fr
Fri Sep 23 11:36:15 CEST 2005

This is the same mail as the previous one with a visible bibliography 
this time (sorry)...


I don't yet know of any book that presents MCMC methods with R examples, 
so I can't answer that part of your question. But I can suggest some 
general references (see the attached BibTeX file for details):

My favorite starting point is Radford Neal's review from 1993; you can 
download it from his web site.

Julian Besag's 2000 working paper is also a good starting point, 
especially for statisticians (you can download it as well).

If you're not scared of seeing the minus log-likelihood referred to as 
the energy, you can take a look at the physics literature (Sokal, 1996; 
Berg, 2004 and 2004b). It's a good way to learn about the tricks 
physicists use to get faster relaxation of their chains, like simulated 
annealing and the replica exchange method (also known as parallel 
tempering). These tricks were apparently first found by statisticians 
(Geyer, 1991; Geyer & Thompson, 1995; Ogata, 1995; review by Iba, 2001) 
but don't seem to attract much attention in the statistics community. In 
my experience they work spectacularly well.
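To make the replica exchange / parallel tempering idea concrete, here is a minimal sketch in base R. This is my own toy illustration, not code from any of the references: two random-walk Metropolis chains run at different inverse temperatures on a bimodal target, the "energy" is the minus log-density, and at each iteration a swap of the two chains' states is proposed.

```r
# Toy parallel tempering (replica exchange) sketch in base R.
# Target: a bimodal mixture 0.5*N(-3,1) + 0.5*N(3,1); the "energy" is
# the minus log-density, following the physics convention.

set.seed(1)

energy <- function(x) {
  -log(0.5 * dnorm(x, -3, 1) + 0.5 * dnorm(x, 3, 1))
}

n.iter <- 10000
betas  <- c(1, 0.2)        # inverse "temperatures": cold (target) and hot chain
x      <- c(0, 0)          # current state of each chain
cold   <- numeric(n.iter)  # stored samples from the cold chain

for (i in 1:n.iter) {
  # one random-walk Metropolis step per chain
  for (k in 1:2) {
    prop <- x[k] + rnorm(1, sd = 1)
    if (log(runif(1)) < betas[k] * (energy(x[k]) - energy(prop)))
      x[k] <- prop
  }
  # replica exchange: propose swapping the states of the two chains,
  # accepted with probability min(1, exp((beta1 - beta2) * (E1 - E2)))
  if (log(runif(1)) < (betas[1] - betas[2]) * (energy(x[1]) - energy(x[2])))
    x <- rev(x)
  cold[i] <- x[1]
}

# When the sampler mixes, the cold chain visits both modes and
# mean(cold > 0) is roughly 0.5.
mean(cold > 0)
```

The point of the trick: a single Metropolis chain at beta = 1 gets stuck in one mode, while the hot chain crosses the flattened energy barrier easily, and the swap moves feed those crossings back to the cold chain.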

Robert and Casella, 2004 is a thorough reference, with a bit too much on 
reversible jump techniques and not enough on the physicists' tricks (in 
my opinion, of course).

Liu, 2001 is a spectacular overview. He knows both the statistical and 
the physics literatures very well. But it's often frustrating because 
not enough details are given (for slow guys like me, at least).

Fishman, 1996 is very comprehensive, with much more than MCMC (which he 
calls "random tours").

Finally, a note of caution about MCMC methods can be useful; see Ripley, 
1996.
I hope that helps,


PS: the bibliography

@TechReport{Neal1993,
  Author         = {Neal, Radford M.},
  Title          = {Probabilistic {I}nference {U}sing {M}arkov {C}hain
                   {M}onte {C}arlo {M}ethods},
  Institution    = {Department of Computer Science, University of Toronto},
  Number         = {CRG-TR-93-1},
  URL            = {http://www.cs.toronto.edu/~radford/papers-online.html},
  year           = 1993
}

@TechReport{Besag2000,
  Author         = {Besag, Julian},
  Title          = {Markov {C}hain {M}onte {C}arlo for {S}tatistical
                   {I}nference},
  Type           = {Working Paper},
  Number         = {9},
  Abstract       = {These notes provide an introduction to Markov chain
                   Monte Carlo methods that are useful in both Bayesian
                   and frequentist statistical inference. Such methods
                   have revolutionized what can be achieved
                   computationally, primarily but not only in the Bayesian
                   paradigm. The account begins by describing ordinary
                   Monte Carlo methods, which, in principle, have exactly
                   the same goals as the Markov chain versions but can
                   rarely be implemented. Subsequent sections describe
                   basic Markov chain Monte Carlo, founded on the Hastings
                   algorithm and including both the Metropolis method and
                   the Gibbs sampler as special cases, and go on to
                   discuss more recent developments. These include Markov
                   chain Monte Carlo p-values, the Langevin-Hastings
                   algorithm, auxiliary variables techniques, perfect
                   Markov chain Monte Carlo via coupling from the past,
                   and reversible jumps methods for target spaces of
                   varying dimensions. Specimen applications, drawn from
                   several different disciplines, are described throughout
                   the notes. Several of these appear for the first time.
                   All computations use APL as the programming language,
                   though this is not necessarily a recommendation! The
                   author welcomes comments and criticisms.},
  eprint         = {http://www.csss.washington.edu/Papers/wp9.pdf},
  URL            = {http://www.csss.washington.edu/Papers/},
  month          = sep,
  year           = 2000
}

@Book{Liu2001,
  Author         = {Liu, Jun S.},
  Title          = {Monte {C}arlo {S}trategies in {S}cientific {C}omputing},
  Publisher      = {Springer Verlag},
  Series         = {Springer Series in Statistics},
  Edition        = {First},
  year           = 2001
}

@Book{RobertCasella2004,
  Author         = {Robert, Christian P. and Casella, George},
  Title          = {Monte {C}arlo statistical methods},
  Publisher      = {Springer-Verlag},
  Series         = {Springer Texts in Statistics},
  Address        = {New York},
  Edition        = {Second},
  isbn           = {0-387-21239-6},
  year           = 2004
}

@InCollection{Sokal1997,
  Author         = {Sokal, A.},
  Title          = {Monte {C}arlo methods in statistical mechanics:
                   foundations and new algorithms},
  BookTitle      = {Functional integration (Carg\`ese, 1996)},
  Publisher      = {Plenum},
  Volume         = {361},
  Series         = {NATO Adv. Sci. Inst. Ser. B Phys.},
  Pages          = {131--192},
  Address        = {New York},
  mrclass        = {82B80 (82B05 82B26 82B27)},
  mrnumber       = {MR1477456 (98k:82101)},
  mrreviewer     = {Emilio N. M. Cirillo},
  url            = {http://citeseer.nj.nec.com/sokal96monte.html},
  year           = 1997
}

@Misc{Berg2004a,
  Author         = {Berg, Bernd A.},
  Title          = {Introduction to Markov Chain Monte Carlo Simulations
                   and their Statistical Analysis},
  eprint         = {cond-mat/0410490},
  Abstract       = {This article is a tutorial on Markov chain Monte Carlo
                   simulations and their statistical analysis. The
                   theoretical concepts are illustrated through many
                   numerical assignments from the author's book on the
                   subject. Computer code (in Fortran) is available for
                   all subjects covered and can be downloaded from the
                   web.},
  url            = {http://fr.arxiv.org/abs/cond-mat/0410490},
  year           = 2004
}

@Book{Berg2004b,
  Author         = {Berg, Bernd A.},
  Title          = {Markov {C}hain {M}onte {C}arlo {S}imulations and their
                   {S}tatistical {A}nalysis},
  Publisher      = {World Scientific Publishing Company},
  isbn           = {9812389350},
  url            = {http://www.worldscibooks.com/physics/5602.html},
  year           = 2004
}

@Article{Iba2001,
  Author         = {Iba, Yukito},
  Title          = {Extended Ensemble Monte Carlo},
  Journal        = {Int. J. Mod. Phys.},
  Volume         = {C12},
  Pages          = {623--656},
  eprint         = {cond-mat/0012323},
  Abstract       = {``Extended Ensemble Monte Carlo'' is a generic term
                   that indicates a set of algorithms which are now
                   popular in a variety of fields in physics and
                   statistical information processing. Exchange Monte
                   Carlo (Metropolis-Coupled Chain, Parallel Tempering),
                   Simulated Tempering (Expanded Ensemble Monte Carlo),
                   and Multicanonical Monte Carlo (Adaptive Umbrella
                   Sampling) are typical members of this family. Here we
                   give a cross-disciplinary survey of these algorithms
                   with special emphasis on the great flexibility of the
                   underlying idea. In Sec.2, we discuss the background of
                   Extended Ensemble Monte Carlo. In Sec.3, 4 and 5, three
                   types of the algorithms, i.e., Exchange Monte Carlo,
                   Simulated Tempering, Multicanonical Monte Carlo are
                   introduced. In Sec.6, we give an introduction to
                   Replica Monte Carlo algorithm by Swendsen and Wang.
                   Strategies for the construction of special-purpose
                   extended ensembles are discussed in Sec.7. We stress
                   that an extension is not necessarily restricted to the
                   space of energy or temperature. Even unphysical
                   (unrealizable) configurations can be included in the
                   ensemble, if the resultant fast mixing of the Markov
                   chain offsets the increasing cost of the sampling
                   procedure. Multivariate (multi-component) extensions
                   are also useful in many examples. In Sec.8, we give a
                   survey on extended ensembles with a state space whose
                   dimensionality is dynamically varying. In the appendix,
                   we discuss advantages and disadvantages of three types
                   of extended ensemble algorithms.},
  url            = {http://fr.arxiv.org/abs/cond-mat/0012323},
  year           = 2001
}

@InProceedings{Geyer1991,
  Author         = {Geyer, C. J.},
  Title          = {Markov chain {M}onte {C}arlo maximum likelihood},
  BookTitle      = {Computing {S}cience and {S}tatistics: {P}roc. 23rd
                   {S}ymp. {I}nterface},
  Pages          = {156--163},
  year           = 1991
}

@Article{GeyerThompson1995,
  Author         = {Geyer, Charles J. and Thompson, Elizabeth A.},
  Title          = {Annealing {M}arkov {C}hain {M}onte {C}arlo with
                   {A}pplications to {A}ncestral {I}nference},
  Journal        = {Journal of the American Statistical Association},
  Volume         = {90},
  Number         = {431},
  Pages          = {909--920},
  Abstract       = {Markov chain Monte Carlo (MCMC; the
                   Metropolis-Hastings algorithm) has been used for many
                   statistical problems, including Bayesian inference,
                   likelihood inference, and tests of significance. Though
                   the method generally works well, doubts about
                   convergence often remain. Here we propose MCMC methods
                   distantly related to simulated annealing. Our samplers
                   mix rapidly enough to be usable for problems in which
                   other methods would require eons of computing time.
                   They simulate realizations from a sequence of
                   distributions, allowing the distribution being
                   simulated to vary randomly over time. If the sequence
                   of distributions is well chosen, then the sampler will
                   mix well and produce accurate answers for all the
                   distributions. Even when there is only one distribution
                   of interest, these annealing-like samplers may be the
                   only known way to get a rapidly mixing sampler. These
                   methods are essential for attacking very hard problems,
                   which arise in areas such as statistical genetics. We
                   illustrate the methods with an application that is much
                   harder than any problem previously done by MCMC,
                   involving ancestral inference on a very large genealogy
                   (7 generations, 2,024 individuals). The problem is to
                   find, conditional on data on living individuals, the
                   probabilities of each individual having been a carrier
                   of cystic fibrosis. Exact calculation of these
                   conditional probabilities is infeasible. Moreover, a
                   Gibbs sampler for the problem would not mix in a
                   reasonable time, even on the fastest imaginable
                   computers. Our annealing-like samplers have mixing
                   times of a few hours. We also give examples of samplers
                   for the "witch's hat" distribution and the conditional
                   Strauss process.},
  month          = sep,
  year           = 1995
}

@Article{Ogata1995,
  Author         = {Ogata, Yosihiko},
  Title          = {Markov {C}hain {M}onte {C}arlo {I}ntegration {T}hrough
                   {S}imulated {A}nnealing and {I}ts {A}pplication to
                   {L}ikelihood {C}omputation of {B}ayesian {M}odels},
  Journal        = {Bull. Int. Stat. Inst.},
  Volume         = {56},
  Number         = {4},
  Pages          = {1873--1891},
  year           = 1995
}

@Book{Ripley1996,
  Author         = {Ripley, B. D.},
  Title          = {Pattern {R}ecognition and {N}eural {N}etworks},
  Publisher      = {Cambridge University Press},
  url            = {http://www.stats.ox.ac.uk/~ripley/PRbook/},
  year           = 1996
}

@Book{Fishman1996,
  Author         = {Fishman, George S.},
  Title          = {Monte {C}arlo. {C}oncepts, {A}lgorithms, and
                   {A}pplications},
  Publisher      = {Springer Verlag},
  Series         = {Springer Series in Operations Research},
  Edition        = {First},
  year           = 1996
}

Molins, Jordi wrote:

>Dear list users,
>I need to learn about MCMC methods, and since there are several packages in
>R that deal with this subject, I want to use them. 
>I want to buy a book (or more than one, if necessary) that satisfies the
>following requirements:
>- it teaches well MCMC methods;
>- it is easy to implement numerically the ideas of the book, and notation
>and concepts are similar to the corresponding R packages that deal with MCMC
>I have done a search and 2 books seem to satisfy my requirements:
>- Markov Chain Monte Carlo In Practice, by W.R. Gilks and others.
>- Monte Carlo Statistical methods, Robert and Casella.
>What do people think about these books? Is there a suggestion of some other
>book that could satisfy better my requirements?
>Thank you very much in advance.
>The information contained herein is confidential and is inte...{{dropped}}

A Master Carpenter has many tools and is expert with most of them. If you
only know how to use a hammer, every problem starts to look like a nail.
Stay away from that trap.
Richard B Johnson.

Christophe Pouzat
Laboratoire de Physiologie Cerebrale
UFR biomedicale de l'Universite Paris V
45, rue des Saints Peres
75006 PARIS

tel: +33 (0)1 42 86 38 28
fax: +33 (0)1 42 86 38 30
web: www.biomedicale.univ-paris5.fr/physcerv/C_Pouzat.html
