Has anyone considered the possible advantages of Hamiltonian Monte
Carlo (http://www.cs.toronto.edu/~radford/) or Riemann manifold Langevin
Monte Carlo (http://www.dcs.gla.ac.uk/inference/rmhmc/) in the context
of mixed modeling?
The latter paper refers to early work by Bates and Watts (1980), and it
appears that the study of geometric aspects of regression goes back
(at least) to the Ph.D. thesis of D.M. Bates, so perhaps readers of this
list can provide some historical perspective.
A key advantage of HMC is that it avoids getting stuck in the slow
exploration of a Metropolis random walk by taking longer, more directed
steps that still have a high probability of acceptance. The downside is
that a lot of tuning is required, and the new Riemann manifold method is
advertised to address this issue by using the intrinsic geometry of the
model to automate some of this manual fine tuning (at the considerable
expense of computing the metric tensor and its derivatives).
But it is never really established why an intrinsic geometric
approach is practically relevant (and worth the expense). A similar
concern was expressed in some of the published comments following
Bates and Watts (1980): to a geometer a squashed beer can is the
same as a full one, and a donut is the same as a coffee cup.
It may be very helpful conceptually to understand the underlying
intrinsic geometric structure, but if that were all that mattered
there would be no advantage to eigenvalue decompositions, QR
factorizations, and other changes of variables that obviously have
very practical consequences.
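As one small numerical illustration of those practical consequences (the design matrix, collinearity level, and noise below are all made up for the example), compare solving a least-squares problem through a QR factorization with solving it through the normal equations; forming X'X squares the condition number of the problem, while the QR route works with X directly:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((100, 3))
X[:, 2] = X[:, 0] + 1e-4 * rng.standard_normal(100)  # nearly collinear column
beta_true = np.array([1.0, 2.0, 3.0])
y = X @ beta_true + 0.01 * rng.standard_normal(100)

# QR route: with X = QR, the least-squares problem reduces to R beta = Q'y
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ y)

# normal-equations route: forming X'X squares the condition number of X
beta_ne = np.linalg.solve(X.T @ X, X.T @ y)
```

With mild collinearity both routes succeed, but as the collinearity tightens the normal-equations route loses accuracy first, which is exactly the kind of practical payoff a change of variables can have even though both formulations are "the same" problem.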
Thus another question: what is the current applied view of the
geometric ideas discussed in Bates and Watts (1980)? The same
journal issue included a number of papers on the geometry of
statistical inference, including one by Efron.
My (possibly incorrect) impression is that well-designed
adaptive coordinate transformations have overshadowed
the use of intrinsic methods.
Finally, there seems to be a link between HMC and mixed modeling,
because the introduction of the momentum variables looks a lot like
the introduction of random effects.
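To make that analogy concrete, here is a minimal sketch of one HMC transition (the standard-normal target, step size, and trajectory length are arbitrary assumptions, not tuned values): the momentum p is a freshly drawn auxiliary Gaussian variable, much as a random effect is an auxiliary latent Gaussian in a mixed model, and it is discarded after the update.

```python
import numpy as np

def logp(q):
    # log density of the target (standard normal, up to a constant)
    return -0.5 * np.dot(q, q)

def grad_logp(q):
    return -q

def hmc_step(q, eps=0.1, n_leapfrog=20, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    # auxiliary momentum draw -- the "random effect" of the analogy
    p = rng.standard_normal(q.shape)
    q_new, p_new = q.copy(), p.copy()
    # leapfrog integration of the Hamiltonian dynamics
    p_new = p_new + 0.5 * eps * grad_logp(q_new)
    for i in range(n_leapfrog):
        q_new = q_new + eps * p_new
        if i < n_leapfrog - 1:
            p_new = p_new + eps * grad_logp(q_new)
    p_new = p_new + 0.5 * eps * grad_logp(q_new)
    # Metropolis accept/reject on the joint (q, p) energy
    h_old = -logp(q) + 0.5 * np.dot(p, p)
    h_new = -logp(q_new) + 0.5 * np.dot(p_new, p_new)
    if rng.random() < np.exp(h_old - h_new):
        return q_new, True
    return q, False
```

The directed, high-acceptance moves come from the leapfrog trajectory; the tuning burden mentioned above is visible in the step size eps and trajectory length n_leapfrog, which are exactly the knobs the Riemann manifold approach tries to set automatically.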