Computing marginals of probability distributions (magnetizations, spin-spin correlations,
etc.) is an important problem in the statistical physics of disordered systems and many
other fields. In decision problems one typically wants to estimate the likelihood
that a single variable takes a given value, given the interactions with all the
other variables, then fix that variable to its most likely value, and so estimate the
most likely configuration of all the variables. An important technical application
is decoding in communication systems, where the task is to estimate the code word
most likely to have been sent from the recieved signal and what one knows about the
channel.
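The fix-and-repeat procedure above can be illustrated with a minimal sketch (assumptions: a small pairwise Ising model, marginals computed by brute-force enumeration, and the illustrative function names `marginals` and `decimate`, none of which come from the talk itself):

```python
import itertools
import math

def marginals(J, h, beta=1.0):
    """Exact single-spin magnetizations of a small Ising system by enumeration:
    P(s) proportional to exp(beta * (sum_{i<j} J[i][j] s_i s_j + sum_i h[i] s_i))."""
    n = len(h)
    mags = [0.0] * n
    Z = 0.0
    for s in itertools.product([-1, 1], repeat=n):
        E = sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
        E += sum(h[i] * s[i] for i in range(n))
        w = math.exp(beta * E)
        Z += w
        for i in range(n):
            mags[i] += w * s[i]
    return [m / Z for m in mags]

def decimate(J, h, beta=1.0):
    """Greedy decision procedure: repeatedly fix the most polarized spin to its
    most likely value, absorbing it into the fields of the remaining spins."""
    n = len(h)
    h = h[:]
    fixed = {}
    free = list(range(n))
    while free:
        m = marginals([[J[i][j] for j in free] for i in free],
                      [h[i] for i in free], beta)
        k = max(range(len(free)), key=lambda a: abs(m[a]))
        i = free[k]
        s = 1 if m[k] >= 0 else -1
        fixed[i] = s
        free.remove(i)
        for j in free:
            h[j] += J[i][j] * s  # a fixed spin acts as an extra local field
    return fixed
```

Enumeration costs 2^n, which is exactly why the approximate methods discussed next matter; the sketch only fixes the ideas.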
In principle one can compute such marginals by Monte Carlo sampling, but such methods are
inherently slow for large systems. Over the last decade there has therefore been intense
interest in deterministic approximate methods for computing marginals of probability
distributions (the cavity method, Belief Propagation, iterative decoding). These all work
by having nodes (representing variables) send messages to one another, and collectively
they yield estimates of the marginals that are often very good and sometimes exact. In
the language of physics, all these methods extremize a variational ansatz for the
Boltzmann-Gibbs free energy of an equilibrium statistical mechanical system, in which the
messages are related to Lagrange multipliers.
I will describe recent work extending message passing from computing the stationary
states of Markov chains that obey detailed balance to those that do not. This
"dynamic cavity method" works to some extent, and in particular outperforms dynamic
mean-field methods on standard kinetic (non-equilibrium) Ising systems.
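The dynamic cavity method itself is beyond the scope of a short example, but for context, here is a minimal sketch (with illustrative function names, not from the talk) of the kind of system it targets: a kinetic Ising model under synchronous Glauber dynamics with possibly asymmetric couplings, so detailed balance can be violated, together with the naive dynamic mean-field baseline it is compared against:

```python
import math
import random

def glauber_step(s, J, h, beta, rng):
    """One synchronous (parallel) Glauber update of a kinetic Ising model:
    spin i becomes +1 with probability (1 + tanh(beta * field_i)) / 2, where
    field_i is computed from the current configuration. J need not be
    symmetric, so the dynamics need not satisfy detailed balance."""
    n = len(s)
    out = []
    for i in range(n):
        field = h[i] + sum(J[i][j] * s[j] for j in range(n))
        p_up = 0.5 * (1.0 + math.tanh(beta * field))
        out.append(1 if rng.random() < p_up else -1)
    return out

def naive_mean_field_trajectory(m0, J, h, beta, steps):
    """Naive dynamic mean-field: replace each neighbour by its mean,
    m_i(t+1) = tanh(beta * (h_i + sum_j J[i][j] * m_j(t)))."""
    m = m0[:]
    traj = [m]
    for _ in range(steps):
        m = [math.tanh(beta * (h[i] + sum(J[i][j] * m[j]
                                          for j in range(len(m)))))
             for i in range(len(m))]
        traj.append(m)
    return traj
```

The naive update ignores all correlations between spins; the dynamic cavity method improves on it by passing messages that track the history dependence along the interaction graph.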
This is joint work with Hamed Mahmoudi (2011a-c), building on earlier work by Montanari
and co-workers (2009) and Neri and Bolle (2009).