Friday, June 28, 2013

1306.6572 (Vladimir Y. Chernyak et al.)

Stochastic Optimal Control as Non-equilibrium Statistical Mechanics:
Calculus of Variations over Density and Current

Vladimir Y. Chernyak, Michael Chertkov, Joris Bierkens, Hilbert J. Kappen
In Stochastic Optimal Control (SOC) one minimizes the average cost-to-go, which consists of the cost-of-control (the amount of effort), the cost-of-space (where one wants the system to be), and the target cost (where one wants the system to arrive), for a system undergoing forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose a derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest a hydrodynamic interpretation, and discuss examples, e.g., the ergodic control of a particle within a circle, illustrating non-equilibrium space-time complexity.
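
For context, the standard SOC setup that the abstract generalizes can be sketched roughly as follows; the notation (f, u, V, \phi, R, \nu) is generic and chosen here for illustration, not taken from the paper. The controlled Langevin dynamics and average cost-to-go are

dx_t = f(x_t, t)\,dt + u(x_t, t)\,dt + d\xi_t, \qquad \langle d\xi_t\, d\xi_t^{\top} \rangle = \nu\, dt,

J(x,t) = \mathbb{E}\Big[\, \phi(x_T) + \int_t^T \Big( \tfrac{1}{2}\, u^{\top} R\, u + V(x_s, s) \Big)\, ds \;\Big|\; x_t = x \Big],

and Bellman's optimality principle gives the Hamilton-Jacobi-Bellman equation

-\partial_t J = \min_u \Big[ \tfrac{1}{2}\, u^{\top} R\, u + V + (f + u)^{\top} \nabla J + \tfrac{1}{2}\, \mathrm{Tr}\big( \nu\, \nabla \nabla^{\top} J \big) \Big], \qquad u^{*} = -R^{-1} \nabla J.

Here \tfrac{1}{2} u^{\top} R u is the cost-of-control, V the cost-of-space, and \phi the target cost; the paper's extension adds a cost-of-dynamics coupling a vector potential to the current and recasts the optimization as a variation over density and current.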
View original: http://arxiv.org/abs/1306.6572
