Stochastic optimal control as non-equilibrium statistical mechanics: Calculus of variations over density and current

Vladimir Y. Chernyak, Michael Chertkov, Joris Bierkens, Hilbert J. Kappen

Research output: Contribution to journal › Article › peer-review


Abstract

In stochastic optimal control (SOC) one minimizes the average cost-to-go, which consists of the cost-of-control (amount of effort), the cost-of-space (where one wants the system to be), and the target cost (where one wants the system to arrive), for a system undergoing forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose a derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest a hydrodynamic interpretation, and discuss examples, e.g., ergodic control of a particle-within-a-circle, illustrating non-equilibrium space-time complexity.
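For orientation, the standard (vector-potential-free) SOC setup that the abstract builds on can be sketched as follows; the notation here (drift f, control u, state cost V, terminal cost φ, control weight R, noise strength ν) is illustrative and not necessarily the paper's:

```latex
% Controlled Langevin dynamics with drift f, control u, noise strength \nu:
%   dx_t = \big(f(x_t) + u(x_t,t)\big)\,dt + \sqrt{2\nu}\,dW_t
% Average cost-to-go (cost-of-control + cost-of-space + target cost):
J(x,t) = \min_{u}\,
  \mathbb{E}\!\left[\int_t^T \Big(\tfrac{1}{2}\,u^\top R\,u + V(x_s)\Big)\,ds
  \;+\; \phi(x_T)\right]
% Hamilton-Jacobi-Bellman equation; pointwise minimization over u
% yields the optimal control u^* = -R^{-1}\nabla J:
-\partial_t J = V + f\cdot\nabla J
  - \tfrac{1}{2}\,\nabla J^\top R^{-1}\,\nabla J + \nu\,\Delta J
```

The paper's extension adds a cost-of-dynamics term coupled to the probability current via a vector potential, which generalizes this HJB equation in a gauge-invariant way.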

Original language: English (US)
Article number: 022001
Journal: Journal of Physics A: Mathematical and Theoretical
Volume: 47
Issue number: 2
DOIs
State: Published - Jan 17 2014
Externally published: Yes

Keywords

  • Bellman-Hamilton-Jacobi equation
  • gauge transformations
  • non-equilibrium statistical physics
  • stochastic optimal control

ASJC Scopus subject areas

  • Statistical and Nonlinear Physics
  • Statistics and Probability
  • Modeling and Simulation
  • Mathematical Physics
  • General Physics and Astronomy

