Abstract
In stochastic optimal control (SOC) one minimizes the average cost-to-go, which consists of the cost of control (the amount of effort), the cost of space (where one wants the system to be), and the target cost (where one wants the system to arrive), for a system undergoing forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics term, characterized by a vector potential. We derive the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest a hydrodynamic interpretation, and discuss examples, e.g. the ergodic control of a particle within a circle, illustrating non-equilibrium space-time complexity.
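For context, the standard SOC setup the abstract builds on can be sketched as follows (a sketch under common conventions; the symbols $f$, $u$, $V$, $\Phi$, $\kappa$, $S$ are illustrative and not taken from the paper itself):

```latex
% Controlled Langevin dynamics: drift f (force), control u, noise strength kappa
dx = \bigl(f(x) + u(x,t)\bigr)\,dt + \sqrt{\kappa}\, dW_t

% Average cost-to-go: cost-of-control |u|^2/2, cost-of-space V, target cost Phi
J[u] = \mathbb{E}\left[\int_0^T \left(\tfrac{1}{2}|u(x,t)|^2 + V(x)\right) dt
       + \Phi(x(T))\right]

% Hamilton-Jacobi-Bellman equation for the optimal cost-to-go S(x,t),
% after minimizing over u (optimal control u^* = -\nabla S):
-\partial_t S = f\cdot\nabla S - \tfrac{1}{2}|\nabla S|^2 + V
                + \tfrac{\kappa}{2}\,\Delta S
```

The paper's extension adds a vector-potential coupling to the current (the cost-of-dynamics), which makes the resulting generalized HJB equation gauge-invariant.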
| Original language | English (US) |
|---|---|
| Article number | 022001 |
| Journal | Journal of Physics A: Mathematical and Theoretical |
| Volume | 47 |
| Issue number | 2 |
| DOIs | |
| State | Published - Jan 17 2014 |
| Externally published | Yes |
Keywords
- Bellman-Hamilton-Jacobi equation
- gauge transformations
- non-equilibrium statistical physics
- stochastic optimal control
ASJC Scopus subject areas
- Statistical and Nonlinear Physics
- Statistics and Probability
- Modeling and Simulation
- Mathematical Physics
- General Physics and Astronomy