Optimal Steering of a Linear Stochastic System to a Final Probability Distribution, Part I

PAVON, MICHELE
2016

Abstract

We consider the problem of steering a linear dynamical system with complete state observation from an initial Gaussian distribution in state space to a final one with minimum-energy control. The system is stochastically driven through the control channels; an example of such a system is an inertial particle experiencing random "white noise" forcing. We show that a target probability distribution can always be achieved in finite time. The optimal control is given in state-feedback form and is computed explicitly by solving a pair of differential Lyapunov equations that are coupled through their boundary values. Given its attractive algorithmic nature, this result appears to have several potential applications, such as quality control, industrial processes, active control of nanomechanical systems, and molecular cooling. The problem of steering a diffusion process between end-point marginals has a long history (Schroedinger bridges); accordingly, the present case of steering a linear stochastic system constitutes a Schroedinger bridge for possibly degenerate diffusions. Our results, however, provide the first {\em implementable} form of the optimal control for a general Gauss-Markov process. Illustrative examples of the optimal evolution and control for inertial particles and a stochastic oscillator are provided. A final result establishes directly, in the present context of linear stochastic systems, the property of Schroedinger bridges as the most likely random evolution between given marginals.
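To make the problem statement concrete, the following is a minimal LaTeX sketch of the kind of formulation the abstract describes; the specific symbols ($A$, $B$, $w$, the horizon $T$, the Gaussian marginals $\mathcal{N}(m_0,\Sigma_0)$ and $\mathcal{N}(m_1,\Sigma_1)$, and the feedback gain $\Pi$) are illustrative assumptions and not notation taken from the paper itself.

\[
dx(t) = A(t)\,x(t)\,dt + B(t)\,u(t)\,dt + B(t)\,dw(t), \qquad x(0) \sim \mathcal{N}(m_0,\Sigma_0),
\]
\[
\min_{u}\ \mathbb{E}\!\left[\int_0^T \|u(t)\|^2\,dt\right] \quad \text{subject to} \quad x(T) \sim \mathcal{N}(m_1,\Sigma_1),
\]

where the noise $w$ enters through the same channels as the control, as stated in the abstract. Under these assumptions the optimal control would take a linear state-feedback form such as
\[
u^{*}(t) = -B(t)^{\mathsf{T}}\,\Pi(t)\,x(t)
\]
(plus a feedforward term when the marginal means differ), with the gain obtained from the pair of differential Lyapunov equations, coupled through their boundary values, that the abstract refers to.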
Files in this record:
File: 1408.2222.pdf
Access: open access
Description: pdf
Type: Published (publisher's version)
License: Creative Commons
Size: 1.21 MB
Format: Adobe PDF

Use this identifier to cite or link to this document: https://hdl.handle.net/11577/3216965
Citations
  • Scopus: 151
  • Web of Science (ISI): 125