Stochastic nonlinear control: A unified framework for stability, dissipativity, and optimality
In this work, we develop connections between stochastic stability theory and stochastic optimal control. First, we develop Lyapunov and converse Lyapunov theorems for stochastic semistable nonlinear dynamical systems. Semistability is the property whereby the solutions of a stochastic dynamical system almost surely converge to (not necessarily isolated) equilibrium points, Lyapunov stable in probability, determined by the system initial conditions. We then develop a unified framework for optimal nonlinear analysis and feedback control of nonlinear stochastic dynamical systems. Specifically, we provide a simplified, tutorial framework for stochastic optimal control, focusing on connections between stochastic Lyapunov theory and stochastic Hamilton-Jacobi-Bellman theory. In particular, we show that asymptotic stability in probability of the closed-loop nonlinear system is guaranteed by a Lyapunov function that is the solution to the steady-state form of the stochastic Hamilton-Jacobi-Bellman equation, thereby guaranteeing both stochastic stability and optimality. Moreover, extensions to stochastic finite-time and partial-state stability and optimal stabilization are also addressed. Finally, we extend the notion of dissipativity theory for deterministic dynamical systems to controlled Markov diffusion processes and show the utility of the general concept of dissipation for stochastic systems.
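For orientation, the steady-state stochastic Hamilton-Jacobi-Bellman equation mentioned in the abstract can be sketched in its standard form. The notation below (drift $F$, diffusion $D$, running cost $L$, value function $V$) is a generic illustration and is not taken from the thesis itself:

\[
0 \;=\; \min_{u}\Big[\, L(x,u) \;+\; V'(x)\,F(x,u) \;+\; \tfrac{1}{2}\,\operatorname{tr}\!\big( D(x)^{\mathsf{T}}\, V''(x)\, D(x) \big) \Big],
\qquad V(x_e) = 0,
\]

for the controlled Itô diffusion $\mathrm{d}x = F(x,u)\,\mathrm{d}t + D(x)\,\mathrm{d}w$. The bracketed expression is the infinitesimal generator of the closed-loop process applied to $V$ plus the running cost, so a nonnegative solution $V$ serves simultaneously as the optimal cost-to-go and as a stochastic Lyapunov function: along the optimal feedback $u = \phi(x)$ the generator satisfies $\mathcal{L}V(x) = -L(x,\phi(x)) \le 0$, which is the link between stochastic stability and optimality that the work develops.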
Showing items related by title, author, creator and subject.
Cordeiro, Helio de Miranda (Georgia Institute of Technology, 2008-12-16) A new approach was developed to determine the operational stability margin of a laboratory scale combustor. Applying modern and robust techniques and tools from Dynamical System Theory, the approach was based on three basic ...
L'afflitto, Andrea (Georgia Institute of Technology, 2015-04-03) Asymptotic stability is a key notion of system stability for controlled dynamical systems as it guarantees that the system trajectories are bounded in a neighborhood of a given isolated equilibrium point and converge to ...
Breen, Barbara J. (Georgia Institute of Technology, 2003-12-01)