Lehrstuhl für Mechatronik in Maschinenbau und Fahrzeugtechnik (MEC)

Optimal Stochastic Control

Description

Stochastic control theory constitutes a rapidly growing research area within stochastic analysis, with many applications in, e.g., biology, physics, engineering, finance, and economics.

Stochastic differential equations (SDEs) are a useful tool to model diverse random processes, such as the time evolution of stock prices, outdoor temperatures, precipitation amounts, infection numbers, or the population size of tumor cells.
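As an illustration (a minimal sketch, not part of the project's own code; all parameter values are arbitrary placeholders), the following Python snippet simulates one path of a geometric Brownian motion dX_t = mu X_t dt + sigma X_t dB_t, a standard SDE model for a stock price, using the Euler-Maruyama scheme.

    import numpy as np

    def euler_maruyama_gbm(x0=100.0, mu=0.05, sigma=0.2, T=1.0, n_steps=250, seed=0):
        # Simulate one path of dX_t = mu*X_t dt + sigma*X_t dB_t (geometric Brownian motion)
        rng = np.random.default_rng(seed)
        dt = T / n_steps
        x = np.empty(n_steps + 1)
        x[0] = x0
        for k in range(n_steps):
            dB = rng.normal(0.0, np.sqrt(dt))  # Brownian increment over [t_k, t_{k+1}]
            x[k + 1] = x[k] + mu * x[k] * dt + sigma * x[k] * dB
        return x

    path = euler_maruyama_gbm()
    print(path[-1])  # simulated stock price at terminal time T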

In order to solve stochastic optimization problems, two main approaches can be applied:

  • dynamic programming and the Hamilton-Jacobi-Bellman (HJB) theory, or
  • stochastic maximum principles.      

Both methods model the underlying state process by a controlled stochastic differential equation and seek to maximize or minimize an objective functional with respect to the control process.
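In a standard one-dimensional setting (a textbook formulation as, e.g., in Yong & Zhou (1999), not the specific model studied in this project), the problem reads in LaTeX notation:

    \begin{align*}
      dX_t &= b(t, X_t, u_t)\,dt + \sigma(t, X_t, u_t)\,dB_t, \qquad X_0 = x_0, \\
      J(u) &= \mathbb{E}\Big[\int_0^T f(t, X_t, u_t)\,dt + g(X_T)\Big] \;\to\; \max_{u \in \mathcal{A}},
    \end{align*}

where B is a Brownian motion, u the control process, and \mathcal{A} the set of admissible controls. In the dynamic programming approach, the value function V(t, x) solves, under suitable regularity assumptions, the HJB equation

    \partial_t V(t,x) + \sup_{u}\Big\{ b(t,x,u)\,\partial_x V(t,x) + \tfrac{1}{2}\,\sigma^2(t,x,u)\,\partial_{xx} V(t,x) + f(t,x,u) \Big\} = 0, \qquad V(T,x) = g(x).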

The optimal control solving the optimization problem is usually a stochastic process itself.

A standard version of the stochastic maximum principle can be found in Øksendal & Sulem (2019); many extensions of the methodology have been published for stochastic setups involving, e.g., time delay, partial or anticipative information, or forward-backward SDEs.
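In the simple Brownian setting sketched above (a sketch only, without jumps and omitting the required concavity and integrability conditions; cf. Øksendal & Sulem (2019)), the sufficient stochastic maximum principle introduces the Hamiltonian and the adjoint backward SDE

    H(t,x,u,p,q) = f(t,x,u) + b(t,x,u)\,p + \sigma(t,x,u)\,q,
    dp_t = -\partial_x H(t, X_t, u_t, p_t, q_t)\,dt + q_t\,dB_t, \qquad p_T = g'(X_T);

a candidate optimal control then maximizes the Hamiltonian pointwise, u_t^* \in \arg\max_u H(t, X_t^*, u, p_t, q_t).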
 

Goals

  • to derive new theoretical results in the area of stochastic optimal control theory.
  • to use these results in diverse practical applications.
     

References

Optimal equivalent probability measures under enlarged filtrations
Journal of Optimization Theory and Applications (JOTA) 183, 2019, Pages 813-839. DOI
M. Hess

An anticipative stochastic minimum principle under enlarged filtrations
Stochastic Analysis and Applications 39(2), 2020, Pages 252-277. DOI
M. Hess

Applied stochastic control of jump diffusions
Springer Nature Switzerland AG, Cham, 3rd edition, 2019. DOI
B. Øksendal, A. Sulem

A maximum principle for stochastic control with partial information
Stochastic Analysis and Applications 25(3), 2007, Pages 705-715. DOI
F. Baghery, B. Øksendal

Stochastic controls: Hamiltonian systems and HJB equations
Springer Science & Business Media 43, 1999. DOI
J. Yong, X. Zhou

 

Keywords

  • Stochastic Control Problem
  • Stochastic Maximum Principle
  • Dynamic Programming
  • Hamilton-Jacobi-Bellman Theory
  • Stochastic Differential Equation
     

Contact

Dr. Markus Hess
Gottlieb-Daimler-Str. 42
67663 Kaiserslautern
Phone: +49 (0)631/205-3706
Fax: +49 (0)631/205-4201
markus.hess(at)mv.rptu.de

Dr. Sandesh Hiremath
Gottlieb-Daimler-Str. 42
67663 Kaiserslautern
Phone: +49 (0)631/205-3455
Fax: +49 (0)631/205-4201
sandesh.hiremath(at)mv.uni-kl.de

 

Funding

State of Rhineland-Palatinate

Time span

Since 2021
 
