
ECE 553 - Optimal Control

Spring 2008, ECE, University of Illinois at Urbana-Champaign

"Since the building of the universe is perfect and is created
by the wisest Creator, nothing arises in the universe in which
one cannot see the sense of some maximum or minimum."

                                                 ---- L. Euler



Homeworks & Solutions

Note: If you find errors in the problem or solution sets, please contact me.


Useful Resources

Administrative Information

Instructor: Professor Yi Ma
Lectures: Monday & Wednesday 12:30pm-1:50pm, 241 Everitt Lab
Office hours: Tu 1:30pm-3:00pm, 145 Coordinated Science Lab
Office: 145 CSL, Phone: 244-0871
After hour appointments: through email.

Course Description:
This is a second-year graduate course on control theory and systems. Prerequisites include an introductory graduate-level control course equivalent to ECE 515, and some background in probability at the level of ECE 413 (or, better, ECE 534). In addition, you should be comfortable with basic notions from set theory (unions, intersections, sequences, etc.) and from signals and systems (matrix manipulations, linear differential or difference equations, calculus).

This course focuses on the theoretical and algorithmic foundations of optimal control theory, and will deal primarily with deterministic dynamical systems described in continuous time. Some aspects of stochastic and nonlinear control systems will be covered. If time allows, extensions of the single-criterion dynamic optimization theory to game-theoretic approaches, as well as H-infinity optimal control, will be discussed at the end.
Recommended Texts:
*Dimitri P. Bertsekas, Dynamic Programming and Optimal Control, Volume I, 3rd edition, Athena Scientific, 2005.
*M. Athans and P.L. Falb, Optimal Control, McGraw Hill, 2007 (paper back).
*I. M. Gel'fand and S. V. Fomin, Calculus of Variations, Dover Publications, 2000.
Reserved References (in the engineering library):
*S.P. Sethi and G.L. Thompson, Optimal Control Theory, 2nd edition, Kluwer Academic Publishers, 2000.
*B.D.O. Anderson and J.B. Moore, Optimal Control: Linear Quadratic Methods, Prentice Hall, 1990.
*Andrew P. Sage and Chelsea C. White, Optimum Systems Control, 2nd edition, Prentice Hall, 1977. (Out of print but reserved in the engineering library; some chapters will be handed out in hardcopy.)
*Tamer Basar and Pierre Bernhard, H-infinity Optimal Control and Related Minimax Design Problems: A Dynamical Game Approach, second edition, Birkhauser, 1995. (Available at the bookstore).
*A.E. Bryson and Y.C. Ho, Applied Optimal Control, 2nd ed., Blaisdel, 1975.
*E.B. Lee and L. Markus, Foundations of Optimal Control Theory, Wiley, 1967.
*F.L. Lewis and V.L. Syrmos, Optimal Control, Wiley, 1995.
*P.V. Kokotovic and H.K. Khalil and J. O'Reilly, Singular Perturbation Methods in Control, Academic Press, 1986.
*T. Basar and G.J. Olsder, Dynamic Noncooperative Game Theory, SIAM Classics in Applied Mathematics, 1999.
There are plenty of useful books on optimal control and the calculus of variations on reserve in the engineering library. I recommend you find one that fits your background and taste. The process of finding the best reference (for yourself) is itself the most rewarding.
Grading Policy: Homework (25%), Midterm (25%), and Final Exam (50%). Grading will be on a curve (so do not panic if you get low scores on your tests - others probably did worse than you).
*Homework (25%): You are allowed to discuss the homework in small groups (of 2-3 persons), but you must write up the solutions you hand in independently. No late homework will be accepted (unless an extension is granted by the instructor to the whole class).
*Midterm Exam (25%), Time: Tuesday 7-9pm, March 11th. Place: TBA.
*Final Exam (50%), Time: Thursday 8-11am, May 8th; Place: EL 241. If you are among the top 10% in the final, you will get an "A" no matter how badly you have flunked your midterm or homeworks. But if you want to get an "A+", you need to do well in all three categories.
Course Outline:
I. Introduction
1. Formulation of optimal control problems
2. Parameter optimization versus path optimization
3. Local and global optima; general conditions on existence and uniqueness.
4. Some basic facts from finite-dimensional optimization.
II. The Calculus of Variations
1. The Euler-Lagrange equation
2. Path optimization subject to constraints
3. Weak and strong extrema
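As a preview of topic II, the central result is the first-order necessary condition for an extremum of an integral functional; a standard statement (in the notation commonly used, not taken from the course notes) is:

```latex
% For the functional  J[x] = \int_{t_0}^{t_1} L(t, x(t), \dot{x}(t))\, dt,
% a (weak) extremum x(\cdot) must satisfy the Euler-Lagrange equation
\frac{d}{dt}\,\frac{\partial L}{\partial \dot{x}} \;-\; \frac{\partial L}{\partial x} \;=\; 0 .
```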
III. The Minimum (Maximum) Principle and the Hamilton-Jacobi Theory
1. Pontryagin's minimum principle
2. Optimal control with state and control constraints
3. Time-optimal control
4. Singular solutions
5. Hamilton-Jacobi-Bellman (HJB) equation, and dynamic programming
6. Viscosity solutions to HJB
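For reference, the HJB equation named in topic III takes the following standard form (notation is generic, not drawn from the course notes): the value function of the optimization satisfies a backward partial differential equation with a minimization over controls inside it.

```latex
% For  \dot{x} = f(x,u),  cost  \int_t^T L(x,u)\,ds + \phi(x(T)),
% the value function V(t,x) satisfies the HJB equation
-\frac{\partial V}{\partial t}(t,x)
  \;=\; \min_{u}\left[\, L(x,u) \;+\; \frac{\partial V}{\partial x}(t,x)\, f(x,u) \,\right],
\qquad V(T,x) = \phi(x).
```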
IV. Linear Quadratic Gaussian (LQG) Problems
1. Finite-time and infinite-time state (or output) regulators
2. Riccati equation and its properties
3. Tracking and disturbance rejection
4. Kalman filter and duality
5. The LQG design
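The Riccati equation listed in topic IV is, in its standard finite-horizon LQR form (generic notation, stated here only as a preview):

```latex
% Finite-horizon LQR:  minimize  \int_0^T \big(x^\top Q x + u^\top R u\big)\,dt + x(T)^\top S\, x(T)
% subject to  \dot{x} = A x + B u.  The optimal feedback is
u^*(t) = -R^{-1} B^\top P(t)\, x(t),
% where P(t) solves the Riccati differential equation backwards from P(T) = S:
-\dot{P} \;=\; A^\top P + P A - P B R^{-1} B^\top P + Q .
```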
V. Nonholonomic System Optimal Control

VI. Game Theoretic Optimal Control Design

Yi Ma |
Last updated 01/14/08