Systems that evolve with time occur frequently in nature, and modelling the behaviour of such systems is an important application of mathematics. These systems can be completely deterministic, but it may also be possible to influence their behaviour by intervening through "controls". The theory of optimal control is concerned with determining controls which, at minimum cost, either direct the system along a given trajectory or enable it to reach a given point in its state space. This textbook is a straightforward introduction to the theory of optimal control, with an emphasis on presenting many different applications. Professor Hocking has taken pains to develop the theory far enough to display the main themes of the arguments, but without using sophisticated mathematical tools. Problems of this kind arise across a wide range of subjects, and there are illustrative examples of systems from fields as diverse as dynamics, economics, population control, and medicine. Throughout there are many worked examples, and numerous exercises (with solutions) are provided.
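The kind of problem described above can be stated schematically as follows (a generic formulation in standard notation, not necessarily the notation used in the book):

```latex
% Generic finite-horizon optimal control problem (illustrative):
% state x(t), control u(t), cost integrand f_0, dynamics f.
\[
  \min_{u(\cdot)} \int_{0}^{T} f_0\bigl(x(t),u(t)\bigr)\,dt
  \quad\text{subject to}\quad
  \dot{x}(t) = f\bigl(x(t),u(t)\bigr),\qquad
  x(0) = x_0,\quad x(T) = x_1,
\]
% with u(t) restricted to an admissible set U.
```

Taking the cost integrand $f_0 \equiv 1$ makes the cost equal to the elapsed time $T$, which gives the time-optimal problems treated in Part A; the Pontryagin maximum principle of Part B gives necessary conditions for a control $u(\cdot)$ to be optimal in this setting.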
'The author has succeeded in creating a very self-contained book which introduces the reader to both the Pontryagin maximum principle and a wide range of applications.' Short Book Reviews
'a thoroughly readable working text for either postgraduate or undergraduate study ... The important concepts are introduced and illustrated by simple examples and there are many lively exercises, with appended solutions.'
Mathematika, 38 (1991)
'The book gives an introductory text for those approaching the topic for the first time and without sophisticated mathematical analysis at their disposal ... the book is clearly written and the subject is well presented. The structure of the book has been carefully selected. The exercises will help in a deeper study of this subject.'
Andrzej J. Osiadacz, Warsaw University of Technology, International Journal of Adaptive Control and Signal Processing, Vol. 7 (1993)
Contents:
Optimal control problems; Systems of differential equations, matrices, and sets
Part A. Time-Optimal Control of Linear Systems: Controllability; Time-optimal control; Further examples
Part B. The Pontryagin Maximum Principle: The basic Pontryagin Maximum Principle (PMP); Extensions to the PMP; Linear state equations with quadratic costs; Proof of the Pontryagin Maximum Principle; Further applications and extensions
Part C. Applications of Optimal Control Theory: Some applied optimal control problems; Numerical methods for optimal control problems
Bibliography; Outline solutions to the exercises; Index