This volume treats the stabilization of linear systems, in particular systems arising in mathematical and physical applications across many areas of research.
"This book is a very clear and comprehensive exposition of several aspects of linear control theory connected with the stabilizability problems. The first chapter introduces the basic notions of stability and the problem of stabilization by means of feedback. The interest in the linear case is motivated by the theorem of stability for the first approximation. Chapter 2 is devoted to finite-dimensional, time-continuous, time-invariant linear systems. First, the authors discuss some classical concepts, like controllability, stabilizability, observability, detectability and their relationship. Moreover, optimality and stabilization are related by means of an interesting version of the Kalman-Lurie-Yakubovich-Popov equation. Finally, the authors consider state estimators and stabilization with disturbance attenuation. A similar theory is developed in Chapter 3, for systems with two time scales (singularly perturbed systems) that is systems with fast and slow components. Chapter 4 deals with high-gain stabilization of minimum phase systems, while Chapter 5 is concerned with adaptive stabilization and identification. In the final chapter, the authors study stabilization of systems where the feedback is implemented by means of a sampling technique (digital control)."
Contents:
Notations and Terminology
Stabilization of Linear Systems (p. 19)
Stabilization of Linear Systems with Two Time Scales (p. 91)
High-Gain Feedback Stabilization of Linear Systems (p. 133)
Adaptive Stabilization and Identification (p. 201)
Discrete Implementation of Stabilization Procedures (p. 243)
Table of Contents provided by Blackwell. All Rights Reserved.
Series: Systems and Control
Number of Pages: 308
Publisher: Birkhäuser Boston Inc.
Country of Publication: US
Dimensions (cm): 23.5 x 15.88 x 1.91
Weight (kg): 0.59