"This book is a very clear and comprehensive exposition of several aspects of linear control theory connected with stabilizability problems. The first chapter introduces the basic notions of stability and the problem of stabilization by means of feedback. The interest in the linear case is motivated by the theorem on stability in the first approximation. Chapter 2 is devoted to finite-dimensional, continuous-time, time-invariant linear systems. First, the authors discuss some classical concepts, such as controllability, stabilizability, observability, and detectability, and their relationships. Moreover, optimality and stabilization are related by means of an interesting version of the Kalman-Lurie-Yakubovich-Popov equation. Finally, the authors consider state estimators and stabilization with disturbance attenuation. A similar theory is developed in Chapter 3 for systems with two time scales (singularly perturbed systems), that is, systems with fast and slow components. Chapter 4 deals with high-gain stabilization of minimum-phase systems, while Chapter 5 is concerned with adaptive stabilization and identification. In the final chapter, the authors study the stabilization of systems in which the feedback is implemented by means of a sampling technique (digital control)."
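The central theme of the opening chapters, stabilization of a linear system by state feedback, can be illustrated with a minimal sketch. The matrices below are hypothetical examples, not taken from the book: for a controllable system x' = Ax + Bu, a gain K is computed by pole placement so that the closed-loop matrix A - BK is Hurwitz (all eigenvalues in the open left half-plane).

```python
import numpy as np
from scipy.signal import place_poles

# Hypothetical unstable 2-state linear system x' = A x + B u
# (A has eigenvalues 1 and -2, so the open-loop system is unstable)
A = np.array([[0.0, 1.0],
              [2.0, -1.0]])
B = np.array([[0.0],
              [1.0]])

# Controllability check: rank of [B, AB] must equal the state dimension
ctrb = np.hstack([B, A @ B])
assert np.linalg.matrix_rank(ctrb) == 2

# Place the closed-loop poles at -1 and -2 via state feedback u = -K x
K = place_poles(A, B, [-1.0, -2.0]).gain_matrix

# The closed-loop matrix A - B K is Hurwitz: every eigenvalue has Re < 0
eigs = np.linalg.eigvals(A - B @ K)
print(np.all(eigs.real < 0))  # True
```

The same Hurwitz criterion underlies the stabilizability and detectability notions discussed in Chapter 2; pole placement is only one of several ways (alongside, e.g., Riccati-based optimal control) to produce such a gain.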