A Lyapunov-Like Characterization of Asymptotic Controllability

@article{Sontag1983ALC,
  title={A Lyapunov-Like Characterization of Asymptotic Controllability},
  author={Eduardo Sontag},
  journal={SIAM Journal on Control and Optimization},
  year={1983},
  volume={21},
  pages={462--471}
}
  • Eduardo Sontag
  • Published 1 May 1983
  • Mathematics
  • SIAM Journal on Control and Optimization
It is shown that a control system in ${\bf R}^n$ is asymptotically controllable to the origin if and only if there exists a positive definite continuous functional of the states whose derivative can be made negative by appropriate choices of controls.
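The equivalence stated in the abstract can be sketched as follows, for the smooth case (a simplification: Sontag's result requires only a continuous $V$, with the decrease condition expressed through directional derivatives):

```latex
% Control-Lyapunov condition for \dot{x} = f(x,u) -- smooth sketch.
% V positive definite and proper:
V(0) = 0, \qquad V(x) > 0 \ \ (x \neq 0),
% derivative made negative by appropriate choices of controls:
\inf_{u \in U} \, \nabla V(x) \cdot f(x,u) < 0 \qquad \text{for all } x \neq 0.
```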

Asymptotic Controllability And Feedback Stabilization

It is shown that every asymptotically controllable system can be stabilized by means of some (discontinuous) feedback law. One of the contributions of the paper is in defining precisely the meaning

Continuous control-Lyapunov functions for asymptotically controllable time-varying systems

This paper shows that, for time varying systems, global asymptotic controllability to a given closed subset of the state space is equivalent to the existence of a continuous control-Lyapunov function

Results on discrete-time control-Lyapunov functions

  • C. Kellett, A. Teel
  • Mathematics
    42nd IEEE International Conference on Decision and Control (IEEE Cat. No.03CH37475)
  • 2003
We demonstrate the existence of a smooth control-Lyapunov function (CLF) for difference equations asymptotically controllable to closed sets. This follows from a more general result on the existence
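In the discrete-time setting of Kellett and Teel, the CLF decrease condition can be sketched as follows (assuming, for illustration, a difference equation $x^{+} = f(x,u)$ and a closed target set $\mathcal{A}$):

```latex
% Discrete-time CLF decrease condition (sketch):
% V positive definite with respect to the closed set \mathcal{A}, and
\inf_{u \in U} \; V\bigl(f(x,u)\bigr) - V(x) < 0
    \qquad \text{for all } x \notin \mathcal{A}.
```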

External stabilization of discontinuous systems and nonsmooth control Lyapunov-like functions.

The main result of this note is an external stabilizability theorem for discontinuous systems affine in the control (with solutions intended in the sense of Filippov). In order to get it we first

A general notion of global asymptotic controllability for time-varying systems and its Lyapunov characterization

A general notion of global asymptotic controllability to a given equilibrium of a time-varying system is introduced and is shown that this property is equivalent to the existence of a lower

Control-Lyapunov Universal Formulas for Restricted Inputs

We deal with the question of obtaining explicit feedback control laws that stabilize a nonlinear system, under the assumption that a "control-Lyapunov function" is known. In previous work, the case
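The "previous work" referred to here is Sontag's universal formula for unrestricted inputs, which this paper extends to restricted input sets. A minimal sketch of that unrestricted, single-input formula, applied to the illustrative system $\dot{x} = x + u$ with $V(x) = x^2/2$ (both chosen here for illustration, not taken from the paper):

```python
import math

def sontag_feedback(a: float, b: float) -> float:
    """Sontag's universal formula, scalar-input case.

    For a control-affine system xdot = f(x) + g(x) u with CLF V,
    a = (grad V) . f and b = (grad V) . g at the current state.
    Returns u such that Vdot = a + b*u = -sqrt(a^2 + b^4) < 0
    whenever (a, b) != (0, 0).
    """
    if b == 0.0:
        return 0.0
    return -(a + math.sqrt(a * a + b ** 4)) / b

# Illustration: xdot = x + u, V(x) = x^2 / 2, so
# a = dV/dx * f(x) = x^2 and b = dV/dx * g(x) = x.
x = 1.0
u = sontag_feedback(x * x, x)
vdot = x * x + x * u  # equals -sqrt(a^2 + b^4), hence negative
```

At $x = 1$ this gives $u = -(1+\sqrt{2})$, so the closed loop decays and $\dot V = -\sqrt{2} < 0$; the restricted-input paper modifies this construction so that $u$ stays inside a prescribed control set.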

Asymptotic controllability implies feedback stabilization

It is shown that every asymptotically controllable system can be globally stabilized by means of some (discontinuous) feedback law. The stabilizing strategy is based on pointwise optimization of a


References

Showing 1-10 of 30 references

A Characterization of the Reachable Set for Nonlinear Control Systems

The question of whether a set is reachable by a nonlinear control system is answered in terms of the properties of a convex optimization problem. The set is reachable or not according to whether the

Remarks on continuous feedback

  • Eduardo Sontag, H. Sussmann
  • Mathematics
    1980 19th IEEE Conference on Decision and Control including the Symposium on Adaptive Processes
  • 1980
We show that, in general, it is impossible to stabilize a controllable system by means of a continuous feedback, even if memory is allowed. No optimality considerations are involved. All state spaces

Stability of nonlinear control systems

Text on the stability of control systems as developed from Liapunov's direct method, noting V. M. Popov's contribution

Relaxed Controls and the Dynamics of Control Systems

The relationship between relaxed controls and the family of processes or flows generated by ordinary controls is studied. We find that the flows generated by the relaxed controls form a completion of

Principles of optimal control theory

This monograph is intended for use in a one-semester graduate course or advanced undergraduate course and contains the principles of general control theory and proofs of the maximum principle and basic existence theorems of optimal control theory.

Stability Theory by Liapunov's Direct Method

I. Elements of Stability Theory.- 1. A First Glance at Stability Concepts.- 2. Various Definitions of Stability and Attractivity.- 3. Auxiliary Functions.- 4. Stability and Partial Stability.- 5.

Stability of Motion