Nonparametric output prediction for nonlinear fading memory systems

@article{Kulkarni1999NonparametricOP,
  title={Nonparametric output prediction for nonlinear fading memory systems},
  author={Sanjeev R. Kulkarni and S. E. Posner},
  journal={IEEE Trans. Autom. Control.},
  year={1999},
  volume={44},
  pages={29-37}
}
The authors construct a class of elementary nonparametric output predictors of an unknown discrete-time nonlinear fading memory system. Their algorithms predict asymptotically well for every bounded input sequence, every disturbance sequence in certain classes, and every linear or nonlinear system that is continuous, asymptotically time-invariant, causal, and has fading memory. The predictor is based on k_n-nearest neighbor estimators from nonparametric statistics. It uses only…
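
The following Python sketch illustrates the flavor of such a k_n-nearest-neighbor predictor; it is a minimal illustration under assumed choices (fixed window length, Euclidean distance on input windows, simple averaging, k_n ≈ √n), not the authors' exact construction.

```python
import numpy as np

def knn_output_predictor(u, y_past, window=5, k_fn=lambda n: max(1, int(np.sqrt(n)))):
    """Predict the next output of an unknown fading-memory system from past data.

    u      : inputs u[0], ..., u[n]      (the window ending at time n is the query)
    y_past : observed outputs y[0], ..., y[n-1]
    window : length of the input window used to compare histories (assumed here)
    k_fn   : rule for the number of neighbors k_n as a function of data size (assumed)
    """
    n = len(y_past)
    if n <= window:
        return 0.0  # not enough history yet; any bounded default will do

    # Query: the most recent window of inputs, u[n-window+1], ..., u[n].
    query = np.asarray(u[n - window + 1 : n + 1], dtype=float)

    # Candidate past times t with a full input window and an observed output y[t].
    times = np.arange(window - 1, n)
    dists = np.array([
        np.linalg.norm(np.asarray(u[t - window + 1 : t + 1], dtype=float) - query)
        for t in times
    ])

    # Average the outputs observed at the k_n closest past windows.
    k = min(k_fn(n), len(times))
    nearest = times[np.argsort(dists)[:k]]
    return float(np.mean([y_past[t] for t in nearest]))
```

Typical nonparametric consistency arguments require k_n → ∞ with k_n/n → 0 and a window that grows with the data; the fixed window above is an additional simplification for the sketch.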

Universal output prediction and nonparametric regression for arbitrary data

We construct a class of elementary nonparametric output predictors of an unknown discrete-time nonlinear fading memory system. Our algorithms predict asymptotically well for every bounded input

Nonparametric control algorithms for nonlinear fading memory systems

  • S. Sandilya, S. Kulkarni
  • Computer Science, Mathematics
    Proceedings of the 38th IEEE Conference on Decision and Control (Cat. No.99CH36304)
  • 1999
An algorithm is developed to control an unknown nonlinear fading memory discrete-time system using nonparametric regression techniques rather than traditional feedback control; under stricter conditions on the system, the resulting output converges to the desired output.
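
A crude illustration of the idea, not the cited algorithm itself: for each candidate input, predict the resulting output from past data by nearest-neighbor regression and apply the candidate whose prediction is closest to the reference. The window length, neighbor count, and candidate grid below are assumptions made for the sketch.

```python
import numpy as np

def nonparametric_control_step(u_hist, y_hist, y_ref, candidates, window=3, k=5):
    """Pick the next input so that a k-NN prediction of the output tracks y_ref.

    u_hist, y_hist : past inputs and the outputs they produced (equal length)
    y_ref          : desired output at the next step
    candidates     : finite grid of admissible input values to search over (assumed)
    """
    n = len(y_hist)
    if n <= window:
        return candidates[0]  # no useful data yet; apply any admissible input

    best_u, best_err = candidates[0], float("inf")
    for u_try in candidates:
        # Hypothetical input window if u_try were applied at the next step.
        query = np.append(np.asarray(u_hist[n - window + 1 :], dtype=float), u_try)

        # k-NN prediction of the resulting output from past (window, output) pairs.
        times = np.arange(window - 1, n)
        dists = [np.linalg.norm(np.asarray(u_hist[t - window + 1 : t + 1], dtype=float) - query)
                 for t in times]
        nearest = times[np.argsort(dists)[: min(k, len(times))]]
        y_pred = np.mean([y_hist[t] for t in nearest])

        if abs(y_pred - y_ref) < best_err:
            best_u, best_err = u_try, abs(y_pred - y_ref)
    return best_u
```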

A Note on the Connection Between Incremental Input-to-State Stability and Fading Memory in Nonlinear Systems

Recently, Angeli has proposed the concept of incremental input-to-state stability, which can be viewed as an extension of the well-known input-to-state stability to the context of incremental stability.
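
For reference, incremental input-to-state stability (δISS) is commonly stated along the following lines (standard formulation, not quoted from the cited note): a system ẋ = f(x, u) is incrementally ISS if there exist β ∈ KL and γ ∈ K such that, for all initial states ξ₁, ξ₂ and all inputs u₁, u₂,

```latex
|x(t,\xi_1,u_1) - x(t,\xi_2,u_2)|
  \;\le\; \beta\bigl(|\xi_1 - \xi_2|,\, t\bigr)
      \;+\; \gamma\Bigl(\sup_{0 \le s \le t} |u_1(s) - u_2(s)|\Bigr),
  \qquad \forall\, t \ge 0 .
```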

Some contributions to fixed-distribution learning theory

Two new notions of learnability are introduced: probably uniformly approximately correct (PUAC) learnability, which is a stronger requirement than the widely studied PAC learnability, and minimal empirical risk (MER) learnability, which is a stronger requirement than the previously defined notions of "solid" or "potential" learnability.

Opacity verification in stochastic discrete event systems

The notion of step-based almost current-state opacity is introduced, which provides a measure of opacity for a given system; a verification method is proposed for this probabilistic notion of opacity, and its computational complexity is analyzed.

Opacity formulations and verification in discrete event systems

This paper reviews various state-based notions of opacity in nondeterministic finite automata, along with methodologies for their verification and complexity analysis, as well as extensions to stochastic settings.

Robustness analysis for identification and control of nonlinear systems

A bias-removal algorithm based on techniques from the instrumental-variables literature is proposed, and a family of tighter convex upper bounds on simulation error is introduced that naturally leads to an iterative identification scheme.

References

Showing 1-10 of 28 references

On estimation of a class of nonlinear systems by the kernel regression estimate

The estimation of a multiple-input single-output discrete Hammerstein system, which contains a nonlinear memoryless subsystem followed by a dynamic linear subsystem, is studied, and the distribution-free pointwise and global convergence of the estimate is demonstrated.
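
The kernel regression estimates used in this line of work are typically of Nadaraya-Watson type; as a generic illustration (not necessarily the exact estimator of the cited paper), the memoryless nonlinearity m can be estimated from measured input-output pairs (U_i, Y_i) via

```latex
\hat{m}(u) \;=\;
  \frac{\sum_{i=1}^{n} Y_i \, K\!\left(\frac{u - U_i}{h_n}\right)}
       {\sum_{i=1}^{n} K\!\left(\frac{u - U_i}{h_n}\right)},
  \qquad h_n \to 0, \quad n h_n \to \infty ,
```

where K is a kernel and h_n a bandwidth sequence; for Hammerstein systems the nonlinearity is typically recovered only up to an affine transformation.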

Nonparametric identification of Wiener systems by orthogonal series

A Wiener system, i.e., a system comprising a linear dynamic subsystem and a nonlinear memoryless subsystem connected in a cascade, is identified, and an algorithm to identify the impulse response of the linear subsystem is proposed.

Dynamic system identification with order statistics

Systems consisting of linear dynamic and memoryless nonlinear subsystems are identified in order to recover the nonlinearity from noisy input-output observations of the whole system; the signals interconnecting the elements are not measured.

Worst-Case Identification of Nonlinear Fading Memory Systems

The sample complexity of worst-case identification of FIR linear systems

The problem of identification of linear systems in the presence of measurement noise which is unknown but bounded in magnitude by some δ > 0 is considered, and it is shown that the minimal length of an identification experiment that is guaranteed to lead to a diameter bounded by 2Kδ behaves like 2^{N f(1/K)} when N is large.

Control oriented system identification: a worst-case/deterministic approach in H∞

The authors formulate and solve two related control-oriented system identification problems for stable linear shift-invariant distributed parameter plants, each involving identification of a point sample of the plant frequency response from a noisy, finite, output time series obtained in response to an applied sinusoidal input.

Rates of convergence of nearest neighbor estimation under arbitrary sampling

Rates of convergence for nearest neighbor estimation are established in a general framework in terms of metric covering numbers of the underlying space; a consistency result for k_n-nearest neighbor estimation under arbitrary sampling is also given, with a convergence rate matching the known rates for i.i.d. sampling.

Universal prediction of individual sequences

The authors define the finite-state predictability of the (infinite) sequence x_1, …, x_n, ….

On the time complexity of worst-case system identification

This paper determines bounds on the minimum duration identification experiment that must be run to identify the plant to within a specified guaranteed worst-case error bound.

Necessary and sufficient conditions for the pointwise convergence of nearest neighbor regression function estimates

where (v_{n1}, …, v_{nn}) is a given probability vector, and (X_1(x), Y_1(x)), …, (X_n(x), Y_n(x)) is a permutation of (X_1, Y_1), …, (X_n, Y_n) according to increasing values of ||X_i − x||, x ∈ R^d.
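
For context, the weighted nearest neighbor regression estimate that this fragment describes is usually written in the form (reconstructed from the standard notation, not copied from the cited paper):

```latex
m_n(x) \;=\; \sum_{i=1}^{n} v_{ni}\, Y_i(x),
```

i.e., a probability-weighted average of the responses Y_i reordered by the distance of the corresponding X_i to the query point x.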