Correntropy: Properties and Applications in Non-Gaussian Signal Processing
This paper further elucidates the probabilistic and geometric meaning of the recently defined correntropy function as a localized similarity measure, and establishes a close relationship between correntropy and M-estimation.
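As a localized similarity measure, correntropy averages a kernel evaluated on pairwise differences, so large outliers contribute almost nothing. A minimal sketch of the sample estimator with a Gaussian kernel (function and variable names are illustrative, not from the paper):

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample correntropy of x and y with a Gaussian kernel of bandwidth sigma.

    Each pairwise difference is passed through the kernel before averaging,
    so a single huge error adds almost zero -- the 'localized' property
    that links correntropy to robust M-estimation.
    """
    diff = np.asarray(x) - np.asarray(y)
    return float(np.mean(np.exp(-diff**2 / (2.0 * sigma**2))))

x = np.array([0.0, 1.0, 2.0])
print(correntropy(x, x))          # identical signals hit the kernel maximum, 1.0
print(correntropy(x, x + 100.0))  # a gross mismatch is nearly ignored (near 0)
```

Unlike mean squared error, which grows without bound as the mismatch grows, the estimate above saturates, which is what makes it attractive under non-Gaussian noise.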
Information Theoretic Learning - Renyi's Entropy and Kernel Perspectives
  • J. Príncipe
  • Computer Science, Mathematics
  • Information Theoretic Learning
  • 6 April 2010
Students, practitioners, and researchers interested in statistical signal processing, computational intelligence, and machine learning will find in this book the theory to understand the basics, the algorithms to implement applications, and exciting but still unexplored leads that will provide fertile ground for future research.
Kernel Adaptive Filtering: A Comprehensive Introduction
Kernel Adaptive Filtering is the first book to present a comprehensive, unifying introduction to online learning algorithms in reproducing kernel Hilbert spaces, and it addresses the principal bottleneck of kernel adaptive filters: their growing structure.
Support vector machines for SAR automatic target recognition
Experimental results showed that SVMs outperform conventional classifiers in target classification because SVMs with the Gaussian kernels are able to form a local "bounded" decision region around each class that presents better rejection to confusers.
Information Theoretic Learning
This work states that there is information in the error signal that is not captured during the training of nonlinear adaptive systems under non-Gaussian distribution conditions when one insists on second-order statistical criteria.
Generalized correlation function: definition, properties, and application to blind equalization
A new generalized correlation measure is developed that incorporates information about both the distribution and the time structure of a stochastic process.
The Kernel Least-Mean-Square Algorithm
It is shown that with finite data the KLMS algorithm can be readily used in high dimensional spaces and particularly in RKHS to derive nonlinear, stable algorithms with comparable performance to batch, regularized solutions.
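KLMS runs the LMS update in an RKHS: the filter is a kernel expansion over past inputs, and each new sample adds one center weighted by the step size times the prediction error. A minimal sketch under those assumptions (names and defaults are illustrative):

```python
import numpy as np

def klms(inputs, desired, eta=0.5, sigma=1.0):
    """Kernel LMS sketch with a Gaussian kernel.

    Predicts with a kernel expansion over all past inputs, then appends
    the current input as a new center with weight eta * error. The
    growing center list is the structural bottleneck QKLMS later attacks.
    """
    centers, weights, errors = [], [], []
    for u, d in zip(inputs, desired):
        y = sum(w * np.exp(-np.sum((u - c)**2) / (2.0 * sigma**2))
                for w, c in zip(weights, centers))
        e = d - y
        centers.append(np.asarray(u, dtype=float))
        weights.append(eta * e)
        errors.append(e)
    return centers, weights, errors
```

Feeding the same input/target pair repeatedly shows the stochastic-gradient behavior: the prediction error shrinks step by step while the network grows by one center per sample.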
Neural and adaptive systems : fundamentals through simulations
Data Fitting with Linear Models, Designing and Training MLPs, and Function Approximation with MLPs, Radial Basis Functions, and Support Vector Machines.
Quantized Kernel Least Mean Square Algorithm
A quantized kernel least mean square (QKLMS) algorithm is developed, which is based on a simple online vector quantization method, and lower and upper bounds on the theoretical value of the steady-state excess mean square error are established.
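The online vector quantization step can be sketched as follows: when a new input falls within a quantization radius of an existing center, that center's weight absorbs the update instead of the network growing. This is an illustrative sketch, not the paper's reference implementation; the parameter names are assumptions:

```python
import numpy as np

def qklms(inputs, desired, eta=0.5, sigma=1.0, eps=0.1):
    """Quantized KLMS sketch: KLMS plus online vector quantization.

    If the current input is within distance eps of an existing center,
    merge the update into that center's weight; otherwise add a new
    center, as plain KLMS would.
    """
    centers, weights = [], []
    for u, d in zip(inputs, desired):
        u = np.asarray(u, dtype=float)
        y = sum(w * np.exp(-np.sum((u - c)**2) / (2.0 * sigma**2))
                for w, c in zip(weights, centers))
        e = d - y
        if centers:
            dists = [np.linalg.norm(u - c) for c in centers]
            j = int(np.argmin(dists))
            if dists[j] <= eps:
                weights[j] += eta * e  # quantize: reuse the nearest center
                continue
        centers.append(u)
        weights.append(eta * e)
    return centers, weights
```

On a stream of near-duplicate inputs the center count stays flat, which is exactly the compact-structure property the quantization buys.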
Generalized Correntropy for Robust Adaptive Filtering
A generalized correntropy that adopts the generalized Gaussian density (GGD) function as the kernel is proposed, some important properties are presented, and an adaptive algorithm is derived that is shown to be very stable and can achieve zero probability of divergence (POD).
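Swapping the Gaussian kernel for a GGD kernel adds a shape parameter that controls how aggressively large errors are suppressed. A hedged sketch of the sample estimator, assuming the standard GGD form with shape `alpha` and bandwidth `beta` (alpha = 2 recovers a Gaussian-shaped kernel; the names are illustrative):

```python
import numpy as np
from math import gamma

def generalized_correntropy(x, y, alpha=2.0, beta=1.0):
    """Sample generalized correntropy with a GGD kernel.

    The kernel is (alpha / (2 beta Gamma(1/alpha))) * exp(-|e/beta|^alpha):
    alpha shapes the tails (smaller alpha = heavier suppression of large
    errors), beta sets the bandwidth.
    """
    e = np.asarray(x) - np.asarray(y)
    norm = alpha / (2.0 * beta * gamma(1.0 / alpha))
    return float(np.mean(norm * np.exp(-np.abs(e / beta) ** alpha)))
```

At zero error the estimate equals the GGD normalization constant, so unlike the Gaussian-kernel version it is not normalized to 1; maximizing it over filter weights yields the robust adaptive algorithms the abstract describes.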