Publications
Correntropy: Properties and Applications in Non-Gaussian Signal Processing
TLDR
This paper further elucidates the probabilistic and geometric meaning of the recently defined correntropy function as a localized similarity measure, and establishes a close relationship between correntropy and M-estimation.
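As a concrete illustration, the sketch below estimates correntropy from paired samples using a Gaussian kernel, following the usual definition V(X, Y) = E[k_sigma(X - Y)]. The function name, bandwidth value, and test data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def correntropy(x, y, sigma=1.0):
    """Sample estimator of correntropy: the mean of a Gaussian kernel
    evaluated on the pointwise differences between the two signals."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    e = x - y
    return np.mean(np.exp(-e**2 / (2.0 * sigma**2)))

# Correntropy is large when the signals agree most of the time, and a few
# large outliers barely lower it -- the "localized" property of the measure.
rng = np.random.default_rng(0)
s = rng.standard_normal(1000)
print(correntropy(s, s + 0.1 * rng.standard_normal(1000), sigma=1.0))
```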
Kernel Adaptive Filtering: A Comprehensive Introduction
TLDR
Kernel Adaptive Filtering is the first book to present a comprehensive, unifying introduction to online learning algorithms in reproducing kernel Hilbert spaces, and it addresses the principal bottleneck of kernel adaptive filters: their growing structure.
The Kernel Least-Mean-Square Algorithm
TLDR
It is shown that, with finite data, the KLMS algorithm can be readily used in high-dimensional spaces, and in RKHS in particular, to derive nonlinear, stable algorithms whose performance is comparable to batch, regularized solutions.
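The KLMS recursion itself is simple enough to sketch: the filter is a growing kernel expansion, and each new coefficient is the step size times the current prediction error. The helper names, kernel choice, and parameter values below are assumptions for illustration only.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma**2))

def klms(inputs, desired, eta=0.2, sigma=1.0):
    """Minimal KLMS sketch: predict with the current kernel expansion,
    then add the new input as a center weighted by eta * error."""
    centers, coeffs, errors = [], [], []
    for u, d in zip(inputs, desired):
        y = sum(a * gaussian_kernel(c, u, sigma) for a, c in zip(coeffs, centers))
        e = d - y                      # a-priori prediction error
        centers.append(np.asarray(u))  # every input becomes a center (growing structure)
        coeffs.append(eta * e)         # LMS-style coefficient update in the RKHS
        errors.append(e)
    return centers, coeffs, errors
```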
Extended Kernel Recursive Least Squares Algorithm
TLDR
A kernelized version of the extended recursive least squares (EX-KRLS) algorithm that, for the first time, implements a general linear state model in reproducing kernel Hilbert spaces (RKHS); it requires only inner-product operations between input vectors, thus enabling the application of the kernel trick.
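EX-KRLS builds on the standard kernel RLS recursion; the sketch below shows only that basic recursion, which grows the inverse of the regularized Gram matrix one sample at a time, not the paper's state-space extension. Function names, the kernel, and the regularization value are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma**2))

def krls(inputs, desired, lam=0.1, sigma=1.0):
    """Basic KRLS recursion (regularized least squares in the RKHS),
    updating Q = (K + lam*I)^-1 with a block-matrix inverse as data arrive."""
    u0, d0 = np.asarray(inputs[0]), desired[0]
    Q = np.array([[1.0 / (lam + gaussian_kernel(u0, u0, sigma))]])
    alpha = Q[0, 0] * d0 * np.ones(1)
    centers = [u0]
    for u, d in zip(inputs[1:], desired[1:]):
        h = np.array([gaussian_kernel(c, u, sigma) for c in centers])
        z = Q @ h
        r = lam + gaussian_kernel(u, u, sigma) - h @ z
        e = d - h @ alpha                       # a-priori error on the new sample
        Q = np.block([[Q * r + np.outer(z, z), -z[:, None]],
                      [-z[None, :],            np.ones((1, 1))]]) / r
        alpha = np.concatenate([alpha - z * e / r, [e / r]])
        centers.append(np.asarray(u))
    return centers, alpha
```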
An Information Theoretic Approach of Designing Sparse Kernel Adaptive Filters
TLDR
A systematic sparsification scheme is proposed, which can drastically reduce the time and space complexity without harming the performance of kernel adaptive filters.
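The paper's criterion is an information-theoretic "surprise" measure, which is not reproduced here; the sketch below shows a simpler distance-and-error novelty check of the kind such sparsification schemes refine, just to illustrate how a kernel dictionary can be kept sparse online. The thresholds and names are hypothetical.

```python
import numpy as np

def is_novel(u, error, centers, dist_thresh=0.5, err_thresh=0.1):
    """Keep a new center only if it is far from the existing dictionary AND
    its prediction error is large; otherwise the sample is discarded."""
    if not centers:
        return True
    d_min = min(np.linalg.norm(np.asarray(u, dtype=float) - np.asarray(c, dtype=float))
                for c in centers)
    return d_min > dist_thresh and abs(error) > err_thresh
```

In an online filter such as KLMS, this check would be called before appending a new center, so the dictionary grows only for samples that carry new information.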
Fixed-budget kernel recursive least-squares
TLDR
A kernel-based recursive least-squares algorithm on a fixed memory budget is presented; it is capable of recursively learning a nonlinear mapping and tracking changes over time, and it obtains better performance than state-of-the-art kernel adaptive filtering techniques with similar memory requirements.
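To illustrate the fixed-memory idea only, the sketch below enforces a budget by pruning the center whose coefficient has the smallest magnitude. The paper's actual pruning criterion is more principled; the rule and function name here are assumptions.

```python
import numpy as np

def enforce_budget(centers, coeffs, budget):
    """Simplified fixed-budget pruning: while the dictionary exceeds the
    budget, drop the center with the smallest-magnitude coefficient."""
    while len(centers) > budget:
        i = int(np.argmin(np.abs(coeffs)))
        del centers[i]
        coeffs = np.delete(np.asarray(coeffs, dtype=float), i)
    return centers, list(coeffs)
```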
Kernel Affine Projection Algorithms
TLDR
KAPA inherits the simplicity and online nature of KLMS while reducing its gradient noise and boosting performance, and it provides a unifying model for several neural network techniques, including the kernel least-mean-square algorithm, the kernel adaline, sliding-window kernel recursive least squares, and regularization networks.
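A minimal KAPA-style update can be sketched as KLMS-like corrections applied jointly over a sliding window of the most recent samples, which averages out gradient noise. This is a simplified gradient variant; the kernel, step size, and window length below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(a, b, sigma=1.0):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma**2))

def kapa(inputs, desired, eta=0.2, K=5, sigma=1.0):
    """Sketch of a KAPA-type filter: each iteration reuses the last K samples,
    computes their a-priori errors with the current expansion, and nudges the
    corresponding K coefficients by eta times those errors."""
    centers, coeffs = [], []
    for n, u in enumerate(inputs):
        centers.append(np.asarray(u))
        coeffs.append(0.0)                      # new center enters with zero weight
        window = range(max(0, n - K + 1), n + 1)
        errs = [desired[k] - sum(a * gaussian_kernel(c, centers[k], sigma)
                                 for a, c in zip(coeffs, centers))
                for k in window]
        for k, e in zip(window, errs):          # simultaneous update over the window
            coeffs[k] += eta * e
    return centers, coeffs
```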
Correntropy: A Localized Similarity Measure
TLDR
The probabilistic meaning of correntropy is revealed, showing it to be a new localized similarity measure based on information-theoretic learning (ITL) and kernel methods that can be very useful in nonlinear, non-Gaussian signal processing.
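One common way correntropy is exploited in non-Gaussian settings is as a robust cost: maximizing the correntropy of the error behaves like a Welsch-type M-estimator that down-weights outliers, and it can be solved by iteratively reweighted least squares. The sketch below applies this idea to a linear model; it illustrates the principle under assumed names and parameters, not the paper's own algorithm.

```python
import numpy as np

def mcc_linear_fit(X, d, sigma=1.0, iters=20):
    """Robust linear fit under a maximum-correntropy-style criterion via
    iteratively reweighted least squares: samples with large errors receive
    exponentially small weights, so outliers barely affect the solution."""
    X = np.asarray(X, dtype=float)
    d = np.asarray(d, dtype=float)
    w = np.linalg.lstsq(X, d, rcond=None)[0]    # ordinary least-squares start
    for _ in range(iters):
        e = d - X @ w
        r = np.exp(-e**2 / (2.0 * sigma**2))    # correntropy-induced sample weights
        W = np.diag(r)
        w = np.linalg.solve(X.T @ W @ X, X.T @ W @ d)
    return w
```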