# Continuously Differentiable Sample-Spacing Entropy Estimation

@article{Ozertem2008ContinuouslyDS, title={Continuously Differentiable Sample-Spacing Entropy Estimation}, author={U. Ozertem and Ismail Uysal and Deniz Erdoğmuş}, journal={IEEE Transactions on Neural Networks}, year={2008}, volume={19}, pages={1978-1984} }

The insufficiency of using only second-order statistics and the promise of exploiting higher-order statistics of the data are well understood, and more advanced objectives involving higher-order statistics, especially those stemming from information theory, such as error-entropy minimization, are now being studied and applied in many contexts of machine learning and signal processing. In the adaptive system training context, the main drawback of utilizing output error entropy as compared to…
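For context, the sample-spacing family of estimators that this paper smooths can be sketched as follows. This is a minimal version of the classical m-spacing estimator (Vasicek, 1976, referenced below); the function name and the m ≈ √N heuristic are illustrative choices, not the paper's.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Classical m-spacing differential entropy estimate (Vasicek, 1976).

    H_hat = (1/N) * sum_i log( N/(2m) * (x_(i+m) - x_(i-m)) ),
    where x_(1) <= ... <= x_(N) are the order statistics and indices
    outside [1, N] are clipped to the boundary samples.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n))))  # common heuristic choice
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # x_(i+m), clipped at x_(N)
    lower = x[np.maximum(idx - m, 0)]      # x_(i-m), clipped at x_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))
```

For a uniform density on [0, 1] the estimate should be close to the true differential entropy of 0; the max/min clipping at the boundaries is what makes the estimator non-differentiable there, which is the issue this paper addresses.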

#### 11 Citations

Δ-Entropy: Definition, properties and applications in system identification with quantized data

- Mathematics, Computer Science
- Inf. Sci.
- 2011

A new entropy definition for discrete random variables, the Δ-entropy, based on Riemann sums over finite-size partitions, is given; it is sensitive to the dynamic range of the data and can be used as a superior optimality criterion in system identification problems.

Insights Into the Robustness of Minimum Error Entropy Estimation

- Computer Science, Medicine
- IEEE Transactions on Neural Networks and Learning Systems
- 2018

For a one-parameter linear errors-in-variables (EIV) model, and under some conditions, it is suggested that the MEE estimate can be very close to the true value of the unknown parameter even in the presence of arbitrarily large outliers in both input and output variables.

Estimating Differential Entropy using Recursive Copula Splitting

- Physics, Mathematics
- Entropy
- 2020

A method for estimating the Shannon differential entropy of multidimensional random variables using independent samples is described. The method is based on decomposing the distribution into a…

Mean-Square Convergence Analysis of ADALINE Training With Minimum Error Entropy Criterion

- Mathematics, Computer Science
- IEEE Transactions on Neural Networks
- 2010

A unified approach is developed for the mean-square convergence analysis of ADALINE training under the MEE criterion, based on a block version of the energy conservation relation; the weight update equation is formulated in block-data form.

Stochastic gradient identification of Wiener system with maximum mutual information criterion

- Mathematics
- 2011

This study presents an information-theoretic approach for adaptive identification of an unknown Wiener system. A two-criterion identification scheme is proposed, in which the adaptive system…

Sparse Approximation Through Boosting for Learning Large Scale Kernel Machines

- Mathematics, Computer Science
- IEEE Transactions on Neural Networks
- 2010

The proposed method, closely related to gradient boosting, can significantly decrease the required number M of forward steps, and thus a large fraction of the computational cost is saved.

On the consistency of the Kozachenko-Leonenko entropy estimate

- Mathematics
- 2021

The objective of this paper is to study an estimate of the differential entropy, when the defining integral exists, based on independent and identically distributed samples X1, …, Xn with density f.
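A minimal one-dimensional sketch of the Kozachenko-Leonenko nearest-neighbor estimate discussed above, specialized to k = 1; the function name and the large-N digamma approximation are mine:

```python
import numpy as np

def kl_entropy_1d(x):
    """Kozachenko-Leonenko differential entropy estimate for 1-D samples, k = 1.

    H_hat = psi(N) - psi(1) + log(V_1) + (1/N) * sum_i log(rho_i),
    where rho_i is the distance from x_i to its nearest neighbor,
    V_1 = 2 (length of the unit ball in 1-D), and psi(1) = -gamma.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    gaps = np.diff(x)
    # nearest-neighbor distance: the smaller of the two adjacent gaps
    rho = np.empty(n)
    rho[0] = gaps[0]
    rho[-1] = gaps[-1]
    rho[1:-1] = np.minimum(gaps[:-1], gaps[1:])
    gamma = 0.5772156649015329          # Euler-Mascheroni constant
    psi_n = np.log(n) - 1.0 / (2 * n)   # psi(N) ~ log(N) - 1/(2N) for large N
    return psi_n + gamma + np.log(2.0) + np.mean(np.log(rho))
```

The sketch assumes continuous data (ties would give rho_i = 0 and an undefined logarithm), which is exactly the setting under which consistency is studied.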

Bayesian fusion of empirical distributions based on local density reconstruction

- Mathematics, Computer Science
- 2015 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)
- 2015

A generalized multiplication procedure is proposed that mutually reweights appropriate points of one density by local density values of the other; it is symmetric in the sense that it uses points from both densities.

Dynamic feedforward network architecture design based on information entropy

- Computer Science
- 2016 13th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE)
- 2016

A dynamic feedforward neural network architecture design method based on information entropy is proposed; it can improve the network's dynamic response ability and solves the problem of self-organizing architecture design for feedforward neural networks.

A new fault feature extraction method of rotating machinery based on finite sample function

- Computer Science
- 2020

The simulation results prove that the proposed BSS algorithm is able to separate mixed signals containing both sub-Gaussian and super-Gaussian sources, and that it has better separation performance than other BSS algorithms.

#### References

Showing 1-10 of 25 references.

ICA Using Spacings Estimates of Entropy

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2003

A new algorithm for the independent components analysis (ICA) problem based on an efficient entropy estimator is presented; it is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms.

Generalized information potential criterion for adaptive system training

- Mathematics, Medicine
- IEEE Trans. Neural Networks
- 2002

A generalization of the error-entropy criterion that enables the use of any order of Renyi's entropy and any suitable kernel function in density estimation is proposed, and the proposed entropy estimator is shown to preserve the global minimum of the actual entropy.

Estimating mutual information.

- Mathematics, Medicine
- Physical review. E, Statistical, nonlinear, and soft matter physics
- 2004

Two classes of improved estimators for the mutual information M(X, Y), from samples of random points distributed according to some joint probability density μ(x, y), are presented, based on entropy estimates from k-nearest-neighbor distances.

An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems

- Mathematics, Computer Science
- IEEE Trans. Signal Process.
- 2002

It is shown that the global minimum of this nonparametric estimator of Renyi's entropy coincides with the actual entropy, and the performance of the error-entropy-minimization criterion is compared with mean-square-error minimization in the short-term prediction of a chaotic time series and in nonlinear system identification.
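The error-entropy machinery behind these references can be illustrated with the quadratic information potential V = (1/N²) Σᵢⱼ G_σ(eᵢ − eⱼ): maximizing V over the adaptive weights is equivalent to minimizing the sample estimate of Renyi's quadratic entropy, H₂ = −log V. The single-weight ADALINE, step size, and kernel width below are illustrative toy choices, not the papers' settings.

```python
import numpy as np

def information_potential(e, sigma=1.0):
    """Quadratic information potential V = (1/N^2) * sum_ij G_sigma(e_i - e_j)."""
    diff = e[:, None] - e[None, :]
    return np.mean(np.exp(-diff**2 / (2 * sigma**2)))

def mee_train_adaline(x, d, sigma=1.0, lr=0.5, steps=200):
    """Fit y = w * x by gradient ascent on V, i.e. minimum error entropy.

    dV/dw = (1/N^2) * sum_ij G'(e_i - e_j) * (x_j - x_i),
    with the Gaussian kernel derivative G'(u) = -(u / sigma^2) * G(u).
    """
    w = 0.0
    for _ in range(steps):
        e = d - w * x
        diff = e[:, None] - e[None, :]
        g = np.exp(-diff**2 / (2 * sigma**2))
        grad = np.mean(-diff / sigma**2 * g * (x[None, :] - x[:, None]))
        w += lr * grad
    return w
```

Note that the criterion is invariant to a constant shift of the errors, so in practice a bias term must be fixed separately (e.g. by matching the error mean to zero); the toy model above omits the bias for brevity.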

An Algorithm for Data-Driven Bandwidth Selection

- Mathematics, Computer Science
- IEEE Trans. Pattern Anal. Mach. Intell.
- 2003

This paper develops a reliable algorithm that takes into account the stability of local bandwidth estimates across scales, and demonstrates that, within the large-sample approximation, the local covariance is estimated by the matrix that maximizes the magnitude of the normalized mean shift vector.

Nonparametric entropy estimation. An overview

- Mathematics
- 1997

It is assumed that H(f) is well defined and finite. The concept of differential entropy was introduced in Shannon's original paper ([55]). Since then, entropy has been of great theoretical and applied…

Location Fingerprinting In A Decorrelated Space

- Computer Science
- IEEE Transactions on Knowledge and Data Engineering
- 2008

By projecting the measured signal into a decorrelated signal space, the positioning accuracy is improved, since the cross-correlation between access points (APs) is reduced; experimental results show that the number of training samples can be greatly reduced in the decorrelated space.

A Test for Normality Based on Sample Entropy

- Mathematics, Computer Science
- 1976

The test is shown to be a consistent test of the null hypothesis for all alternatives without a singular continuous part and compares favourably with other tests for normality.

Elements of Information Theory

- Engineering, Computer Science
- 1991

The author examines the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.

Density Estimation for Statistics and Data Analysis.

- Computer Science
- 1988

The two main aims of the book are to explain how to estimate a density from a given data set and to explore how density estimates can be used, both in their own right and as an ingredient of other statistical procedures.
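As a companion to the density-estimation and bandwidth-selection references, a minimal 1-D Gaussian kernel density estimate using the 1.06·σ·N^(-1/5) Gaussian reference rule for the bandwidth (the function name is mine; vetted library implementations should be preferred in practice):

```python
import numpy as np

def gaussian_kde(x, data, h=None):
    """Evaluate a 1-D Gaussian kernel density estimate at the points x."""
    data = np.asarray(data, dtype=float)
    n = len(data)
    if h is None:
        # Gaussian reference rule of thumb for the bandwidth
        h = 1.06 * np.std(data) * n ** (-1.0 / 5.0)
    u = (np.asarray(x, dtype=float)[:, None] - data[None, :]) / h
    # average of N scaled Gaussian bumps, one per sample
    return np.exp(-0.5 * u**2).sum(axis=1) / (n * h * np.sqrt(2 * np.pi))
```

Plugging such a kernel density estimate into the entropy integral is the alternative, plug-in route to entropy estimation against which sample-spacing estimators are usually compared.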