Continuously Differentiable Sample-Spacing Entropy Estimation

@article{Ozertem2008ContinuouslyDS,
  title={Continuously Differentiable Sample-Spacing Entropy Estimation},
  author={U. Ozertem and Ismail Uysal and Deniz Erdoğmuş},
  journal={IEEE Transactions on Neural Networks},
  year={2008},
  volume={19},
  pages={1978--1984}
}
The insufficiency of second-order statistics and the promise of exploiting higher-order statistics of the data are now well understood, and more advanced objectives involving higher-order statistics, especially those stemming from information theory, such as error-entropy minimization, are being studied and applied in many contexts of machine learning and signal processing. In the adaptive system training context, the main drawback of utilizing output error entropy as compared to…
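Since the paper's subject is a continuously differentiable variant of sample-spacing entropy estimation, a brief sketch of the classic non-smooth baseline it improves on may help fix ideas. The snippet below is a minimal NumPy implementation of the standard m-spacing (Vasicek-type) estimator; the function name, the m ~ sqrt(n) heuristic, and the tie guard are illustrative choices, no bias correction is applied, and this is not the paper's proposed estimator.

```python
import numpy as np

def m_spacing_entropy(x, m=None):
    """Classic m-spacing (Vasicek-type) differential entropy estimate.

    H_hat = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    with order statistics clamped at the boundaries; no bias correction.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # common heuristic, m ~ sqrt(n)
    hi = x[np.minimum(np.arange(n) + m, n - 1)]  # x_(i+m), clamped to x_(n)
    lo = x[np.maximum(np.arange(n) - m, 0)]      # x_(i-m), clamped to x_(1)
    spacings = np.maximum(hi - lo, 1e-12)        # guard against tied samples
    return np.mean(np.log(n / (2.0 * m) * spacings))

# Sanity check: for N(0, 1) the true entropy is 0.5*log(2*pi*e) ~ 1.4189 nats.
rng = np.random.default_rng(0)
print(m_spacing_entropy(rng.standard_normal(10_000)))
```

Note that the sort makes the estimate non-smooth in the samples; a continuously differentiable formulation, as the paper's title indicates, removes this obstacle for gradient-based adaptive training.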
Citations

Δ-Entropy: Definition, properties and applications in system identification with quantized data
A new entropy definition for discrete random variables, the Δ-entropy, is given based on Riemann sums over finite-size partitions; it is sensitive to the dynamic range of the data and can be used as a superior optimality criterion in system identification problems.
Insights Into the Robustness of Minimum Error Entropy Estimation
For a one-parameter linear errors-in-variables (EIV) model and under some conditions, it is suggested that the MEE estimate can be very close to the true value of the unknown parameter even in the presence of arbitrarily large outliers in both input and output variables.
Estimating Differential Entropy using Recursive Copula Splitting
A method for estimating the Shannon differential entropy of multidimensional random variables using independent samples is described. The method is based on decomposing the distribution into a…
Mean-Square Convergence Analysis of ADALINE Training With Minimum Error Entropy Criterion
A unified approach is developed for the mean-square convergence analysis of ADALINE training under the MEE criterion, based on a block version of the energy conservation relation, with the weight update equation formulated in block-data form.
Stochastic gradient identification of Wiener system with maximum mutual information criterion
This study presents an information-theoretic approach for adaptive identification of an unknown Wiener system. A two-criterion identification scheme is proposed, in which the adaptive system…
Sparse Approximation Through Boosting for Learning Large Scale Kernel Machines
  • P. Sun, X. Yao
  • Mathematics, Computer Science
  • IEEE Transactions on Neural Networks
  • 2010
The proposed method, closely related to gradient boosting, can decrease the required number M of forward steps significantly, saving a large fraction of the computational cost.
On the consistency of the Kozachenko-Leonenko entropy estimate
…when this integral exists. The objective of this paper is to study an estimate of (1) based on independent and identically distributed samples X_1, …, X_n with density f. …
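For orientation, the Kozachenko-Leonenko estimate studied above replaces sample spacings with nearest-neighbor distances and extends naturally to multiple dimensions. Below is a minimal sketch of the textbook kNN form, assuming the rows of x are samples; the function name and the numerical floor are illustrative choices, not the consistency paper's own construction.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=1):
    """Kozachenko-Leonenko kNN differential entropy estimate, in nats.

    H_hat = psi(n) - psi(k) + log(V_d) + (d/n) * sum_i log(eps_i),
    where eps_i is the distance from x_i to its k-th nearest neighbor
    and V_d is the volume of the d-dimensional unit ball.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)  # rows are samples
    n, d = x.shape
    # Query k + 1 neighbors because each point's nearest neighbor is itself.
    eps = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    eps = np.maximum(eps, 1e-12)  # numerical floor for duplicate points
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))
```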
Bayesian fusion of empirical distributions based on local density reconstruction
  • U. Hanebeck
  • Mathematics, Computer Science
  • 2015 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)
  • 2015
A generalized multiplication procedure is proposed that mutually reweights appropriate points of one density by local density values of the other density; it is symmetric in the sense that it uses points from both densities.
Dynamic feedforward network architecture design based on information entropy
  • Xiaoou Li, Zhaozhao Zhang, Wen Yu
  • Computer Science
  • 2016 13th International Conference on Electrical Engineering, Computing Science and Automatic Control (CCE)
  • 2016
A dynamic feedforward neural network architecture design method based on information entropy is proposed; it can improve the network's dynamic response ability and solve the problem of self-organizing architecture design for feedforward neural networks.
A new fault feature extraction method of rotating machinery based on finite sample function
The simulation results show that the proposed BSS algorithm is able to separate mixed signals containing both sub-Gaussian and super-Gaussian sources, with better separation performance than other BSS algorithms.

References

Showing 1-10 of 25 references
ICA Using Spacings Estimates of Entropy
A new algorithm for the independent component analysis (ICA) problem based on an efficient entropy estimator; the method is simple, computationally efficient, intuitively appealing, and outperforms other well-known algorithms.
Generalized information potential criterion for adaptive system training
A generalization of the error-entropy criterion that enables the use of any order of Rényi's entropy and any suitable kernel function in density estimation is proposed, and it is shown that the proposed entropy estimator preserves the global minimum of the actual entropy.
Estimating mutual information
Two classes of improved estimators for the mutual information M(X, Y), from samples of random points distributed according to some joint probability density μ(x, y), based on entropy estimates from k-nearest-neighbor distances, are presented.
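The construction described above is widely known as the KSG estimator. A compact sketch of its first variant, as it is commonly implemented, follows; k = 3 and the strict-inequality shrink factor are illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mutual_information(x, y, k=3):
    """KSG mutual information estimate (first variant), in nats.

    I_hat = psi(k) + psi(n) - mean(psi(nx_i + 1) + psi(ny_i + 1)),
    where nx_i and ny_i count marginal-space neighbors strictly inside
    the max-norm distance to the k-th neighbor in the joint space.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    joint = np.hstack([x, y])
    eps = cKDTree(joint).query(joint, k=k + 1, p=np.inf)[0][:, -1]
    r = eps - 1e-12  # shrink slightly to emulate a strict inequality
    nx = cKDTree(x).query_ball_point(x, r, p=np.inf, return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, r, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```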
An error-entropy minimization algorithm for supervised training of nonlinear adaptive systems
It is shown that the global minimum of this nonparametric estimator for Rényi's entropy is the same as that of the actual entropy, and the performance of the error-entropy-minimization criterion is compared with mean-square-error minimization in the short-term prediction of a chaotic time series and in nonlinear system identification.
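For concreteness, the error-entropy criterion in this line of work is typically computed through the "information potential", a Parzen-window estimate of Rényi's quadratic entropy of the errors. The sketch below shows that quantity for a batch of errors; the kernel width sigma is an illustrative choice, and the O(N^2) vectorized form favors clarity over speed.

```python
import numpy as np

def renyi_quadratic_entropy(errors, sigma=0.5):
    """Parzen-window estimate of Renyi's quadratic entropy, in nats.

    H2_hat = -log(V) with information potential
    V = (1/N^2) * sum_ij G(e_i - e_j; sqrt(2)*sigma);
    convolving two Gaussian kernels of width sigma yields sqrt(2)*sigma.
    """
    e = np.asarray(errors, dtype=float).ravel()
    diff = e[:, None] - e[None, :]           # all pairwise error differences
    var = 2.0 * sigma ** 2                   # variance of the convolved kernel
    gauss = np.exp(-diff ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    return -np.log(gauss.mean())
```

Minimizing this estimate (equivalently, maximizing the information potential) concentrates the error distribution, which is the sense in which it replaces mean-square error as a training criterion.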
An Algorithm for Data-Driven Bandwidth Selection
  • D. Comaniciu
  • Mathematics, Computer Science
  • IEEE Trans. Pattern Anal. Mach. Intell.
  • 2003
This paper develops a reliable algorithm which takes into account the stability of local bandwidth estimates across scales, and demonstrates that, within the large sample approximation, the local covariance is estimated by the matrix that maximizes the magnitude of the normalized mean shift vector.
Nonparametric entropy estimation. An overview
We assume that H(f) is well-defined and is finite. The concept of differential entropy was introduced in Shannon's original paper ([55]). Since then, entropy has been of great theoretical and applied…
Location Fingerprinting In A Decorrelated Space
By projecting the measured signal into a decorrelated signal space, positioning accuracy is improved, since the cross-correlation between access points (APs) is reduced; experimental results show that the number of training samples can be greatly reduced in the decorrelated space.
A Test for Normality Based on Sample Entropy
The test is shown to be a consistent test of the null hypothesis for all alternatives without a singular continuous part and compares favourably with other tests for normality.
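The idea behind such entropy-based normality tests is that, for fixed variance, the Gaussian maximizes differential entropy, so the ratio exp(H)/s cannot asymptotically exceed sqrt(2*pi*e). A minimal sketch of the statistic follows; the m ~ sqrt(n) choice and the tie guard are illustrative, and in practice critical values come from tables or Monte Carlo simulation under the null hypothesis.

```python
import numpy as np

def entropy_normality_statistic(x, m=None):
    """Vasicek-style statistic exp(H_mn) / s for testing normality.

    As n grows, the statistic approaches sqrt(2*pi*e) ~ 4.13 under
    normality, the maximum attainable for fixed variance; markedly
    smaller values point toward non-normality.
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(np.sqrt(n)))  # illustrative spacing order
    hi = x[np.minimum(np.arange(n) + m, n - 1)]  # clamped order statistics
    lo = x[np.maximum(np.arange(n) - m, 0)]
    h_mn = np.mean(np.log(np.maximum(n / (2.0 * m) * (hi - lo), 1e-12)))
    return np.exp(h_mn) / x.std(ddof=1)
```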
Elements of Information Theory
The authors examine the role of entropy, inequality, and randomness in the design and construction of codes.
Density Estimation for Statistics and Data Analysis
The two main aims of the book are to explain how to estimate a density from a given data set and to explore how density estimates can be used, both in their own right and as an ingredient of other statistical procedures.
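One widely quoted practical result from this book is the rule-of-thumb bandwidth for a Gaussian-kernel density estimate. The sketch below implements that rule together with a naive KDE evaluation; the constants 0.9 and 1.34 are the book's, while the function names and the grid-based evaluation are illustrative.

```python
import numpy as np

def silverman_bandwidth(x):
    """Silverman's rule of thumb for a Gaussian-kernel KDE:
    h = 0.9 * min(std, IQR / 1.34) * n**(-1/5)."""
    x = np.asarray(x, dtype=float)
    iqr = np.subtract(*np.percentile(x, [75, 25]))  # 75th minus 25th percentile
    return 0.9 * min(x.std(ddof=1), iqr / 1.34) * len(x) ** -0.2

def gaussian_kde(x, grid, h):
    """Evaluate a Gaussian-kernel density estimate at the grid points."""
    z = (grid[:, None] - x[None, :]) / h
    return np.exp(-0.5 * z ** 2).mean(axis=1) / (h * np.sqrt(2.0 * np.pi))
```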