Estimation of mutual information by the fuzzy histogram

@article{Haeri2014EstimationOM,
  title={Estimation of mutual information by the fuzzy histogram},
  author={Maryam Amir Haeri and Mohammad Mehdi Ebadzadeh},
  journal={Fuzzy Optimization and Decision Making},
  year={2014},
  volume={13},
  pages={287-318}
}
  • M. Haeri, M. Ebadzadeh
  • Published 1 September 2014
  • Computer Science
  • Fuzzy Optimization and Decision Making
Mutual Information (MI) is an important dependency measure between random variables, due to its tight connection with information theory. It has numerous applications, both in theory and practice. However, when employed in practice, it is often necessary to estimate the MI from available data. There are several methods to approximate the MI, but arguably one of the simplest and most widespread techniques is the histogram-based approach. This paper suggests the use of fuzzy partitioning for the… 
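As a rough illustration of the approach, the sketch below estimates MI from a fuzzy joint histogram with triangular membership functions on a uniform grid; the membership shape, grid choice, and normalization are assumptions made here for illustration, and the paper's actual partitioning scheme may differ.

```python
import numpy as np

def triangular_memberships(x, centers):
    """Membership of each sample in each triangular fuzzy bin; adjacent
    triangles overlap, so each sample's memberships sum to 1."""
    width = centers[1] - centers[0]
    m = np.clip(1.0 - np.abs(x[:, None] - centers[None, :]) / width, 0.0, 1.0)
    return m / m.sum(axis=1, keepdims=True)

def fuzzy_mi(x, y, bins=10):
    """Plug-in MI estimate (nats) from fuzzy joint counts: each sample
    contributes fractionally to several cells instead of exactly one."""
    cx = np.linspace(x.min(), x.max(), bins)
    cy = np.linspace(y.min(), y.max(), bins)
    mx, my = triangular_memberships(x, cx), triangular_memberships(y, cy)
    pxy = mx.T @ my            # fuzzy joint counts, shape (bins, bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = x + 0.5 * rng.normal(size=2000)   # a dependent pair for testing
print(fuzzy_mi(x, y))
```

Compared with a crisp histogram, the overlapping bins smooth the cell counts, which is the main appeal of fuzzy partitioning when samples are scarce.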

IC-FNN: A Novel Fuzzy Neural Network With Interpretable, Intuitive, and Correlated-Contours Fuzzy Rules for Function Approximation

A novel fuzzy neural network with intuitive, interpretable, and correlated-contours fuzzy rules (IC-FNN) for function approximation is presented; it can construct more parsimonious structures with higher accuracy than existing methods.

A robust estimator of mutual information for deep learning interpretability

GMM-MI (pronounced “Jimmie”) is an estimator based on Gaussian mixture models that can be applied in both discrete and continuous settings; it quantifies both the level of disentanglement between latent variables and their association with relevant physical quantities, thus unlocking the interpretability of the latent representation.
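A minimal sketch of the underlying idea, not the authors' GMM-MI implementation: fit a Gaussian mixture to the joint sample with scikit-learn, read the marginals off the fitted components, and Monte Carlo average the log density ratio. The component count and Monte Carlo size below are arbitrary placeholders, and GMM-MI's actual fitting procedure and uncertainty quantification are more involved.

```python
import numpy as np
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

def gmm_mi(x, y, n_components=5, n_mc=20000, seed=0):
    """MI (nats) under a GMM fitted to the joint sample: estimate
    E[log p(x, y) - log p(x) - log p(y)] by sampling the fitted model."""
    xy = np.column_stack([x, y])
    gmm = GaussianMixture(n_components=n_components, random_state=seed).fit(xy)
    s, _ = gmm.sample(n_mc)
    log_joint = gmm.score_samples(s)
    w, mu, cov = gmm.weights_, gmm.means_, gmm.covariances_
    # the marginal of a Gaussian mixture is a mixture of marginal Gaussians
    log_px = np.log((w * norm.pdf(s[:, [0]], mu[:, 0], np.sqrt(cov[:, 0, 0]))).sum(axis=1))
    log_py = np.log((w * norm.pdf(s[:, [1]], mu[:, 1], np.sqrt(cov[:, 1, 1]))).sum(axis=1))
    return float(np.mean(log_joint - log_px - log_py))
```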

Nonlinear Function Approximation

The proposed model, based on a general Gaussian model able to construct different shapes, is successfully applied to real-world time-series prediction, regression problems, and nonlinear system identification, and outperforms other methods with a more parsimonious structure.

Independent component analysis approach using higher orders of Non-Gaussianity

A novel algorithm is proposed to handle cases in which the probability distributions are unknown, so that the appropriate nonlinearity cannot be fixed a priori; it shows the highest accuracy on test data among other well-known BSS methods.

Semantic schema theory for genetic programming

This study proposes a schema theory that is a more realistic model for GP and could potentially be employed to improve GP in practice; it introduces the concept of a semantic schema, which partitions the search space according to the semantics of trees, regardless of their syntactic variety.

An improved semantic schema modeling for genetic programming

An improved approach to developing the semantic schema in genetic programming, incorporating semantic awareness into schema theory, is presented; it provides new insight into the relation between syntactic and semantic spaces.

Semantic schema modeling for genetic programming using clustering of building blocks

This paper first defines the notion of semantics for a tree, based on the mutual information between its output vector and the target; it then introduces semantic building blocks to facilitate the modeling of the semantic schema, and proposes information-based clustering to cluster the building blocks.
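To make the definition concrete, a hypothetical sketch: evaluate a tree on sample inputs and score its semantics as the MI between its output vector and the target, here with a plain 2-D histogram standing in for whatever estimator the paper actually uses; the example tree and target are invented for illustration.

```python
import numpy as np

def tree_semantics(outputs, target, bins=16):
    """MI (nats) between a GP tree's output vector and the target,
    via a crisp 2-D histogram (a stand-in estimator for illustration)."""
    pxy, _, _ = np.histogram2d(outputs, target, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])))

# e.g. the tree x0*x1 + sin(x0), evaluated on random inputs
rng = np.random.default_rng(1)
x0, x1 = rng.normal(size=500), rng.normal(size=500)
print(tree_semantics(x0 * x1 + np.sin(x0), x0 * x1))  # x0*x1 as a toy target
```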

Incipient Fault Detection for Chemical Processes Using Two-Dimensional Weighted SLKPCA

An enhanced SLKPCA method, referred to as two-dimensional weighted SLKPCA, is proposed by integrating sample and component weighting strategies; it puts large weights on the samples carrying strong fault information.

References

Fuzzy Histograms and Density Estimation

The histogram is the oldest and most widely used density estimator for the presentation and exploration of observed univariate data; when all bins have the same width, the histogram is said to be uniform or regular.

Estimating mutual information.

Two classes of improved estimators for the mutual information M(X, Y), from samples of random points distributed according to some joint probability density μ(x, y), based on entropy estimates from k-nearest-neighbor distances, are presented.
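A compact sketch of the first Kraskov–Stögbauer–Grassberger (KSG) estimator for two scalar variables, assuming SciPy is available; it rests on the identity I ≈ ψ(k) + ψ(N) − ⟨ψ(n_x + 1) + ψ(n_y + 1)⟩ with max-norm neighborhoods. Treat it as an illustration rather than a reference implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def ksg_mi(x, y, k=3):
    """KSG estimator (their algorithm 1) of I(X;Y) in nats."""
    n = len(x)
    xy = np.column_stack([x, y])
    # distance to the k-th neighbour in the joint space (max-norm);
    # query k+1 because each point is returned as its own neighbour
    d, _ = cKDTree(xy).query(xy, k=k + 1, p=np.inf)
    eps = d[:, -1]
    # count marginal neighbours strictly inside eps (radius shrunk slightly),
    # minus 1 to exclude the point itself
    nx = cKDTree(x[:, None]).query_ball_point(
        x[:, None], eps - 1e-12, p=np.inf, return_length=True) - 1
    ny = cKDTree(y[:, None]).query_ball_point(
        y[:, None], eps - 1e-12, p=np.inf, return_length=True) - 1
    return digamma(k) + digamma(n) - float(np.mean(digamma(nx + 1) + digamma(ny + 1)))
```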

The mutual information: Detecting and evaluating dependencies between variables

The findings show that the algorithms used so far may be quite substantially improved upon when dealing with small data sets; finite-sample effects and other sources of potentially misleading results have to be taken into account.

Canonical dependency analysis based on squared-loss mutual information

On estimation of entropy and mutual information of continuous distributions

A nonlinear correlation measure for multivariable data set

Histogram density estimators based upon a fuzzy partition

Estimation of the Information by an Adaptive Partitioning of the Observation Space

We demonstrate that it is possible to approximate the mutual information arbitrarily closely in probability by calculating the relative frequencies on appropriate partitions and achieving conditional independence on the rectangles of which the partitions are made.
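The plug-in estimate behind such partition-based schemes, written out (notation mine, not the paper's):

```latex
\hat{I}(X;Y) \;=\; \sum_{i,j} \frac{N_{ij}}{N}\,
  \log\!\frac{N_{ij}/N}{(N_{i\cdot}/N)\,(N_{\cdot j}/N)}
```

Here N_ij is the number of samples falling in cell (i, j) of the partition and N_i·, N·j are the row and column totals; the adaptive scheme keeps refining cells until X and Y look conditionally independent within each one, at which point further splitting adds no information.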

Justification and numerical realization of the uniform method for finding point estimates of interval elicited scaling constants

An analytical justification and a numerical realization are proposed for the uniform method, which finds point estimates of interval-elicited scaling constants that are uniformly distributed in their uncertainty intervals.