Corpus ID: 59334037

Estimation of entropy measures for categorical variables with spatial correlation

Linda Altieri, Daniela Cocchi, Giulia Roli
Entropy is a measure of heterogeneity widely used in the applied sciences, where spatial data are often present. Recently, many approaches have been proposed to include spatial information in entropy measures, in order to synthesize the observed data into a single, interpretable number. In other studies the objective is, rather, entropy estimation; in this regard, several proposals appear in the literature, which are basically corrections of the plug-in estimator, where proportions take the place of the…
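The plug-in estimator mentioned in the abstract is the simplest approach: replace the unknown category probabilities with the observed sample proportions in Shannon's formula. A minimal sketch (the function name is illustrative, not from the paper):

```python
import math
from collections import Counter

def plugin_entropy(observations):
    """Plug-in (maximum likelihood) estimate of Shannon entropy:
    observed proportions take the place of the unknown category
    probabilities p_k in H = -sum_k p_k * log(p_k)."""
    counts = Counter(observations)
    n = sum(counts.values())
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Two equally frequent categories give log(2) nats.
sample = ["a", "b"] * 50
print(round(plugin_entropy(sample), 4))  # 0.6931
```

This estimator is known to be biased downward in small samples, which is what motivates the corrections discussed in the literature cited below.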


A new approach to spatial entropy measures
The proposed indices are used for measuring the spatial entropy of a marked point pattern on rainforest tree species and are shown to be more informative and to answer a wider set of questions than the current proposals in the literature.
Defining Spatial Entropy from Multivariate Distributions of Co-occurrences
The approach allows multivariate data with covariates to be accounted for, and provides the flexibility to design a wide range of spatial interaction models between the attributes, including adjacency properties or distances between and within categories.
Estimation of Entropy and Mutual Information
  • L. Paninski
  • Mathematics, Computer Science
    Neural Computation
  • 2003
An exact local expansion of the entropy function is used to prove almost sure consistency and central limit theorems for three of the most commonly used discretized information estimators, and leads to an estimator with some nice properties: the estimator comes equipped with rigorous bounds on the maximum error over all possible underlying probability distributions, and this maximum error turns out to be surprisingly small.
Local and global spatio-temporal entropy indices based on distance-ratios and co-occurrences distributions
The research developed in this paper introduces several new indices, derives their extensions to the spatio-temporal domain, and further investigates their application as global and local indices.
Entropy in Spatial Aggregation
This paper introduces an approach to the measurement of locational phenomena in a spatial hierarchy using entropy statistics. A number of such statistics suitable for the study of spatial aggregation…
Bayesian entropy estimation for countable discrete distributions
This work considers the problem of estimating Shannon's entropy H from discrete data, in cases where the number of possible symbols is unknown or even countably infinite, and derives a family of continuous measures for mixing Pitman-Yor processes to produce an approximately flat prior over H.
Convergence properties of functional estimates for discrete distributions
Suppose P is an arbitrary discrete distribution on a countable alphabet 𝒳. Given an i.i.d. sample (X1,…,Xn) drawn from P, we consider the problem of estimating the entropy H(P) or some other…
Entropy Estimation in Turing's Perspective
A new nonparametric estimator of Shannon's entropy on a countable alphabet is proposed and analyzed against the well-known plug-in estimator; the bias of the proposed estimator decays exponentially in n.
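The bias corrections this literature studies can be illustrated with the classical Miller-Madow adjustment, which adds (K − 1)/(2n) to the plug-in estimate (K observed categories, n observations). This is a well-known simple correction, not the Turing-perspective estimator analyzed in the paper above:

```python
import math
from collections import Counter

def miller_madow_entropy(observations):
    """Plug-in Shannon entropy plus the first-order Miller-Madow
    bias correction (K - 1) / (2n), where K is the number of
    observed categories and n the sample size."""
    counts = Counter(observations)
    n = sum(counts.values())
    h_plugin = -sum((c / n) * math.log(c / n) for c in counts.values())
    return h_plugin + (len(counts) - 1) / (2 * n)
```

Because the correction term is always non-negative, the adjusted estimate counteracts the plug-in estimator's systematic downward bias.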
Entropy Inference and the James-Stein Estimator, with Application to Nonlinear Gene Association Networks
A James-Stein-type shrinkage estimator is developed, resulting in a procedure that outperforms eight other entropy estimation procedures across a diverse range of sampling scenarios and data-generating models, even in cases of severe undersampling.
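The shrinkage idea above can be sketched as follows, loosely following the Hausser-Strimmer recipe: shrink the observed frequencies toward the uniform target 1/K with a data-driven intensity, then plug the shrunken probabilities into Shannon's formula. This is an illustrative sketch under those assumptions, not the paper's exact procedure:

```python
import math
from collections import Counter

def js_shrinkage_entropy(observations, categories):
    """James-Stein-type shrinkage entropy: mix the empirical
    frequencies with the uniform target t = 1/K using an
    estimated shrinkage intensity, then compute plug-in entropy.
    `categories` lists all K possible categories, including any
    that were never observed."""
    counts = Counter(observations)
    n = len(observations)
    K = len(categories)
    t = 1.0 / K
    p = [counts.get(k, 0) / n for k in categories]
    # Estimated optimal shrinkage intensity, clipped to [0, 1].
    denom = (n - 1) * sum((t - pk) ** 2 for pk in p)
    lam = 1.0 if denom == 0 else min(1.0, max(0.0, (1 - sum(pk ** 2 for pk in p)) / denom))
    p_shrink = [lam * t + (1 - lam) * pk for pk in p]
    return -sum(pk * math.log(pk) for pk in p_shrink if pk > 0)
```

In severely undersampled settings the shrinkage pulls hard toward the uniform distribution, which is exactly the regime where the plug-in estimator performs worst.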
Approximate Bayesian inference for latent Gaussian models by using integrated nested Laplace approximations
This work considers approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters, and the response variables are non‐Gaussian; the proposed approach directly computes very accurate approximations to the posterior marginals.