Principal Information Theoretic Approaches

@article{Soofi2000PrincipalIT,
  title={Principal Information Theoretic Approaches},
  author={Ehsan S. Soofi},
  journal={Journal of the American Statistical Association},
  year={2000},
  volume={95},
  pages={1349--1353}
}
  • E. Soofi
  • Published 1 December 2000
  • Mathematics
  • Journal of the American Statistical Association
Kailath, T. (1980), Linear Systems, New York: Wiley. Kalman, R. (1960), "A New Approach to Linear Filtering and Prediction Problems," Journal of Basic Engineering, 82, 35–45. Kay, S. (1988), Modern Spectral Estimation, Englewood Cliffs, NJ: Prentice-Hall. Kitagawa, G. (1993), "A Monte Carlo Filtering and Smoothing Method for Non-Gaussian Nonlinear State-Space Models," in Proceedings of the Second U.S.-Japan Joint Seminar on Time Series, pp. 110–131, Institute of Statistical Mathematics. Kitagawa…
Some Recent Developments in Econometric Inference
Recent results in information theory (see Soofi 1996; 2001 for a review) include derivations of optimal information processing rules, including Bayes' theorem, for learning from data based…
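For context, the rule in question is Bayes' theorem in its standard form; in Zellner's information-processing framework it is optimal in the sense that output information equals input information (100% efficiency). With f the prior, ℓ the likelihood, and m the marginal density of the data:

\[ g(\theta \mid y) = \frac{f(\theta)\,\ell(\theta \mid y)}{m(y)}, \qquad m(y) = \int f(\theta)\,\ell(\theta \mid y)\,d\theta. \]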
Remarks on a ‘critique’ of the Bayesian Method of Moments
The Bayesian Method of Moments (BMOM) approach was put forward in Zellner (1994) and subsequently developed further and applied in Zellner et al. (1995) and Zellner (1997a), Currie (1996), Green &…
Robust Feature Selection by Mutual Information Distributions
A fast, newly defined method is shown to outperform the traditional approach based on empirical mutual information on a number of real data sets, and a theoretical development is reported that efficiently extends these methods to incomplete samples.
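For context, the "traditional approach" referred to ranks features by their empirical (plug-in) mutual information with the class label. A minimal sketch of that baseline, assuming discrete-valued features (the function names are illustrative; this is not the paper's distribution-based method):

  import numpy as np

  def empirical_mi(x, y):
      # Plug-in estimate of I(X; Y) in nats from paired discrete samples.
      mi = 0.0
      for xv in np.unique(x):
          for yv in np.unique(y):
              pxy = np.mean((x == xv) & (y == yv))
              if pxy > 0:
                  mi += pxy * np.log(pxy / (np.mean(x == xv) * np.mean(y == yv)))
      return mi

  def rank_features(X, y):
      # Rank columns of X by estimated mutual information with the labels y.
      scores = [empirical_mi(X[:, j], y) for j in range(X.shape[1])]
      return np.argsort(scores)[::-1]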
Tests of fit for the Laplace distribution based on correcting moments of entropy estimators
In this paper, we first consider the entropy estimators introduced by Vasicek [A test for normality based on sample entropy. J R Statist Soc, Ser B. 1976;38:54–59], Ebrahimi et al. [Two…
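For context, Vasicek's estimator is spacing-based: H = (1/n) Σ ln[(n/(2m))(X_(i+m) − X_(i−m))], with order statistics clamped to X_(1) and X_(n) at the boundaries. A minimal sketch, without the moment corrections this paper develops:

  import numpy as np

  def vasicek_entropy(sample, m):
      # Vasicek (1976) spacing estimator of differential entropy (nats).
      # Assumes no tied observations (a zero spacing would give log(0)).
      x = np.sort(np.asarray(sample, dtype=float))
      n = len(x)
      i = np.arange(n)
      upper = x[np.minimum(i + m, n - 1)]   # clamp X_(i+m) at X_(n)
      lower = x[np.maximum(i - m, 0)]       # clamp X_(i-m) at X_(1)
      return np.mean(np.log(n / (2.0 * m) * (upper - lower)))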
Neyman's smooth test and its use in econometrics
The following essay is a reappraisal of the role of the smooth test proposed by Neyman (1937) in the context of current applications in econometrics. We revisit the derivation of the smooth test and…
Maximum Entropy Distributions: Bit Complexity and Stability
The results put max-entropy distributions on a mathematically sound footing: these distributions are robust and computationally feasible models for data.
The Burr XII power series distributions: A new compounding family
Generalizing lifetime distributions is of lasting value to applied statisticians. In this paper, we introduce a new family of distributions by compounding the Burr XII and power series distributions.
Automatic differentiation algorithms in model analysis
This thesis presents a C++ library for the analysis of nonlinear models representable by differentiable functions, implementing methods for parameter estimation, statistical inference, model selection, and sensitivity analysis.
Computing Maximum Entropy Distributions Everywhere
The results imply that certain recent continuous optimization formulations (for instance, for discrete counting and optimization problems, the matrix scaling problem, and the worst-case Brascamp-Lieb constants in the rank-1 regime) are efficiently computable.
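As a toy illustration of the convex duality underlying both of these max-entropy papers (a single mean constraint on a small finite support, far short of the generality they treat), the max-entropy solution has exponential-family form p_i ∝ exp(λ x_i), with λ found by minimizing a one-dimensional dual:

  import numpy as np
  from scipy.optimize import minimize_scalar

  def maxent_with_mean(support, mu):
      # Max-entropy pmf on a finite support subject to E[X] = mu.
      # Dual: minimize log Z(lam) - lam * mu, which is convex in lam.
      x = np.asarray(support, dtype=float)
      def dual(lam):
          z = lam * x
          return np.log(np.exp(z - z.max()).sum()) + z.max() - lam * mu
      lam = minimize_scalar(dual).x
      w = np.exp(lam * x - (lam * x).max())
      return w / w.sum()

  # Example: a "loaded die" with mean 4.5 tilts mass toward larger faces.
  p = maxent_with_mean(np.arange(1, 7), 4.5)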
…

References

Showing 1–10 of 43 references
Information Theoretic Regression Methods
The purpose of this paper is to integrate the existing entropy-based methods in a single work, to explore their interrelationships, to elaborate on information theoretic interpretations of the existing entropy-based diagnostics, and to present information theoretic interpretations for traditional diagnostics.
Developments in Maximum Entropy Data Analysis
The Bayesian derivation of "Classic" MaxEnt image processing (Skilling 1989a) shows that exp(αS(f,m)), where S(f,m) is the entropy of image f relative to model m, is the only consistent prior…
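For reference, the entropic prior referred to uses Skilling's entropy of a positive image f relative to a model m (the standard form, stated here for context rather than quoted from the paper):

\[ \Pr(f \mid m, \alpha) \propto e^{\alpha S(f,m)}, \qquad S(f,m) = \sum_i \left( f_i - m_i - f_i \ln \frac{f_i}{m_i} \right). \]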
Maximum entropy econometrics: robust estimation with limited data
Contents: The Classical Maximum Entropy Formalism: A Review; Pure Inverse Problems; Basic Maximum Entropy Principle: Formulation and Extensions; Formulation and Solution of Pure Inverse Problems; Generalized…
A compendium to information theory in economics and econometrics
An extensive synthesis is provided of the concepts, measures, and techniques of Information Theory (IT). After an axiomatic description of the basic definitions of "information functions", "entropy"…
A new look at the statistical model identification
The history of the development of statistical hypothesis testing in time series analysis is reviewed briefly, and it is pointed out that the hypothesis testing procedure is not adequately defined as…
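The criterion Akaike proposes in place of hypothesis testing is minimization of

\[ \mathrm{AIC} = -2 \ln \hat{L} + 2k, \]

where \hat{L} is the maximized likelihood and k is the number of independently adjusted parameters; among candidate models, the one with the smallest AIC is selected.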
On communication of analog data from a bounded source space
Bounds on the minimum achievable average distortion for memoryless sources are derived both for the case where the coding delay is infinite (an extension of the Shannon theory) and also for some cases where the coding delay is finite.
Capturing the Intangible Concept of Information
The purpose of this article is to develop a general appreciation for the meanings of information functions rather than their mathematical use and to discuss the intricacies of quantifying information in some statistical problems.
Elements of Information Theory
The authors examine the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
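The quantity at the center of the book is the Shannon entropy of a discrete distribution, H(p) = −Σ p_i log2 p_i bits; a minimal computation:

  import numpy as np

  def shannon_entropy(p):
      # H(p) = -sum p_i log2 p_i in bits; terms with p_i = 0 contribute 0.
      p = np.asarray(p, dtype=float)
      p = p[p > 0]
      return -np.sum(p * np.log2(p))

  shannon_entropy([0.5, 0.25, 0.25])  # -> 1.5 bits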
…