Corpus ID: 231951510

Learning Continuous Exponential Families Beyond Gaussian

@article{Ren2021LearningCE,
  title={Learning Continuous Exponential Families Beyond Gaussian},
  author={Christopher Ren and Sidhant Misra and Marc Vuffray and Andrey Y. Lokhov},
  journal={ArXiv},
  year={2021},
  volume={abs/2102.09198}
}
We address the problem of learning continuous exponential family distributions with unbounded support. While much progress has been made on learning Gaussian graphical models, we still lack scalable algorithms for reconstructing general continuous exponential families that model higher-order moments of the data beyond the mean and the covariance. Here, we introduce a computationally efficient method for learning continuous graphical models based on the Interaction Screening…
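The interaction screening idea referenced in the abstract can be illustrated with a node-wise convex objective: for each node, minimize the empirical average of an exponentiated negative local energy over that node's neighborhood parameters. The sketch below is only a rough illustration under stated assumptions; the pairwise monomial statistics x_u * x_v, the smooth squared-l2 penalty (standing in for the sparsity-promoting l1 penalty typically used in this line of work), and all names are choices of this sketch, not the paper's exact estimator.

```python
# Hedged sketch of a node-wise Interaction Screening-style estimator for a
# pairwise continuous model. Assumptions (not from the paper): pairwise
# monomial sufficient statistics x_u * x_v and an l2 penalty for smoothness.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, p = 2000, 4
X = rng.normal(size=(n, p))  # stand-in i.i.d. samples

def iso_loss(theta, X, u, lam):
    """(1/n) * sum_i exp(-x_iu * sum_{v != u} theta_v * x_iv) + lam * ||theta||_2^2."""
    others = np.delete(np.arange(X.shape[1]), u)
    lin = X[:, u] * (X[:, others] @ theta)
    return np.mean(np.exp(-lin)) + lam * theta @ theta

u = 0  # reconstruct the neighborhood of node 0
res = minimize(iso_loss, np.zeros(p - 1), args=(X, u, 0.1), method="L-BFGS-B")
theta_hat = res.x  # near zero here, since the columns of X are independent
```

Because the objective is a sum of convex terms in theta, each node's problem can be solved independently and in parallel, which is what makes such node-wise estimators scale to high dimensions.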
1 Citation


A Computationally Efficient Method for Learning Exponential Family Distributions

TLDR
This work proposes a computationally efficient estimator that is consistent as well as asymptotically normal under mild conditions and shows that, at the population level, this method can be viewed as the maximum likelihood estimation of a re-parameterized distribution belonging to the same class of exponential family.

References

Showing 1-10 of 50 references

Beyond normality: Learning sparse probabilistic graphical models in the non-Gaussian setting

TLDR
An algorithm is presented to identify sparse dependence structure in continuous and non-Gaussian probability distributions from a corresponding set of data; it relies on exploiting the connection between the sparsity of the graph and the sparsity of transport maps, which deterministically couple one probability measure to another.

A Computationally Efficient Method for Learning Exponential Family Distributions

TLDR
This work proposes a computationally efficient estimator that is consistent as well as asymptotically normal under mild conditions and shows that, at the population level, this method can be viewed as the maximum likelihood estimation of a re-parameterized distribution belonging to the same class of exponential family.

On Learning Continuous Pairwise Markov Random Fields

TLDR
This work considers learning a sparse pairwise Markov Random Field with continuous-valued variables from i.i.d. samples and establishes that the population version of the optimization criterion employed in Vuffray et al. (2019) can be interpreted as local maximum likelihood estimation (MLE).

The Expxorcist: Nonparametric Graphical Models Via Conditional Exponential Densities

TLDR
This paper leverages recent developments to propose a class of non-parametric models with very attractive computational and statistical properties, relying on the simple function-space assumption that the conditional distribution of each variable given the other variables has a non-parametric exponential family form.

Learning Additive Exponential Family Graphical Models via ℓ2,1-norm Regularized M-Estimation

TLDR
Two ℓ2,1-norm regularized maximum likelihood estimators are proposed to learn the model parameters from i.i.d. samples, and it is shown that under mild conditions the extra flexibility gained by the additive exponential family models comes at almost no cost in statistical efficiency.

Vector-Space Markov Random Fields via Exponential Families

TLDR
VS-MRFs generalize a recent line of work on scalar-valued, uni-parameter exponential family and mixed graphical models, thereby greatly broadening the class of exponential families available (e.g., allowing multinomial and Dirichlet distributions).

Learning Some Popular Gaussian Graphical Models without Condition Number Bounds

TLDR
This work gives the first polynomial-time algorithms for learning attractive GGMs and walk-summable GGMs with a logarithmic number of samples without any such assumptions and can tolerate strong dependencies among the variables.

On Semiparametric Exponential Family Graphical Models

TLDR
This work proposes a new class of semiparametric exponential family graphical models for the analysis of high dimensional mixed data and proposes a symmetric pairwise score test for the presence of a single edge in the graph, taking into account the symmetry of the parameters.

Efficient learning of discrete graphical models

TLDR
This work provides the first sample-efficient method based on the interaction screening framework that allows one to provably learn fully general discrete factor models with node-specific discrete alphabets and multi-body interactions, specified in an arbitrary basis.

High-dimensional Gaussian graphical model selection: walk summability and local separation criterion

TLDR
This work identifies a set of graphs for which an efficient estimation algorithm exists, based on thresholding of empirical conditional covariances, and establishes structural consistency (or sparsistency) for the proposed algorithm when the number of samples is n = Ω(J_min^{-2} log p), where J_min is the minimum (absolute) edge potential of the graphical model.