Corpus ID: 235743177

Featurized Density Ratio Estimation

@inproceedings{Choi2021FeaturizedDR,
  title={Featurized Density Ratio Estimation},
  author={Kristy Choi and Madeline Liao and Stefano Ermon},
  booktitle={UAI},
  year={2021}
}
Density ratio estimation serves as an important technique in the unsupervised machine learning toolbox. However, such ratios are difficult to estimate for complex, high-dimensional data, particularly when the densities of interest are sufficiently different. In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation. This featurization brings the densities closer together in latent space, sidestepping… 
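The abstract's key observation is that an invertible featurizer preserves density ratios: by the change-of-variables formula, the Jacobian terms of the map cancel in the ratio, so estimating the ratio in latent space is equivalent to estimating it in input space. A minimal numerical sketch of this cancellation, using a hand-picked affine map and two hypothetical Gaussians in place of a trained normalizing flow:

```python
import math

def gauss_pdf(x, mu, sigma):
    # Univariate Gaussian density
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Two densities whose ratio we want (hypothetical choices, not from the paper)
p = lambda x: gauss_pdf(x, 0.0, 1.0)
q = lambda x: gauss_pdf(x, 2.0, 1.5)

# An arbitrary invertible (affine) featurizer f(x) = (x - m) / s,
# standing in for a learned invertible generative model
m, s = 1.0, 0.5
f = lambda x: (x - m) / s
f_inv = lambda z: s * z + m

# Push-forward densities in latent space via change of variables:
# p_z(z) = p(f_inv(z)) * |d f_inv / dz| = p(f_inv(z)) * s
p_z = lambda z: p(f_inv(z)) * s
q_z = lambda z: q(f_inv(z)) * s

x = 0.7
ratio_input = p(x) / q(x)            # ratio in input space
ratio_latent = p_z(f(x)) / q_z(f(x)) # ratio in feature space
assert abs(ratio_input - ratio_latent) < 1e-12  # Jacobians cancel exactly
```

The same identity holds for any invertible map, which is why the paper can featurize with a flow and then run any standard ratio estimator in latent space.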
Density Ratio Estimation via Infinitesimal Classification
TLDR
This work proposes DRE-∞, a divide-and-conquer approach to reduce DRE to a series of easier subproblems, and shows that traditional (Stein) scores can be used to obtain integration paths that connect regions of high density in both distributions, improving performance in practice.
A Unified Framework for Multi-distribution Density Ratio Estimation
TLDR
A general framework from the perspective of Bregman divergence minimization is developed, justifying the use of any strictly proper scoring rule composite with a link function for multi-distribution DRE and leading to methods that strictly generalize their counterparts in binary DRE, as well as new methods that show comparable or superior performance on various downstream tasks.
Unified Perspective on Probability Divergence via Maximum Likelihood Density Ratio Estimation: Bridging KL-Divergence and Integral Probability Metrics
TLDR
It is shown that the KL-divergence and the IPMs can be represented as maximal likelihoods differing only by sampling schemes, and this result is used to derive a unified form of the IPMs and a relaxed estimation method.
Approximate Data Deletion in Generative Models
TLDR
This paper proposes a density-ratio-based framework for generative models and introduces a fast method for approximate data deletion and a statistical test for estimating whether or not training points have been deleted.

References

SHOWING 1-10 OF 63 REFERENCES
Telescoping Density-Ratio Estimation
TLDR
This work introduces a new framework, telescoping density-ratio estimation (TRE), that enables the estimation of ratios between highly dissimilar densities in high-dimensional spaces and demonstrates that TRE can yield substantial improvements over existing single-ratio methods for mutual information estimation, representation learning and energy-based modelling.
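The telescoping idea in TRE rests on an exact identity: the hard direct ratio p_0(x)/p_K(x) factors into a product of easier ratios between consecutive intermediate densities, since the intermediate terms cancel. A small sketch with a hypothetical bridge of Gaussians (the means `mus` and step count `K` are illustrative choices, not from the paper):

```python
import math

def gauss_pdf(x, mu, sigma):
    # Univariate Gaussian density
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Endpoints p_0 = N(0, 1) and p_K = N(4, 1) are far apart, so the
# direct ratio is hard to estimate from samples alone.
K = 8
mus = [4.0 * k / K for k in range(K + 1)]  # hypothetical bridge of intermediate means

x = 1.0
direct = gauss_pdf(x, mus[0], 1.0) / gauss_pdf(x, mus[-1], 1.0)

# Telescoping product of consecutive (easier) ratios:
# p_0/p_K = (p_0/p_1) * (p_1/p_2) * ... * (p_{K-1}/p_K)
telescoped = 1.0
for k in range(K):
    telescoped *= gauss_pdf(x, mus[k], 1.0) / gauss_pdf(x, mus[k + 1], 1.0)

assert abs(direct - telescoped) / direct < 1e-9  # identity holds up to float error
```

In TRE itself each factor is estimated with a learned classifier rather than computed analytically; the sketch only shows why the decomposition is exact.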
Direct Density Ratio Estimation for Large-scale Covariate Shift Adaptation
TLDR
This work proposes a novel method that allows us to directly estimate the importance from samples without going through the hard task of density estimation, and demonstrates that the proposed method is computationally more efficient than existing approaches with comparable accuracy.
MADE: Masked Autoencoder for Distribution Estimation
TLDR
This work introduces a simple modification for autoencoder neural networks that yields powerful generative models and proves that this approach is competitive with state-of-the-art tractable distribution estimators.
A RAD approach to deep mixture models
TLDR
This Real and Discrete (RAD) approach retains the desirable normalizing flow properties of exact sampling, exact inference, and analytically computable probabilities, while at the same time allowing simultaneous modeling of both continuous and discrete structure in a data distribution.
Density estimation using Real NVP
TLDR
This work extends the space of probabilistic models using real-valued non-volume preserving (real NVP) transformations, a set of powerful invertible and learnable transformations, resulting in an unsupervised learning algorithm with exact log-likelihood computation, exact sampling, exact inference of latent variables, and an interpretable latent space.
FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
TLDR
This paper uses Hutchinson's trace estimator to give a scalable unbiased estimate of the log-density and demonstrates the approach on high-dimensional density estimation, image generation, and variational inference, achieving the state-of-the-art among exact likelihood methods with efficient sampling.
Learning Likelihoods with Conditional Normalizing Flows
TLDR
This work provides an effective method to train continuous CNFs for binary problems and applies them to super-resolution and vessel segmentation tasks demonstrating competitive performance on standard benchmark datasets in terms of likelihood and conventional metrics.
Invertible Residual Networks
TLDR
The empirical evaluation shows that invertible ResNets perform competitively with both state-of-the-art image classifiers and flow-based generative models, something that has not been previously achieved with a single architecture.
Relative Density-Ratio Estimation for Robust Distribution Comparison
TLDR
This letter uses relative divergences for distribution comparison, which involves approximation of relative density ratios, and shows that the proposed divergence estimator has asymptotic variance independent of the model complexity under a parametric setup, implying that the suggested estimator hardly overfits even with complex models.
Glow: Generative Flow with Invertible 1x1 Convolutions
TLDR
Glow, a simple type of generative flow using an invertible 1x1 convolution, is proposed, demonstrating that a generative model optimized towards the plain log-likelihood objective is capable of efficient realistic-looking synthesis and manipulation of large images.