Corpus ID: 235727257

Meta-Learning for Relative Density-Ratio Estimation

@inproceedings{Kumagai2021MetaLearningFR,
  title={Meta-Learning for Relative Density-Ratio Estimation},
  author={Atsutoshi Kumagai and Tomoharu Iwata and Yasuhiro Fujiwara},
  booktitle={NeurIPS},
  year={2021}
}
The ratio of two probability densities, called a density-ratio, is a vital quantity in machine learning. In particular, a relative density-ratio, which is a bounded extension of the density-ratio, has received much attention due to its stability and has been used in various applications such as outlier detection and dataset comparison. Existing methods for (relative) density-ratio estimation (DRE) require many instances from both densities. However, sufficient instances are often unavailable in… 
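
The relative density-ratio referred to in the abstract is, following the RuLSIF line of work cited below, r_α(x) = p(x) / (α p(x) + (1 − α) q(x)), which is bounded above by 1/α for α > 0. As a point of reference for the standard (non-meta) setting, the sketch below shows how such a ratio is commonly fit in closed form by least squares with Gaussian kernel basis functions; the function name, hyperparameter values, and toy data are illustrative assumptions, not code from the paper.

import numpy as np

def rulsif_fit(x_p, x_q, alpha=0.1, sigma=1.0, lam=0.1):
    # Least-squares fit of the relative density-ratio
    #   r_alpha(x) = p(x) / (alpha p(x) + (1 - alpha) q(x))
    # with Gaussian kernels centred on the samples from p.
    # sigma and lam would normally be chosen by cross-validation.
    centers = x_p
    def phi(x):  # (n, b) design matrix of kernel values
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    Pp, Pq = phi(x_p), phi(x_q)
    # H approximates alpha * E_p[phi phi^T] + (1 - alpha) * E_q[phi phi^T]
    H = alpha * Pp.T @ Pp / len(x_p) + (1 - alpha) * Pq.T @ Pq / len(x_q)
    h = Pp.mean(axis=0)  # approximates E_p[phi]
    theta = np.linalg.solve(H + lam * np.eye(len(centers)), h)
    return lambda x: phi(x) @ theta  # estimated r_alpha(x)

# Toy usage: two 1-D Gaussians with shifted means.
rng = np.random.default_rng(0)
x_p = rng.normal(0.0, 1.0, size=(200, 1))
x_q = rng.normal(0.5, 1.0, size=(200, 1))
r_hat = rulsif_fit(x_p, x_q, alpha=0.1)
print(r_hat(np.zeros((1, 1))))  # relative density-ratio estimate at x = 0

With such an estimator in hand, the question the abstract raises is how to obtain a good fit when only a few instances from each density are available, which is where the proposed meta-learning approach comes in.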
Unified Perspective on Probability Divergence via Maximum Likelihood Density Ratio Estimation: Bridging KL-Divergence and Integral Probability Metrics
TLDR
It is shown that the KL-divergence and the IPMs can be represented as maximal likelihoods differing only by sampling schemes, and this result is used to derive a unified form of the IPMs and a relaxed estimation method.
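
For context, the two divergence families being bridged are conventionally defined (notation here is standard, not taken from the cited paper) as
\[
\mathrm{KL}(p \,\|\, q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx,
\qquad
\mathrm{IPM}_{\mathcal{F}}(p, q) = \sup_{f \in \mathcal{F}}\Big(\mathbb{E}_{p}[f(X)] - \mathbb{E}_{q}[f(X)]\Big).
\]
Per the TLDR, both admit a maximum-likelihood, density-ratio-estimation representation and differ only in the sampling scheme.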

References

SHOWING 1-10 OF 54 REFERENCES
Non-Negative Bregman Divergence Minimization for Deep Direct Density Ratio Estimation
TLDR
This paper introduces a non-negative correction for empirical risk using only the prior knowledge of the upper bound of the density ratio, which makes a DRE method more robust against overfitting and enables the use of flexible models.
Relative Density-Ratio Estimation for Robust Distribution Comparison
TLDR
This letter uses relative divergences for distribution comparison, which involves approximation of relative density ratios, and shows that the proposed divergence estimator has asymptotic variance independent of the model complexity under a parametric setup, implying that the suggested estimator hardly overfits even with complex models.
Statistical outlier detection using direct density ratio estimation
TLDR
A new statistical approach is proposed for inlier-based outlier detection, i.e., finding outliers in the test set based on a training set consisting only of inliers, using the ratio of training and test data densities as an outlier score.
Density Ratio Estimation in Machine Learning
TLDR
The authors offer a comprehensive introduction to various density-ratio estimators, including methods based on density estimation, moment matching, probabilistic classification, density fitting, and density-ratio fitting, and describe how these can be applied to machine learning.
Multistream Classification with Relative Density Ratio Estimation
TLDR
This paper proposes a novel mechanism to automatically learn an appropriate mixture of relative density that adapts to changes in the multistream setting over time, theoretically studies its properties, and empirically demonstrates its superior performance on benchmark datasets in comparison with competing methods.
Trimmed Density Ratio Estimation
TLDR
A robust estimator is presented that automatically identifies and trims outliers; it has a convex formulation, and the global optimum can be obtained via subgradient descent.
Direct importance estimation for covariate shift adaptation
TLDR
This paper proposes a direct importance estimation method that does not involve density estimation and is equipped with a natural cross-validation procedure, so tuning parameters such as the kernel width can be objectively optimized.
Direct Approximation of Divergences Between Probability Distributions
TLDR
This chapter reviews recent advances in direct divergence approximation that follow the general inference principle advocated by Vladimir Vapnik—one should not solve a more general problem as an intermediate step when approximating a divergence.
A Least-squares Approach to Direct Importance Estimation
TLDR
This paper proposes a new importance estimation method that has a closed-form solution; the leave-one-out cross-validation score can also be computed analytically, making the method computationally efficient and simple to implement.
Direct Importance Estimation with Model Selection and Its Application to Covariate Shift Adaptation
TLDR
This paper proposes a direct importance estimation method that does not involve density estimation and is equipped with a natural cross-validation procedure, so tuning parameters such as the kernel width can be objectively optimized.