Corpus ID: 209323956

Finite sample properties of parametric MMD estimation: robustness to misspecification and dependence

@article{CheriefAbdellatif2019FiniteSP,
  title={Finite sample properties of parametric MMD estimation: robustness to misspecification and dependence},
  author={Badr-Eddine Chérief-Abdellatif and Pierre Alquier},
  journal={ArXiv},
  year={2019},
  volume={abs/1912.05737}
}
Many works in statistics aim at designing a universal estimation procedure, that is, an estimator that converges to the best approximation of the (unknown) data-generating distribution in a model, without any assumption on this distribution. This question is of major interest, in particular because universality leads to robustness of the estimator. In this paper, we tackle the problem of universal estimation using a minimum distance estimator presented in Briol et al. …
Citations

Estimation of copulas via Maximum Mean Discrepancy
Bayesian Neural Networks With Maximum Mean Discrepancy Regularization
Calibration of Stochastic Radio Channel Models with Kernels

References

Showing 1–10 of 142 references
Mixing: Properties and Examples
Training generative neural networks via Maximum Mean Discrepancy optimization
Rates of Convergence of Minimum Distance Estimators and Kolmogorov's Entropy
Rho-estimators revisited: General theory and applications. arXiv preprint arXiv:1605.05051, 2016
On McDiarmid's concentration inequality. Electronic Communications in Probability, 2013
Théorèmes limites pour des suites positivement ou faiblement dépendantes
Approximation dans les espaces métriques et théorie de l'estimation. Annales de l'Institut Henri Poincaré (B) Probability and Statistics, 65(2), pp. 181–237, 1983