# Statistical Learning Theory of Quasi-Regular Cases

@article{Yamada2012StatisticalLT,
  title={Statistical Learning Theory of Quasi-Regular Cases},
  author={Koshi Yamada and Sumio Watanabe},
  journal={IEICE Trans. Fundam. Electron. Commun. Comput. Sci.},
  year={2012},
  volume={95-A},
  pages={2479-2487}
}

Many learning machines, such as normal mixtures and layered neural networks, are not regular but singular statistical models, because the map from a parameter to a probability distribution is not one-to-one. The conventional statistical asymptotic theory cannot be applied to such learning machines because the likelihood function cannot be approximated by any normal distribution. Recently, a new statistical theory has been established based on algebraic geometry, and it was clarified that the…
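
The non-identifiability described above can be made concrete with a minimal sketch (not from the paper): in a two-component normal mixture, distinct parameter points can realize exactly the same distribution, so the map from parameters to distributions is many-to-one.

```python
import numpy as np

def normal_pdf(x, mu=0.0):
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2.0 * np.pi)

def mixture_pdf(x, a, b):
    # two-component normal mixture: a*N(0,1) + (1-a)*N(b,1)
    return a * normal_pdf(x) + (1.0 - a) * normal_pdf(x, b)

x = np.linspace(-5, 5, 201)
# two different parameter points that both realize the true model N(0, 1):
p1 = mixture_pdf(x, a=1.0, b=3.0)  # second component has zero weight
p2 = mixture_pdf(x, a=0.3, b=0.0)  # both components coincide at 0
same = np.allclose(p1, p2)  # True: distinct parameters, identical distribution
```

On the set of such parameters the Fisher information matrix degenerates, which is why the likelihood admits no normal approximation there.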

## 3 Citations

Information criterion for variational Bayes learning in regular and singular cases

- Computer Science, Mathematics; The 6th International Conference on Soft Computing and Intelligent Systems, and The 13th International Symposium on Advanced Intelligence Systems
- 2012

This paper proposes a new information criterion for variational Bayes learning, which is the unbiased estimator of the generalization loss for both cases when the posterior distribution is regular and singular.

Marginal likelihood and model selection for Gaussian latent tree and forest models

- Computer Science, Mathematics
- 2014

The real log-canonical thresholds (also known as stochastic complexities or learning coefficients) that quantify the large-sample behavior of the marginal likelihood in Bayesian inference are computed.

Design of teaching quality evaluation model based on fuzzy mathematics and SVM algorithm

- Computer Science; J. Intell. Fuzzy Syst.
- 2018

## References

Showing 1-10 of 17 references

Algebraic geometrical methods for hierarchical learning machines

- Computer Science; Neural Networks
- 2001

Singularities in mixture models and upper bounds of stochastic complexity

- Computer Science, Mathematics; Neural Networks
- 2003

Algebraic Analysis for Nonidentifiable Learning Machines

- Computer Science, Mathematics; Neural Computation
- 2001

It is rigorously proved that the Bayesian stochastic complexity, or the free energy, is asymptotically equal to $\lambda_1 \log n - (m_1 - 1)\log\log n + \mathrm{const.}$, where $n$ is the number of training samples and $\lambda_1$ and $m_1$ are the rational number and the natural number determined as birational invariants of the singularities in the parameter space.
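
For context, in a regular $d$-parameter model $\lambda_1 = d/2$ and $m_1 = 1$, so this formula reduces to the BIC-type penalty $(d/2)\log n$; in singular models $\lambda_1 \le d/2$, so the effective penalty is smaller. A numeric sketch (the values $\lambda_1 = 0.75$, $m_1 = 2$ below are illustrative, not taken from the paper):

```python
import math

def stochastic_complexity(n, lam, m):
    # leading terms: F(n) ~ lam*log(n) - (m-1)*log(log(n)) + const.
    return lam * math.log(n) - (m - 1) * math.log(math.log(n))

def bic_penalty(n, d):
    # regular case: lam = d/2, m = 1
    return 0.5 * d * math.log(n)

n, d = 10_000, 3
regular = stochastic_complexity(n, lam=d / 2, m=1)  # equals bic_penalty(n, d)
singular = stochastic_complexity(n, lam=0.75, m=2)  # hypothetical singular model
```

The smaller free energy of the singular model is why BIC systematically over-penalizes such models.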

Asymptotic Equivalence of Bayes Cross Validation and Widely Applicable Information Criterion in Singular Learning Theory

- Computer Science, Mathematics; J. Mach. Learn. Res.
- 2010

The Bayes cross-validation loss is asymptotically equivalent to the widely applicable information criterion as a random variable, and model selection and hyperparameter optimization using these two values are asymptotically equivalent.
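
The equivalence concerns WAIC computed from posterior draws. A minimal sketch of that computation, in the common pointwise log-likelihood-matrix convention (the `waic` helper and the toy data are illustrative, not from the paper):

```python
import numpy as np

def waic(loglik):
    """WAIC on the negative-log-predictive (loss) scale.

    loglik: array of shape (S, n) holding log p(x_i | theta_s)
    for S posterior draws and n data points.
    """
    S = loglik.shape[0]
    # log pointwise predictive density: log of the posterior-mean likelihood
    lppd = np.sum(np.logaddexp.reduce(loglik, axis=0) - np.log(S))
    # penalty: posterior variance of the log-likelihood at each data point
    p_waic = np.sum(np.var(loglik, axis=0, ddof=1))
    return -lppd + p_waic

# toy posterior draws: 200 draws, 50 data points
rng = np.random.default_rng(0)
loglik = rng.normal(-1.0, 0.1, size=(200, 50))
score = waic(loglik)
```

The variance penalty (the functional variance) is what keeps the criterion valid in singular cases, where the effective dimension is not the raw parameter count.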

Stochastic complexities of reduced rank regression in Bayesian estimation

- Mathematics, Computer Science; Neural Networks
- 2005

On the Problem in Model Selection of Neural Network Regression in Overrealizable Scenario

- Computer Science; Neural Computation
- 2002

The article analyzes the expected training error and the expected generalization error of neural networks and radial basis functions in overrealizable cases and clarifies the difference from regular models, for which identifiability holds.

On the Asymptotic Distribution of the Least-Squares Estimators in Unidentifiable Models

- Mathematics; Neural Computation
- 2004

In order to analyze the stochastic property of multilayered perceptrons or other learning machines, we deal with simpler models and derive the asymptotic distribution of the least-squares estimators…

B-functions and holonomic systems

- Mathematics
- 1976

A b-function of an analytic function $f(x)$ is, by definition, a generator of the ideal formed by the polynomials $b(s)$ satisfying $P(s, x, \partial_x)\, f(x)^{s+1} = b(s)\, f(x)^{s}$ for some differential operator…
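
A standard worked example, supplied here for illustration (it is not part of this reference's snippet): for $f(x) = x^2$ one may take $P = \frac{1}{4}\partial_x^2$, since

$$
\frac{1}{4}\,\partial_x^2\,(x^2)^{s+1} \;=\; \frac{1}{4}(2s+2)(2s+1)\,x^{2s} \;=\; (s+1)\Bigl(s+\frac{1}{2}\Bigr)(x^2)^{s},
$$

so the b-function is $b(s) = (s+1)\bigl(s+\tfrac{1}{2}\bigr)$, whose roots are negative rational numbers.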

Resolution of Singularities and the Generalization Error with Bayesian Estimation for Layered Neural Network

- Mathematics
- 2005

In recent years, it has become clear that the stochastic complexity, which plays an important role in Bayesian inference, has a mathematically close relationship with the methodology constructed via resolution of singularities, as follows [1]-[3]. The stochastic complexity F(n) of a learning model with n training samples is, for some rational number λ and natural number…