Structure Learning in Conditional Probability Models via an Entropic Prior and Parameter Extinction

@article{Brand1999StructureLI,
  title={Structure Learning in Conditional Probability Models via an Entropic Prior and Parameter Extinction},
  author={Matthew Brand},
  journal={Neural Computation},
  year={1999},
  volume={11},
  pages={1155--1182}
}
Abstract

We introduce an entropic prior for multinomial parameter estimation problems and solve for its maximum a posteriori (MAP) estimator. The prior is a bias for maximally structured and minimally ambiguous models. In conditional probability models with hidden state, iterative MAP estimation drives weakly supported parameters toward extinction, effectively turning them off. Thus, structure discovery is folded into parameter estimation. We then establish criteria for simplifying a probabilistic model…
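The estimator the abstract describes can be illustrated for a single multinomial. Under the entropic prior P(θ) ∝ exp(−H(θ)), the MAP estimate maximizes Σᵢ ωᵢ log θᵢ + θᵢ log θᵢ (log-likelihood plus negative entropy) over the simplex, given evidence ωᵢ. The paper solves the stationarity condition in closed form via the Lambert W function; the sketch below instead maximizes the same log-posterior numerically with a generic constrained optimizer, which is an implementation choice of this example, not the paper's method. The function name `entropic_map` and the sample evidence vector are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def entropic_map(omega):
    """MAP multinomial parameters under the entropic prior P(theta) ~ exp(-H(theta)).

    Maximizes sum_i (omega_i + theta_i) * log(theta_i), i.e. the multinomial
    log-likelihood for evidence omega plus the negative entropy of theta,
    subject to theta lying on the probability simplex.
    """
    omega = np.asarray(omega, dtype=float)

    def neg_log_posterior(theta):
        return -np.sum((omega + theta) * np.log(theta))

    theta0 = omega / omega.sum()  # start from the maximum-likelihood estimate
    result = minimize(
        neg_log_posterior,
        theta0,
        method="SLSQP",
        bounds=[(1e-12, 1.0)] * len(omega),  # keep log(theta) finite
        constraints=[{"type": "eq", "fun": lambda th: th.sum() - 1.0}],
    )
    return result.x
```

Because the prior rewards low entropy, the MAP estimate concentrates probability mass: components with weak evidence shrink below their maximum-likelihood values, which is the "parameter extinction" effect the abstract refers to.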

