A Bayesian framework for word segmentation: Exploring the effects of context

@article{Goldwater2009ABF,
  title={A Bayesian framework for word segmentation: Exploring the effects of context},
  author={Sharon Goldwater and Thomas L. Griffiths and Mark I. Johnson},
  journal={Cognition},
  year={2009},
  volume={112},
  pages={21--54}
}
Abstract: Since the experiments of Saffran et al. [Saffran, J., Aslin, R., & Newport, E. (1996). Statistical learning in 8-month-old infants. Science, 274, 1926-1928], there has been a great deal of interest in the question of how statistical regularities in the speech stream might be used by infants to begin to identify individual words. In this work, we use computational modeling to explore the effects of different assumptions the learner might make regarding the nature of words--in particular, how…
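The Saffran et al. result cited above rests on transitional probabilities between adjacent syllables: TP(b | a) = count(ab) / count(a), with word boundaries hypothesized where TP dips to a local minimum. As a rough illustration of that baseline idea (not the Bayesian model this paper itself develops), a minimal sketch might look like:

```python
from collections import Counter

def transitional_probs(syllables):
    """TP(next | current) = count(current, next) / count(current as a non-final syllable)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

def segment_at_tp_minima(syllables, tps):
    """Insert a word boundary wherever the TP sequence hits a strict local minimum."""
    vals = [tps[(a, b)] for a, b in zip(syllables, syllables[1:])]
    words, start = [], 0
    for i in range(1, len(vals) - 1):
        if vals[i] < vals[i - 1] and vals[i] < vals[i + 1]:
            # Boundary falls between syllables[i] and syllables[i + 1].
            words.append("".join(syllables[start:i + 1]))
            start = i + 1
    words.append("".join(syllables[start:]))
    return words

# Saffran-style toy stream: three nonce "words" (tupiro, golabu, bidaku)
# concatenated with no pauses; within-word TPs are 1.0, between-word TPs are lower.
stream = ("tu pi ro go la bu bi da ku tu pi ro bi da ku "
          "go la bu tu pi ro go la bu bi da ku").split()
segmented = segment_at_tp_minima(stream, transitional_probs(stream))
```

This kind of purely local statistic is exactly the baseline against which the paper's Bayesian models (which bring in contextual assumptions about words) are motivated.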


