Corpus ID: 234469796

Discrete representations in neural models of spoken language

@article{Higy2021DiscreteRI,
  title={Discrete representations in neural models of spoken language},
  author={Bertrand Higy and Lieke Gelderloos and A. Alishahi and Grzegorz Chrupała},
  journal={ArXiv},
  year={2021},
  volume={abs/2105.05582}
}
The distributed and continuous representations used by neural networks are at odds with representations employed in linguistics, which are typically symbolic. Vector quantization has been proposed as a way to induce discrete neural representations that are closer in nature to their linguistic counterparts. However, it is not clear which metrics are best suited to analyze such discrete representations. We compare the merits of four commonly used metrics in the context of weakly supervised…
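The vector quantization the abstract refers to can be illustrated with a minimal sketch: each continuous vector is snapped to its nearest entry in a learned codebook, yielding a discrete symbol (the code index). The codebook values and helper name below are illustrative, not taken from the paper.

```python
import numpy as np

def vector_quantize(z, codebook):
    """Map each continuous vector in z to its nearest codebook entry.

    z: (n, d) array of continuous representations.
    codebook: (k, d) array of discrete code vectors.
    Returns (indices, quantized), where indices[i] is the discrete
    symbol assigned to z[i] and quantized[i] = codebook[indices[i]].
    """
    # Squared Euclidean distance from every input to every code,
    # computed via broadcasting: shape (n, k).
    dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
    indices = dists.argmin(axis=1)       # one discrete symbol per input
    return indices, codebook[indices]    # quantized (discretized) vectors

# Toy example: a fixed 3-code codebook in 2-D, and 4 noisy inputs.
codebook = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
z = np.array([[0.1, 0.2], [9.8, 0.3], [0.2, 9.7], [0.0, 0.1]])
idx, zq = vector_quantize(z, codebook)
print(idx)  # → [0 1 2 0]
```

In a trained model the codebook entries are learned parameters rather than fixed vectors, and the discrete `idx` sequence is what metrics for analyzing discrete representations operate on.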

