Distributed Vector Representations of Words in the Sigma Cognitive Architecture

Abstract

Recently reported results with distributed-vector word representations in natural language processing make them appealing for incorporation into a general cognitive architecture like Sigma. This paper describes a new algorithm for learning such word representations from large, shallow information resources, and how this algorithm can be implemented via…
DOI: 10.1007/978-3-319-09274-4_19

