Publications
An Embodied Real-Time Model of Language-Guided Incremental Visual Search
TLDR
This paper presents an embodied real-time model of interactive incremental vision and natural language processing that can explain previous experimental findings in a novel way, by showing that the divergent results found in different experimental conditions by Spivey et al. (2001) might not be due to differences in processing configurations.
ParsiNLU: A Suite of Language Understanding Challenges for Persian
TLDR
This work introduces ParsiNLU, the first benchmark for the Persian language that includes a range of language understanding tasks, such as reading comprehension and textual entailment; it presents the first results of state-of-the-art monolingual and multilingual pre-trained language models on this benchmark and compares them with human performance.
Beyond the Imitation Game: Quantifying and extrapolating the capabilities of language models
TLDR
Evaluation of OpenAI's GPT models, Google-internal dense transformer architectures, and Switch-style sparse transformers on BIG-bench, across model sizes spanning millions to hundreds of billions of parameters, finds that model performance and calibration both improve with scale but are poor in absolute terms.
An embodied incremental Bayesian model of cross-situational word learning
TLDR
This work presents an incremental Bayesian model of cross-situational word learning with limited access to past situations and demonstrates its superior performance compared to other baseline incremental models, especially under conditions of sensory noise in the speech and visual modalities.
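For intuition, here is a minimal, illustrative sketch of incremental cross-situational word learning under a memory limit: word-referent association scores are updated one situation at a time, without storing past situations. The class name, update rule, and smoothing constant are assumptions for illustration only; the paper's actual model is Bayesian and also handles sensory noise, which this sketch omits.

```python
from collections import defaultdict

class IncrementalWordLearner:
    """Toy incremental cross-situational learner (illustrative sketch).

    Keeps only aggregate word-referent association scores, so it never
    revisits past situations, mirroring the memory-limited setting.
    """

    def __init__(self, smoothing=1e-3):
        self.assoc = defaultdict(lambda: defaultdict(float))
        self.smoothing = smoothing

    def observe(self, words, referents):
        # Align each word to the visible referents in proportion to the
        # current association scores, then reinforce those alignments
        # (an online EM-style step; the paper's Bayesian update differs
        # in detail).
        for w in words:
            scores = {r: self.assoc[w][r] + self.smoothing for r in referents}
            total = sum(scores.values())
            for r in referents:
                self.assoc[w][r] += scores[r] / total

    def best_referent(self, word):
        # Return the referent currently most associated with the word,
        # or None if the word has never been observed.
        if not self.assoc[word]:
            return None
        return max(self.assoc[word], key=self.assoc[word].get)
```

Feeding the learner a stream of (words, referents) pairs, one situation at a time, gradually sharpens the mapping even though each individual situation is referentially ambiguous.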
Joint acquisition of word order and word referent in a memory-limited and incremental learner
TLDR
This work studies the utility of jointly acquiring simple versions of word order and word meaning in the early stages of acquisition in a memory-limited incremental model; the observed benefits were limited and pronounced only in the presence of high referential ambiguity and delayed syntactic bootstrapping.
Early Syntactic Bootstrapping in an Incremental Memory-Limited Word Learner
TLDR
A probabilistic framework for early syntactic bootstrapping in the absence of advanced structured representations is presented; joint acquisition of word order and word referent is shown to facilitate one-shot learning of new words as well as inferring the speaker's intentions in ambiguous contexts.
Models of Cross-Situational and Crossmodal Word Learning in Task-Oriented Scenarios
TLDR
A Bayesian approach is presented for co-learning object-word mappings and referential intention, allowing incremental learning from only a few situations in which the display of referents to the learning system is systematically varied.
Sensitivity to Input Order: Evaluation of an Incremental and Memory-Limited Bayesian Cross-Situational Word Learning Model
TLDR
A variation of the incremental and memory-limited algorithm for Bayesian cross-situational word learning is presented, and the functional performance of this sub-optimal model on corpus data is shown to be close to that of its optimal counterpart.
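As a hedged sketch of how input-order sensitivity might be measured, the helper below trains a fresh learner on several shuffles of the same situations and reports the spread of final accuracies. The function name and the learner interface (an `observe(words, referents)` update and a `best_referent(word)` query, as in the sketch further above) are illustrative assumptions, not the paper's evaluation protocol.

```python
import random

def evaluate_order_sensitivity(situations, make_learner, gold,
                               n_orders=20, seed=0):
    """Measure how final word-learning accuracy varies with input order.

    `situations` is a list of (words, referents) pairs, `gold` maps each
    word to its true referent, and `make_learner` builds a fresh learner.
    Returns the (min, max, mean) accuracy over `n_orders` random orders.
    """
    rng = random.Random(seed)
    accuracies = []
    for _ in range(n_orders):
        order = situations[:]
        rng.shuffle(order)          # a new presentation order each run
        learner = make_learner()
        for words, referents in order:
            learner.observe(words, referents)
        correct = sum(learner.best_referent(w) == r for w, r in gold.items())
        accuracies.append(correct / len(gold))
    return min(accuracies), max(accuracies), sum(accuracies) / len(accuracies)
```

A small gap between the minimum and maximum accuracies would indicate low sensitivity to input order; a wide gap would indicate that the learner's conclusions depend heavily on which situations it happens to see first.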
A Hubel Wiesel model of early concept generalization based on local correlation of input features
TLDR
The input integration framework is proposed: a set of operations performed on the inputs to the learning modules of the Hubel Wiesel model of conceptual memory, which can be used to explain how humans intuitively fit a hierarchical representation to any kind of data.
Acquisition of Word-Object Associations from Human-Robot and Human-Human Dialogues
TLDR
This work demonstrates the expanded word-learning capabilities of the resulting system and shows how learning from both human-human and human-robot dialogues can be achieved in one integrated system.
...