Corpus ID: 221090443

# Navigating Language Models with Synthetic Agents

@article{Feldman2020NavigatingLM,
title={Navigating Language Models with Synthetic Agents},
author={Philip G. Feldman},
journal={ArXiv},
year={2020},
volume={abs/2008.04162}
}
Modern natural language models such as GPT-2 and GPT-3 contain tremendous amounts of information about human belief in a consistently interrogatable form. If these models could be shown to accurately reflect the underlying beliefs of the human beings that produced the data used to train them, then such models become a powerful sociological tool in ways that are distinct from traditional methods, such as interviews and surveys. In this study, we train a version of the GPT-2 on a corpus…
1 Citation

#### Figures and Tables from this paper

Analyzing COVID-19 Tweets with Transformer-based Language Models
• Computer Science
• ArXiv
• 2021
The results on the COVID-19 tweet data show that transformer language models are promising tools that can help us understand public opinions on social media at scale.

#### References

Showing 1-10 of 24 references
BERTology Meets Biology: Interpreting Attention in Protein Language Models
• Computer Science, Biology
• ICLR
• 2021
The inner workings of the Transformer are analyzed, and it is shown that attention captures the folding structure of proteins, connecting amino acids that are far apart in the underlying sequence but spatially close in the three-dimensional structure.
A snapshot of the frontiers of fairness in machine learning
• Computer Science
• Commun. ACM
• 2020
A group of industry, academic, and government experts convene in Philadelphia to explore the roots of algorithmic bias.
Animals in Virtual Environments
• Medicine, Computer Science
• IEEE Transactions on Visualization and Computer Graphics
• 2020
This review provides an overview of animal behavior experiments conducted in virtual environments and indicates that VE for animals is becoming a widely used application of XR technology, although such applications have not previously been reported in the technical literature related to XR.
Language Models are Few-Shot Learners
GPT-3 achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation, such as unscrambling words, using a novel word in a sentence, or performing 3-digit arithmetic.
Lessons from archives: strategies for collecting sociocultural data in machine learning
• Computer Science
• FAT*
• 2020
It is argued that a new specialization should be formed within ML that is focused on methodologies for data collection and annotation: efforts that require institutional frameworks and procedures for sociocultural data collection.
The Curious Case of Neural Text Degeneration
• Computer Science
• ICLR
• 2020
By sampling text from the dynamic nucleus of the probability distribution, which allows for diversity while effectively truncating the less reliable tail of the distribution, the resulting text better demonstrates the quality of human text, yielding enhanced diversity without sacrificing fluency and coherence.
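The nucleus (top-p) sampling scheme described in this reference can be sketched in a few lines of Python. This is an illustrative implementation of the general technique, not code from the paper; the function name `nucleus_sample` and the toy probability distribution are assumptions for the example.

```python
import random

def nucleus_sample(probs, p=0.9, rng=random):
    """Sample one token index from the smallest set of tokens (the
    "nucleus") whose cumulative probability reaches the threshold p."""
    # Sort token ids by descending probability.
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    # Keep the smallest prefix whose cumulative mass reaches p;
    # at least one token is always kept.
    nucleus, mass = [], 0.0
    for i in order:
        nucleus.append(i)
        mass += probs[i]
        if mass >= p:
            break
    # Renormalize within the nucleus and draw a sample.
    weights = [probs[i] / mass for i in nucleus]
    return rng.choices(nucleus, weights=weights, k=1)[0]

# With p=0.7, only the two most probable tokens survive the cutoff,
# so the long low-probability tail can never be sampled.
sample = nucleus_sample([0.5, 0.3, 0.15, 0.05], p=0.7)
```

Because the cutoff adapts to the shape of the distribution, flat distributions keep many candidates while peaked ones keep few, which is the source of the diversity-without-degeneration behavior the abstract describes.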
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
• Computer Science
• NAACL
• 2019
A new language representation model, BERT, designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, which can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
Dynamic Knowledge Graph Construction for Zero-shot Commonsense Question Answering
• Computer Science
• ArXiv
• 2019
Empirical results on the SocialIQa and StoryCommonsense datasets in a zero-shot setting demonstrate that using commonsense knowledge models to dynamically construct and reason over knowledge graphs achieves performance boosts over pre-trained language models and over using knowledge models to directly evaluate answers.
HuggingFace's Transformers: State-of-the-art Natural Language Processing
The *Transformers* library is an open-source library that consists of carefully engineered state-of-the-art Transformer architectures under a unified API and a curated collection of pretrained models made by and available for the community.
Language Models are Unsupervised Multitask Learners
• Computer Science
• 2019
It is demonstrated that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText, suggesting a promising path towards building language processing systems which learn to perform tasks from their naturally occurring demonstrations.