NeurIPS 2020 NLC2CMD Competition: Translating Natural Language to Bash Commands

Mayank Agarwal, Tathagata Chakraborti, Quchen Fu, David Gros, Xi Victoria Lin, Jaron Maene, Kartik Talamadupula, Zhongwei Teng, and Jules White
The NLC2CMD Competition hosted at NeurIPS 2020 aimed to bring the power of natural language processing to the command line. Participants were tasked with building models that can transform descriptions of command line tasks in English to their Bash syntax. This is a report on the competition with details of the task, metrics, data, attempted solutions, and lessons learned. 
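To illustrate the task format, here is a sketch of hypothetical input–output pairs (illustrative examples only, not drawn from the competition data): the input is an English description of a command-line task, and the target is an equivalent Bash command.

```shell
# English: "find all .log files under the current directory modified in the last 7 days"
find . -name "*.log" -mtime -7

# English: "show the 5 largest files or directories under the current directory"
du -a . | sort -rn | head -n 5
```

Competition submissions were scored on how well the predicted command matched the utility and flags of the reference command for such invocations.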

Related Papers

A Transformer-based Approach for Translating Natural Language to Bash Commands
This paper explores the translation of natural language into Bash commands, which developers commonly use to accomplish command-line tasks in a terminal.
Explainable Natural Language to Bash Translation using Abstract Syntax Tree
Natural language processing for program synthesis has been widely researched. This work focuses on generating Bash commands from natural language invocations, together with explanations.


NL2Bash: A Corpus and Semantic Parser for Natural Language Interface to the Linux Operating System
This work presents new data and semantic parsing methods for the problem of mapping English sentences to Bash commands (NL2Bash), and takes a first step in enabling any user to perform operations by simply stating their goals in English.
Photon: A Robust Cross-Domain Text-to-SQL System
PHOTON is presented, a robust, modular, cross-domain NLIDB that can flag natural language input for which a SQL mapping cannot be immediately determined, effectively improving the robustness of text-to-SQL systems against untranslatable user input.
Program Synthesis from Natural Language Using Recurrent Neural Networks
Oftentimes, a programmer may have difficulty implementing a desired operation. Even when the programmer can describe her goal in English, it can be difficult to translate into code.
Language Models are Unsupervised Multitask Learners
It is demonstrated that language models begin to learn these tasks without any explicit supervision when trained on a new dataset of millions of webpages called WebText, suggesting a promising path towards building language processing systems which learn to perform tasks from their naturally occurring demonstrations.
Understanding Back-Translation at Scale
This work broadens the understanding of back-translation and investigates a number of methods to generate synthetic source sentences, finding that in all but resource-poor settings, back-translations obtained via sampling or noised beam outputs are most effective.
Survey of the State of the Art in Natural Language Generation: Core tasks, applications and evaluation
An up-to-date synthesis of research on the core tasks in NLG and the architectures adopted in which such tasks are organised is given, highlighting a number of recent research topics that have arisen partly as a result of growing synergies between NLG and other areas of artificial intelligence.
Seq2SQL: Generating Structured Queries from Natural Language using Reinforcement Learning
This work proposes Seq2SQL, a deep neural network for translating natural language questions to corresponding SQL queries, and releases WikiSQL, a dataset of 80,654 hand-annotated examples of questions and SQL queries distributed across 24,241 tables from Wikipedia that is an order of magnitude larger than comparable datasets.
AInix: An open platform for natural language interfaces to shell commands
The following sections discuss the results of each model and the AInix Kernel Dataset; primary results are listed in Table 1.
A Syntactic Neural Model for General-Purpose Code Generation
A novel neural architecture powered by a grammar model to explicitly capture the target syntax as prior knowledge for semantic parsing is proposed, achieving state-of-the-art results that well outperform previous code generation and semantic parsing approaches.