Parsing Algebraic Word Problems into Equations

  • Rik Koncel-Kedziorski, Hannaneh Hajishirzi, Ashish Sabharwal, Oren Etzioni, Siena Dumas Ang
  • Transactions of the Association for Computational Linguistics
This paper formalizes the problem of solving multi-sentence algebraic word problems as that of generating and scoring equation trees. We use integer linear programming to generate equation trees and score their likelihood by learning local and global discriminative models. These models are trained on a small set of word problems and their answers, without any manual annotation, in order to choose the equation that best matches the problem text. We refer to the overall system as ALGES. …
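The paper's ILP-based generation and learned scoring are beyond a short sketch, but the candidate space it searches can be illustrated with a toy brute-force version (an illustration, not ALGES itself): enumerate every binary equation tree over the problem's quantities and keep one whose value matches the known answer.

```python
from itertools import permutations
from operator import add, sub, mul, truediv

OPS = {'+': add, '-': sub, '*': mul, '/': truediv}

def equation_trees(nums):
    """Yield (expression_string, value) for every binary equation tree
    over the given operand sequence, with every operator choice."""
    if len(nums) == 1:
        yield str(nums[0]), float(nums[0])
        return
    for i in range(1, len(nums)):
        for le, lv in equation_trees(nums[:i]):
            for re_, rv in equation_trees(nums[i:]):
                for sym, fn in OPS.items():
                    try:
                        yield f"({le} {sym} {re_})", fn(lv, rv)
                    except ZeroDivisionError:
                        continue

def solve(quantities, answer):
    """Return the first tree whose value matches the known answer --
    a brute-force stand-in for the learned scoring of candidate trees."""
    for order in permutations(quantities):
        for expr, val in equation_trees(list(order)):
            if abs(val - answer) < 1e-9:
                return expr
    return None

# e.g. "5 apples, plus 3 bags of 2 apples each" -> 5 + 3 * 2 = 11
print(solve([5, 3, 2], 11))
```

In the real system this exhaustive search is replaced by ILP-constrained generation and a discriminative ranker trained only on (problem, answer) pairs.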

EquGener: A Reasoning Network for Word Problem Solving by Generating Arithmetic Equations

This work introduces a novel method in which the model first learns a dense representation of the problem description conditioned on the question at hand, and then leverages this representation to generate the operands and operators in the appropriate order.

Tree-structured Decoding for Solving Math Word Problems

A tree-structured decoding method that generates the abstract syntax tree of the equation in a top-down manner and can automatically stop during decoding without a redundant stop token is proposed.

Solving Math Word Problems by Scoring Equations with Recursive Neural Networks

This work explores novel approaches to scoring candidate solution equations with tree-structured recursive neural network (Tree-RNN) configurations, comparing them with more established sequential representations; the Tree-RNNs improve overall performance and outperform sequential LSTMs on more complex problems.

Learning To Use Formulas To Solve Simple Arithmetic Problems

A novel method that learns to use formulas to solve simple arithmetic word problems, solving 86.07% of the problems in a corpus of standard primary school test questions and beating the state of the art.

A Simple Arithmetic Calculator to Solve Single Sentence Mathematical Word Problems

The work involves identifying the problem type, simplifying the given word problem to extract the operators and operands, mapping the operators to the appropriate operands to create a mathematical expression, and solving the query to generate the final result.
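A pipeline of that shape can be sketched in a few lines (the cue-word table below is invented for illustration and is not taken from the paper): extract operands with a regular expression, pick an operator from a cue word, and evaluate.

```python
import re

# Hypothetical cue-word -> operator table, for illustration only.
CUES = [('more than', '+'), ('less than', '-'), ('times', '*'),
        ('altogether', '+'), ('left', '-'), ('shared', '/')]

def solve_single_sentence(problem):
    """Extract operands, choose an operator from a cue word, evaluate."""
    nums = [float(n) for n in re.findall(r'\d+(?:\.\d+)?', problem)]
    text = problem.lower()
    for cue, op in CUES:
        if cue in text and len(nums) >= 2:
            a, b = nums[0], nums[1]
            return {'+': a + b, '-': a - b, '*': a * b, '/': a / b}[op]
    return None   # no cue matched: the sentence is out of scope

print(solve_single_sentence(
    "Tom had 7 marbles and gave some away, leaving 3 left."))  # 4.0
```

Real systems of this kind add problem-type classification and more robust operand mapping; the sketch only shows the extract-map-evaluate skeleton.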

Solving Arithmetic Word Problems Automatically Using Transformer and Unambiguous Representations

  • Kaden Griffith, J. Kalita
  • Computer Science
    2019 International Conference on Computational Science and Computational Intelligence (CSCI)
  • 2019
This work outlines the use of Transformer networks trained to translate math word problems into equivalent arithmetic expressions in infix, prefix, and postfix notations; most configurations outperform previously reported approaches on three of four datasets, with significant increases in accuracy.
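The three target notations are just different traversals of one expression tree; a minimal sketch of the representations (not of the Transformer itself):

```python
# Render the same expression tree in the three notations; only the
# infix form needs parentheses to stay unambiguous.

def infix(n):
    return n if isinstance(n, str) else f"( {infix(n[1])} {n[0]} {infix(n[2])} )"

def prefix(n):
    return n if isinstance(n, str) else f"{n[0]} {prefix(n[1])} {prefix(n[2])}"

def postfix(n):
    return n if isinstance(n, str) else f"{postfix(n[1])} {postfix(n[2])} {n[0]}"

tree = ('+', ('*', '3', '4'), '2')    # the expression 3 * 4 + 2
print(infix(tree))     # ( ( 3 * 4 ) + 2 )
print(prefix(tree))    # + * 3 4 2
print(postfix(tree))   # 3 4 * 2 +
```

Because prefix and postfix strings are unambiguous without parentheses, they make cleaner sequence targets for a translation model, which is one motivation the paper explores.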

Point to the Expression: Solving Algebraic Word Problems Using the Expression-Pointer Transformer Model

A pure neural model, the Expression-Pointer Transformer (EPT), is proposed to address expression fragmentation and operand-context separation; it yields performance comparable to existing models that use hand-crafted features and outperforms existing pure neural models by up to 40%.

Sequence to General Tree: Knowledge-Guided Geometry Word Problem Solving

With recent advancements in deep learning, neural solvers have achieved promising results in solving math word problems. However, these SOTA solvers only generate binary expression trees that …

Learning Fine-Grained Expressions to Solve Math Word Problems

A novel template-based method that learns mappings between math concept phrases in math word problems and their math expressions from training data, then performs fine-grained inference to obtain the final answer.

Mapping probability word problems to executable representations

This paper employs and analyses various neural models for answering probability word problems, applying end-to-end models to the task and bringing out the importance of a two-step approach in obtaining correct solutions to probability problems.

Learn to Solve Algebra Word Problems Using Quadratic Programming

This paper presents a new algorithm that automatically solves algebra word problems by analyzing a hypothesis space containing all possible equation systems, generated by assigning the numbers in the word problem to a set of equation-system templates extracted from the training data.

Solving General Arithmetic Word Problems

This is the first algorithmic approach that can handle arithmetic problems with multiple steps and operations without depending on additional annotations or predefined templates, and it outperforms existing systems, achieving state-of-the-art performance on benchmark datasets of arithmetic word problems.

Learning to Automatically Solve Algebra Word Problems

An approach for automatically learning to solve algebra word problems that reasons across sentence boundaries to construct and solve a system of linear equations, while simultaneously recovering an alignment of the variables and numbers to the problem text.

Learning to Solve Arithmetic Word Problems with Verb Categorization

The paper analyzes the "genre" of arithmetic word problems, identifying seven categories of verbs used in such problems; it reports the first learning results on this task without reliance on predefined templates and makes the data publicly available.

Inducing Probabilistic CCG Grammars from Logical Form with Higher-Order Unification

This paper uses higher-order unification to define a hypothesis space containing all grammars consistent with the training data, and develops an online learning algorithm that efficiently searches this space while simultaneously estimating the parameters of a log-linear parsing model.

Learning to Map Sentences to Logical Form: Structured Classification with Probabilistic Categorial Grammars

A learning algorithm is described that takes as input a training set of sentences labeled with expressions in the lambda calculus and induces a grammar for the problem, along with a log-linear model that represents a distribution over syntactic and semantic analyses conditioned on the input sentence.

A Linear Programming Formulation for Global Inference in Natural Language Tasks

This work develops a linear programming formulation for this problem and evaluates it in the context of simultaneously learning named entities and relations, efficiently incorporating domain- and task-specific constraints at decision time and yielding significant improvements in the accuracy and the "human-like" quality of the inferences.

Learning to Parse Database Queries Using Inductive Logic Programming

Experimental results with a complete database-query application for U.S. geography show that CHILL is able to learn parsers that outperform a preexisting, hand-crafted counterpart, and provide direct evidence of the utility of an empirical approach at the level of a complete natural language application.

Modeling Biological Processes for Reading Comprehension

This paper focuses on a new reading comprehension task that requires complex reasoning over a single document, and demonstrates that answering questions via predicted structures substantially improves accuracy over baselines that use shallower representations.

Reasoning about Quantities in Natural Language

A computational approach is developed which is shown to successfully recognize and normalize textual expressions of quantities and is used to further develop algorithms to assist reasoning in the context of the aforementioned tasks.