MathQA: Towards Interpretable Math Word Problem Solving with Operation-Based Formalisms

@article{Amini2019MathQATI,
  title={MathQA: Towards Interpretable Math Word Problem Solving with Operation-Based Formalisms},
  author={Aida Amini and Saadia Gabriel and Shanchuan Lin and Rik Koncel-Kedziorski and Yejin Choi and Hannaneh Hajishirzi},
  journal={ArXiv},
  year={2019},
  volume={abs/1905.13319}
}
We introduce a large-scale dataset of math word problems and an interpretable neural math problem solver that learns to map problems to operation programs. We additionally introduce a neural sequence-to-program model enhanced with automatic problem categorization. Our experiments show improvements over competitive baselines on both our MathQA dataset and the AQuA dataset. The results are still significantly lower than human performance, indicating that the dataset poses new challenges for future…
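The abstract's central idea is that each problem maps to an executable operation program. The sketch below (not the authors' code) illustrates that formalism, assuming the dataset's annotation conventions: operation names like divide/multiply, arguments n&lt;i&gt; indexing numbers from the problem text, and #&lt;j&gt; indexing earlier intermediate results. The example problem and helper names are illustrative only.

```python
# Minimal sketch of executing a MathQA-style operation program.
# The n<i>/#<j> argument convention follows the dataset's annotations;
# everything else here is a hypothetical illustration.

OPS = {
    "add": lambda a, b: a + b,
    "subtract": lambda a, b: a - b,
    "multiply": lambda a, b: a * b,
    "divide": lambda a, b: a / b,
}

def execute(program, numbers):
    """Run a linear operation program.

    program: list of (op_name, arg, arg) tuples, where "n<i>" indexes
             the problem's extracted numbers and "#<j>" indexes an
             earlier intermediate result.
    numbers: numeric values extracted from the problem text.
    """
    results = []

    def resolve(arg):
        if arg.startswith("n"):
            return numbers[int(arg[1:])]   # number from problem text
        if arg.startswith("#"):
            return results[int(arg[1:])]   # earlier intermediate result
        return float(arg)                  # constant literal

    for op, a, b in program:
        results.append(OPS[op](resolve(a), resolve(b)))
    return results[-1]

# "A car travels 60 miles in 1.5 hours; how far does it go in 4 hours?"
# program: speed = divide(n0, n1); distance = multiply(#0, n2)
answer = execute([("divide", "n0", "n1"), ("multiply", "#0", "n2")],
                 [60.0, 1.5, 4.0])
print(answer)  # 160.0
```

A sequence-to-program model in this setting decodes such operation/argument tokens from the problem text; executing the decoded program yields the answer, which makes the solver's reasoning inspectable step by step.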

Figures and Tables from this paper

Citations

RODA: Reverse Operation Based Data Augmentation for Solving Math Word Problems
TLDR
A novel data augmentation method is proposed that reverses the mathematical logic of math word problems to produce new high-quality math problems and introduce new knowledge points that can benefit learning the mathematical reasoning logic.
Are NLP Models really able to Solve Simple Math Word Problems?
TLDR
It is shown that MWP solvers that do not have access to the question asked in the MWP can still solve a large fraction of MWPs, and models that treat MWPs as bag-of-words can also achieve surprisingly high accuracy.
MWP-BERT: A Strong Baseline for Math Word Problems
TLDR
This work introduces MWP-BERT to obtain pre-trained token representations that capture the alignment between text description and mathematical logic, and introduces a keyword-based prompt matching method to address the MWPs requiring common-sense knowledge.
Recycling Numeracy Data Augmentation with Symbolic Verification for Math Word Problem Solving
TLDR
This work proposes a novel recycling numeracy data augmentation (RNDA) approach that automatically generates high quality training instances in the MathQA style and shows that the model trained on the augmented data achieves the state-of-the-art performance.
Learning by Fixing: Solving Math Word Problems with Weak Supervision
TLDR
To boost weakly-supervised learning, a novel learning-by-fixing (LBF) framework is proposed, which corrects the misperceptions of the neural network via symbolic reasoning and achieves comparable top-1 and much better top-3/5 answer accuracies than fully-supervised methods.
LogicSolver: Towards Interpretable Math Word Problem Solving with Logical Prompt-enhanced Learning
TLDR
Experimental results show that the LogicSolver has stronger logical formula-based interpretability than baselines while simultaneously achieving higher answer accuracy with the help of logical prompts.
Investigating Math Word Problems using Pretrained Multilingual Language Models
TLDR
The experiments show that MWP solvers may not transfer to a different language even if the target expressions share the same numerical constants and operator set, and that generalization improves when problem types exist in both the source and target languages.
Why are NLP Models Fumbling at Elementary Math? A Survey of Deep Learning based Word Problem Solvers
TLDR
This paper critically examines the various models that have been developed for solving word problems, their pros and cons, and the challenges ahead, and endeavours to provide a road-map for future math word problem research.
Math Word Problem Generation with Mathematical Consistency and Problem Context Constraints
TLDR
A novel MWP generation approach is developed that leverages i) pre-trained language models and a context keyword selection model to improve the language quality of generated MWPs and ii) an equation consistency constraint for math equations to improve the mathematical validity of the generated MWPs.
Practice Makes a Solver Perfect: Data Augmentation for Math Word Problem Solvers
TLDR
This paper proposes several data augmentation techniques broadly categorized into Substitution and Paraphrasing based methods and shows that proposed methods increase the generalization and robustness of existing solvers.
...
...

References

SHOWING 1-10 OF 27 REFERENCES
Using Intermediate Representations to Solve Math Word Problems
TLDR
This work uses a sequence-to-sequence model with a novel attention regularization term to generate the intermediate forms, then executes them to obtain the final answers, and proposes an iterative labeling framework for learning by leveraging supervision signals from both equations and answers.
Mapping to Declarative Knowledge for Word Problem Solving
TLDR
Declarative rules which govern the translation of natural language description of these concepts to math expressions are developed, and a framework for incorporating such declarative knowledge into word problem solving is presented.
A Meaning-Based Statistical English Math Word Problem Solver
TLDR
Experimental results show that the MeSys approach outperforms existing systems on both benchmark datasets and the noisy dataset, demonstrating that the proposed approach more fully understands the meaning of each quantity in the text.
Program Induction by Rationale Generation: Learning to Solve and Explain Algebraic Word Problems
TLDR
Experimental results show that indirect supervision of program learning via answer rationales is a promising strategy for inducing arithmetic programs.
How well do Computers Solve Math Word Problems? Large-Scale Dataset Construction and Evaluation
TLDR
A large-scale dataset is constructed that is more than 9 times the size of previous ones and contains many more problem types; a model is trained to automatically extract problem answers from the answer text provided by CQA users, which significantly reduces human annotation cost.
Modeling Math Word Problems with Augmented Semantic Networks
TLDR
This work proposes a model based on augmented semantic networks to represent the mathematical structure behind word problems that is able to understand and solve mathematical text problems from German primary school books and could be extended to other languages by exchanging the language model in the natural language processing module.
MAWPS: A Math Word Problem Repository
TLDR
MAWPS allows for the automatic construction of datasets with particular characteristics, providing tools for tuning the lexical and template overlap of a dataset as well as for filtering ungrammatical problems from web-sourced corpora.
Deep Neural Solver for Math Word Problems
TLDR
Experiments conducted on a large dataset show that the RNN model and the hybrid model significantly outperform state-of-the-art statistical learning methods for math word problem solving.
Annotating Derivations: A New Evaluation Strategy and Dataset for Algebra Word Problems
TLDR
A new evaluation for automatic solvers for algebra word problems is proposed, which can identify mistakes that existing evaluations overlook, and derivation annotations can be semi-automatically added to existing datasets.
Learning to Automatically Solve Algebra Word Problems
TLDR
An approach for automatically learning to solve algebra word problems that reasons across sentence boundaries to construct and solve a system of linear equations, while simultaneously recovering an alignment of the variables and numbers to the problem text.
...
...