Teaching Temporal Logics to Neural Networks
@article{Finkbeiner2020TeachingTL,
  title={Teaching Temporal Logics to Neural Networks},
  author={Bernd Finkbeiner and Christopher Hahn and Markus N. Rabe and Frederik Schmitt},
  journal={ArXiv},
  year={2020},
  volume={abs/2003.04218}
}
We show that a deep neural network can learn the semantics of linear-time temporal logic (LTL). As a challenging task that requires deep understanding of the LTL semantics, we show that our network can solve the trace generation problem for LTL: given a satisfiable LTL formula, find a trace that satisfies the formula. We frame the trace generation problem for LTL as a translation task, i.e., to translate from formulas to satisfying traces, and train an off-the-shelf implementation of the…
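To make the task concrete, here is a minimal, hypothetical sketch (not the authors' code, and not necessarily their trace encoding) of the semantics the network must capture: an evaluator for LTL formulas over ultimately periodic ("lasso") traces, the kind of finite witness that every satisfiable LTL formula has and that a trace-generating model must produce.

```python
# Minimal sketch, not the authors' implementation: LTL semantics over an
# ultimately periodic ("lasso") trace given as a finite prefix plus a loop.
# A formula is a nested tuple; each trace position is the set of atomic
# propositions that hold there.

def holds(f, trace, loop_start, i=0):
    """Return True iff formula f holds at position i of the lasso trace."""
    def succ(j):  # successor position, wrapping back into the loop
        return j + 1 if j + 1 < len(trace) else loop_start

    def reachable(j):  # all positions visited from j onwards (finitely many)
        seen = []
        while j not in seen:
            seen.append(j)
            j = succ(j)
        return seen

    op = f[0]
    if op == "ap":                                   # atomic proposition
        return f[1] in trace[i]
    if op == "not":
        return not holds(f[1], trace, loop_start, i)
    if op == "and":
        return holds(f[1], trace, loop_start, i) and holds(f[2], trace, loop_start, i)
    if op == "or":
        return holds(f[1], trace, loop_start, i) or holds(f[2], trace, loop_start, i)
    if op == "X":                                    # next
        return holds(f[1], trace, loop_start, succ(i))
    if op == "F":                                    # eventually
        return any(holds(f[1], trace, loop_start, j) for j in reachable(i))
    if op == "G":                                    # globally
        return all(holds(f[1], trace, loop_start, j) for j in reachable(i))
    if op == "U":                                    # f[1] until f[2]
        for j in reachable(i):                       # walk the path in order
            if holds(f[2], trace, loop_start, j):
                return True
            if not holds(f[1], trace, loop_start, j):
                return False
        return False
    raise ValueError(f"unknown operator {op!r}")

# Example: the formula  a U b  is satisfied by the trace  {a} {a} {b} ({} ...),
# where the empty last position repeats forever (loop_start = 3).
trace = [{"a"}, {"a"}, {"b"}, set()]
formula = ("U", ("ap", "a"), ("ap", "b"))
print(holds(formula, trace, loop_start=3))   # True
```

Framing trace generation as translation then amounts to serializing the formula as the source token sequence and a lasso witness like the one above as the target sequence; a checker of this kind can serve as an oracle to decide whether a predicted trace is semantically correct even when it differs syntactically from the reference trace.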
29 Citations
Proof Artifact Co-training for Theorem Proving with Language Models
- Computer Science, ICLR
- 2022
PACT is proposed, a general methodology for extracting abundant self-supervised data from kernel-level proof terms for co-training alongside the usual tactic prediction objective, and is applied to Lean, an interactive proof assistant which hosts some of the most sophisticated formalized mathematics to date.
Mathematical Reasoning via Self-supervised Skip-tree Training
- Computer Science, ICLR
- 2021
It is found that models trained on the skip-tree task show surprisingly strong mathematical reasoning abilities, and outperform models trained on standard skip-sequence tasks.
Conversational Neuro-Symbolic Commonsense Reasoning
- Computer Science, AAAI
- 2021
An interactive conversational framework, built on the authors' neuro-symbolic system, that conversationally elicits commonsense knowledge from humans to complete its reasoning chains, is presented, together with a neuro-symbolic theorem prover that extracts multi-hop reasoning chains for this problem.
Checking LTL Satisfiability via End-to-end Learning
- Computer Science, ASE
- 2022
This paper trains different neural networks that respect three logical properties of LTL, i.e., the recursive property, permutation invariance, and sequentiality, and demonstrates that neural networks can indeed capture some effective biases for checking LTL satisfiability.
Deep Learning for Temporal Logics
- Computer Science
- 2021
This work reports on current advances in applying deep learning to temporal logical reasoning tasks, showing that models can even solve instances where competitive classical algorithms timed out.
Neural Circuit Synthesis from Specification Patterns
- Computer Science, NeurIPS
- 2021
This paper considers a method to generate large amounts of additional training data, i.e., pairs of specifications and circuits implementing them, and shows that hierarchical Transformers trained on this synthetic data solve a significant portion of problems from the synthesis competitions, and even out-of-distribution examples from a recent case study.
Towards the Automatic Mathematician
- Computer Science, CADE
- 2021
This extended abstract summarizes recent developments of machine learning in mathematical reasoning and the vision of the N2Formal group at Google Research to create an automatic mathematician.
Neural Combinatorial Logic Circuit Synthesis from Input-Output Examples
- Computer Science, ArXiv
- 2022
A novel, fully explainable neural approach to synthesis of combinatorial logic circuits from input-output examples, which succeeds in learning a number of arithmetic, bitwise, and signal-routing operations, and hints at a wider promise for synthesis and reasoning-related tasks.
FOLIO: Natural Language Reasoning with First-Order Logic
- Computer Science, ArXiv
- 2022
The results show that one of the most capable Large Language Models (LLMs) publicly available, GPT-3 davinci, achieves only slightly better than random results with few-shot prompting on a subset of FOLIO, and that the model is especially bad at predicting the correct truth values for False and Unknown conclusions.
OCTAL: Graph Representation Learning for LTL Model Checking
- Computer Science, ArXiv
- 2022
A novel GRL-based framework, OCTAL, is designed to learn the representation of the graph-structured system and specification, which reduces the model checking problem to binary classification in the latent space.
References
Showing 1-10 of 52 references
Learning to Solve SMT Formulas
- Computer Science, NeurIPS
- 2018
This work phrases the challenge of solving SMT formulas as a tree search problem where at each step a transformation is applied to the input formula until the formula is solved, and synthesizes a strategy in the form of a loop-free program with branches to guide the SMT solver to decide formulas more efficiently.
NeuroCore: Guiding High-Performance SAT Solvers with Unsat-Core Predictions
- Computer Science, SAT
- 2019
This work trains a simplified NeuroSAT architecture to directly predict the unsatisfiable cores of real problems and modifies several state-of-the-art SAT solvers to periodically replace their variable activity scores with NeuroSAT's prediction of how likely the variables are to appear in an unsatisfiable core.
Learning to Represent Programs with Graphs
- Computer Science, ICLR
- 2018
This work proposes to use graphs to represent both the syntactic and semantic structure of code and use graph-based deep learning methods to learn to reason over program structures, and suggests that these models learn to infer meaningful names and to solve the VarMisuse task in many cases.
Attention is All you Need
- Computer Science, NIPS
- 2017
A new simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, as shown by applying it successfully to English constituency parsing with both large and limited training data (see the attention sketch below).
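As a rough illustration (not code from this paper or its references), the Transformer's core scaled dot-product attention operation, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, can be sketched in a few lines of NumPy:

```python
# Sketch for illustration only: scaled dot-product attention.
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # weighted sum of values

# Example: 3 query positions attending over 4 key/value positions, dimension 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)   # (3, 8)
```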
Deep Network Guided Proof Search
- Computer Science, LPAR
- 2017
Experimental evidence is given that with a hybrid, two-phase approach, deep learning based guidance can significantly reduce the average number of proof search steps while increasing the number of theorems proved.
Dynamic Neural Program Embedding for Program Repair
- Computer Science, ICLR
- 2018
A novel semantic program embedding learned from program execution traces is proposed, showing that program states expressed as sequential tuples of live variable values not only capture program semantics more precisely, but also offer a more natural fit for Recurrent Neural Networks to model.
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- Computer Science, NAACL
- 2019
A new language representation model, BERT, designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, which can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
Mathematical Reasoning in Latent Space
- Computer Science, ICLR
- 2020
The experiments show that graph neural networks can make non-trivial predictions about the rewrite-success of statements, even when they propagate predicted latent representations for several steps, a strong indicator for the feasibility of deduction in latent space in general.
Graph Representations for Higher-Order Logic and Theorem Proving
- Computer Science, AAAI
- 2020
This paper presents the first use of graph neural networks (GNNs) for higher-order proof search and demonstrates that GNNs can improve upon state-of-the-art results in this domain. Interactive,…
Analysing Mathematical Reasoning Abilities of Neural Models
- Computer Science, ICLR
- 2019
This paper conducts a comprehensive analysis of models from two broad classes of the most powerful sequence-to-sequence architectures and finds notable differences in their ability to solve mathematical problems and generalize their knowledge.