# Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge

```
@article{Serafini2016LogicTN,
  title   = {Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge},
  author  = {Luciano Serafini and Artur S. d'Avila Garcez},
  journal = {ArXiv},
  year    = {2016},
  volume  = {abs/1606.04422}
}
```

We propose Logic Tensor Networks: a uniform framework for integrating automatic learning and reasoning. We show how Real Logic can be implemented in deep Tensor Neural Networks with the use of Google's TensorFlow primitives. The paper concludes with experiments applying Logic Tensor Networks on a simple but representative example of knowledge completion.
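The core idea — grounding logical constants as real vectors, predicates as differentiable functions into [0, 1], and connectives as fuzzy-logic operations — can be sketched as follows. This is a minimal illustration, not the paper's actual implementation; the predicate here is a plain logistic model standing in for the paper's tensor network, and NumPy stands in for TensorFlow primitives.

```python
# Minimal sketch of Real Logic grounding (illustrative, not the paper's code):
# constants ground to real vectors, predicates to functions into [0, 1],
# and connectives to fuzzy-logic operations (Lukasiewicz semantics).
import numpy as np

def luk_and(a, b):
    # Lukasiewicz t-norm: max(0, a + b - 1)
    return max(0.0, a + b - 1.0)

def luk_or(a, b):
    # Lukasiewicz t-conorm: min(1, a + b)
    return min(1.0, a + b)

def luk_implies(a, b):
    # Residuum of the Lukasiewicz t-norm: min(1, 1 - a + b)
    return min(1.0, 1.0 - a + b)

def predicate(w, b, x):
    # Toy predicate grounding: logistic function of a linear map,
    # standing in for the paper's tensor-network predicate.
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

x = np.array([0.5, -1.0])           # grounding of a constant
w, b = np.array([1.0, 2.0]), 0.1    # learnable predicate parameters
p = predicate(w, b, x)              # truth degree of P(x) in [0, 1]
print(luk_and(p, 0.8))              # truth degree of P(x) AND Q
```

In the paper's setting these operations are built from differentiable TensorFlow ops, so the truth degree of a whole knowledge base can be maximized by gradient descent over the predicate parameters.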

## 178 Citations

### Extending Real Logic with Aggregate Functions

- Computer Science, NeSy
- 2021

The resulting framework combines the strengths of descriptive statistics modeled by fuzzy predicates, the expressiveness of FOL for writing complex queries and formulas, and SQL-like aggregation of insights from data tables, formalizing such aggregate functions within Real Logic.
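The idea of aggregating fuzzy truth values in an SQL-like fashion can be illustrated with a short sketch (function names here are illustrative, not the paper's):

```python
# Hedged sketch of aggregates over fuzzy truth degrees: soft analogues
# of SQL's COUNT and of a proportion, applied to per-row predicate values.
def fuzzy_count(truth_values):
    # Sum of truth degrees: a soft COUNT(*) WHERE P(row).
    return sum(truth_values)

def fuzzy_mean(truth_values):
    # Soft proportion of rows satisfying a fuzzy predicate P.
    return sum(truth_values) / len(truth_values) if truth_values else 0.0

rows = [0.9, 0.2, 0.7, 1.0]   # P(row_i) truth degrees from a fuzzy predicate
print(fuzzy_count(rows))      # 2.8
print(fuzzy_mean(rows))       # 0.7
```

Because both aggregates are differentiable in the underlying truth degrees, they compose with learned fuzzy predicates in the same way ordinary Real Logic formulas do.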

### DeepLogic: End-to-End Logical Reasoning

- Computer Science, ArXiv
- 2018

This article defines 12 classes of logic programs that exemplify increasing levels of complexity of the inference process (multi-hop and default reasoning) and shows that the proposed Neural Inference Network (NIN) passes 10 of the 12 tasks.

### DeepLogic: Towards End-to-End Differentiable Logical Reasoning

- Computer Science, AAAI Spring Symposium: Combining Machine Learning with Knowledge Engineering
- 2019

This paper explores how symbolic logic, expressed as logic programs at the character level, can be represented in a high-dimensional vector space using RNN-based iterative neural networks to perform reasoning.

### TensorLog: A Probabilistic Database Implemented Using Deep-Learning Infrastructure

- Computer Science, J. Artif. Intell. Res.
- 2020

An implementation of a probabilistic first-order logic called TensorLog, in which classes of logical queries are compiled into differentiable functions in a neural-network infrastructure such as TensorFlow or Theano, which enables high-performance deep learning frameworks to be used for tuning the parameters of a probabilistic logic.
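TensorLog's core trick — representing a binary relation as a matrix over entities, so that query answering becomes differentiable matrix algebra — can be sketched in a few lines (the entity names and helper function here are illustrative only):

```python
# Simplified sketch of the TensorLog idea: a relation is a matrix over
# entities, and answering parent(X, alice) is a matrix-vector product
# over a one-hot encoding of alice -- differentiable in the matrix entries.
import numpy as np

entities = ["alice", "bob", "carol"]
idx = {e: i for i, e in enumerate(entities)}

# Facts parent(bob, alice) and parent(carol, bob) as a relation matrix:
# parent[i, j] = 1.0 encodes parent(entities[i], entities[j]).
parent = np.zeros((3, 3))
parent[idx["bob"], idx["alice"]] = 1.0
parent[idx["carol"], idx["bob"]] = 1.0

def query(relation, obj):
    # Score vector over subjects X for relation(X, obj).
    v = np.zeros(len(entities))
    v[idx[obj]] = 1.0
    return relation @ v

print(query(parent, "alice"))        # bob scores 1.0

# Rule composition (grandparent(X, Z) :- parent(X, Y), parent(Y, Z))
# becomes matrix multiplication:
grandparent = parent @ parent
print(query(grandparent, "alice"))   # carol scores 1.0
```

Since every step is a linear-algebra op, the relation weights can be tuned by gradient descent inside a standard deep learning framework, which is exactly the point of the compilation.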

### Logical Neural Networks

- Computer Science, ArXiv
- 2020

A novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning), which enables the open-world assumption by maintaining bounds on truth values; the bounds can have probabilistic semantics, yielding resilience to incomplete knowledge.
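The bounded-truth-value idea can be illustrated with a small sketch (this is not the paper's implementation; it just shows how [lower, upper] bounds propagate through a fuzzy conjunction and keep "unknown" distinct from "false"):

```python
# Illustrative sketch of truth-value bounds: each proposition carries a
# (lower, upper) interval in [0, 1], so unknown = (0, 1) is distinguishable
# from false = (0, 0), supporting an open-world reading.
def and_bounds(a, b):
    # Lukasiewicz conjunction applied pointwise to interval bounds.
    lo = max(0.0, a[0] + b[0] - 1.0)
    hi = max(0.0, a[1] + b[1] - 1.0)
    return (lo, hi)

TRUE, FALSE, UNKNOWN = (1.0, 1.0), (0.0, 0.0), (0.0, 1.0)
print(and_bounds(TRUE, UNKNOWN))   # (0.0, 1.0): the conjunction stays unknown
print(and_bounds(TRUE, TRUE))      # (1.0, 1.0)
print(and_bounds(TRUE, FALSE))     # (0.0, 0.0)
```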

### Augmenting Deep Learning with Relational Knowledge from Markov Logic Networks

- Computer Science, 2020 IEEE International Conference on Big Data (Big Data)
- 2020

This paper develops a novel model that combines the best of both worlds, namely, the scalable learning capabilities of DNNs and symbolic knowledge specified in MLNs, and outperforms purely-MLN or purely-DNN based models in several different problem domains.

### Deep Adaptive Semantic Logic (DASL): Compiling Declarative Knowledge into Deep Neural Networks

- Computer Science, ArXiv
- 2020

Deep Adaptive Semantic Logic is introduced, a novel framework for automating the generation of deep neural networks that incorporates user-provided formal knowledge to improve learning from data; formal semantics are provided demonstrating that this representation captures all of first-order logic.

### TensorLog : Deep Learning Meets Probabilistic Databases

- Computer Science
- 2017

An implementation of a probabilistic first-order logic called TensorLog, in which classes of logical queries are compiled into differentiable functions in a neural-network infrastructure such as TensorFlow or Theano, which enables high-performance deep learning frameworks to be used for tuning the parameters of a probabilistic logic.

### TensorLog: Deep Learning Meets Probabilistic DBs

- Computer Science, ArXiv
- 2017

An implementation of a probabilistic first-order logic called TensorLog, in which classes of logical queries are compiled into differentiable functions in a neural-network infrastructure such as TensorFlow or Theano, which enables high-performance deep learning frameworks to be used for tuning the parameters of a probabilistic logic.

## References

Showing 1-10 of 38 references.

### Fast relational learning using bottom clause propositionalization with artificial neural networks

- Computer Science, Machine Learning
- 2013

A fast method and system for relational learning based on a novel propositionalization called Bottom Clause Propositionalization (BCP) is introduced, which can achieve accuracy comparable to Aleph; it is extended to include a statistical feature selection method, mRMR, with preliminary results indicating that a reduction of more than 90% of features can be achieved with a small loss of accuracy.

### Markov logic networks

- Computer Science, Machine Learning
- 2006

Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach to combining first-order logic and probabilistic graphical models in a single representation.

### From machine learning to machine reasoning

- Computer Science, Machine Learning
- 2013

Instead of trying to bridge the gap between machine learning systems and sophisticated "all-purpose" inference mechanisms, the set of manipulations applicable to training systems can be algebraically enriched, and reasoning capabilities can be built from the ground up.

### Statistical Relational Artificial Intelligence: Logic, Probability, and Computation

- Computer Science, Statistical Relational Artificial Intelligence: Logic, Probability, and Computation
- 2016

This book focuses on two representations in detail: Markov logic networks, a relational extension of undirected graphical models and weighted first-order predicate calculus formulas, and ProbLog, a probabilistic extension of logic programs that can also be viewed as a Turing-complete relational extension of Bayesian networks.

### Learning Deep Architectures for AI

- Computer Science, Found. Trends Mach. Learn.
- 2007

The motivations and principles regarding learning algorithms for deep architectures are discussed, in particular those exploiting as building blocks unsupervised learning of single-layer models, such as Restricted Boltzmann Machines, used to construct deeper models such as Deep Belief Networks.

### Learning Relational Sum-Product Networks

- Computer Science, AAAI
- 2015

This paper introduces Relational Sum-Product Networks (RSPNs), a new tractable first-order probabilistic architecture that generalizes SPNs by modeling a set of instances jointly, allowing them to influence each other's probability distributions, as well as modeling probabilities of relations between objects.

### Hybrid Markov Logic Networks

- Computer Science, AAAI
- 2008

Experiments in a mobile robot mapping domain, involving joint classification, clustering, and regression, illustrate the power of hybrid MLNs as a modeling language and the accuracy and efficiency of the inference algorithms.

### Meta-interpretive learning of higher-order dyadic datalog: predicate invention revisited

- Computer Science, Machine Learning
- 2014

This paper shows that, with an infinite signature, the higher-order dyadic datalog class H²₂ has universal Turing expressivity, though H²₂ is decidable given a finite signature, and generalises the approach of meta-interpretive learning (MIL) to that of learning higher-order dyadic datalog programs.

### Neural-Symbolic Learning and Reasoning (Dagstuhl Seminar 14381)

- Computer Science, Dagstuhl Reports
- 2014

The aim of the seminar was to explore the interface among several fields that contribute to the effective integration of cognitive abilities such as learning, reasoning, vision and language understanding in intelligent and cognitive computational systems.

### Robust logics

- Computer Science, STOC '99
- 1999

It is suggested that brittleness can be overcome by using a new kind of logic in which each statement is learnable, which enables the system to acquire a set of statements approximately consistent with each other and with the world, without the need for a globally knowledgeable and consistent programmer.