Corpus ID: 1526103

Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge

@article{Serafini2016LogicTN,
  title={Logic Tensor Networks: Deep Learning and Logical Reasoning from Data and Knowledge},
  author={Luciano Serafini and Artur S. d'Avila Garcez},
  journal={ArXiv},
  year={2016},
  volume={abs/1606.04422}
}
We propose Logic Tensor Networks: a uniform framework for integrating automatic learning and reasoning. We show how Real Logic can be implemented in deep tensor neural networks using Google's TensorFlow primitives. The paper concludes with experiments applying Logic Tensor Networks to a simple but representative example of knowledge completion.
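The abstract's central idea is that logical symbols can be grounded in differentiable operations: terms become real vectors, predicates become functions into [0, 1], and connectives become fuzzy operators. A minimal NumPy sketch of that idea follows; this is not the paper's TensorFlow implementation, and the one-layer predicates and Reichenbach implication are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def predicate(w, b):
    """Build a toy one-layer predicate: sigmoid(w . x + b), a truth value in (0, 1)."""
    return lambda x: 1.0 / (1.0 + np.exp(-(w @ x + b)))

# Hypothetical predicates over 4-dimensional groundings (names illustrative).
Smokes = predicate(rng.normal(size=4), 0.0)
Cancer = predicate(rng.normal(size=4), 0.0)

def t_and(a, b):
    # Product t-norm for fuzzy conjunction.
    return a * b

def implies(a, b):
    # Reichenbach implication: 1 - a + a*b, again in [0, 1].
    return 1.0 - a + a * b

x = rng.normal(size=4)                 # grounding of a constant
truth = implies(Smokes(x), Cancer(x))  # fuzzy truth of Smokes(x) -> Cancer(x)
assert 0.0 <= truth <= 1.0             # truth values stay in [0, 1]
```

Because every operation here is differentiable, the predicate weights can in principle be trained by gradient descent to maximize the truth of a knowledge base, which is the learning setup the abstract describes.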

Figures and Tables from this paper

Logic Tensor Networks

Extending Real Logic with Aggregate Functions

The resulting framework combines the strengths of descriptive statistics modeled by fuzzy predicates, first-order logic for writing complex queries and formulas, and SQL-like aggregation of insights from data tables, and formalizes such aggregate functions within Real Logic.
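One way such aggregates behave like fuzzy quantifiers can be sketched with a generalized mean, which interpolates toward a strict forall (the minimum) as the exponent decreases. The function name and exponent below are illustrative assumptions, not the paper's definitions.

```python
import numpy as np

def soft_forall(truths, p=-4.0):
    """Generalized mean with negative exponent: a 'soft forall' over fuzzy truths.

    As p -> -inf this approaches min(truths), i.e. a strict universal quantifier.
    """
    truths = np.clip(np.asarray(truths, dtype=float), 1e-6, 1.0)
    return float(np.mean(truths ** p) ** (1.0 / p))

# Fuzzy truth of a per-row condition, e.g. age(x) > 18, over a data column.
ages_ok = [0.9, 0.95, 0.2, 1.0]
print(soft_forall(ages_ok))  # dragged down toward the 0.2 outlier
```

Unlike a plain mean, this aggregate penalizes a single strong violation heavily, which matches the intuition that a universal claim is only as true as its worst instance.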

DeepLogic: End-to-End Logical Reasoning

This article defines 12 classes of logic programs that exemplify increasing levels of complexity of the inference process (multi-hop and default reasoning) and shows that the proposed Neural Inference Network (NIN) passes 10 of the 12 tasks.

DeepLogic: Towards End-to-End Differentiable Logical Reasoning

This paper explores how symbolic logic, given as logic programs at the character level, can be represented in a high-dimensional vector space using RNN-based iterative neural networks that perform reasoning.

TensorLog: A Probabilistic Database Implemented Using Deep-Learning Infrastructure

An implementation of a probabilistic first-order logic called TensorLog, in which classes of logical queries are compiled into differentiable functions in a neural-network infrastructure such as TensorFlow or Theano, enabling high-performance deep learning frameworks to be used for tuning the parameters of a probabilistic logic.
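TensorLog's compilation of queries into differentiable functions rests on a standard trick: a binary relation over n entities becomes an n x n matrix, a query becomes a one-hot vector, and chaining rule bodies becomes matrix multiplication. The toy sketch below illustrates that trick only; it is not the actual TensorLog library, and the entities and relations are hypothetical.

```python
import numpy as np

entities = ["alice", "bob", "carol"]
idx = {e: i for i, e in enumerate(entities)}
n = len(entities)

# parent(X, Y) as an n x n 0/1 matrix.
parent = np.zeros((n, n))
parent[idx["alice"], idx["bob"]] = 1.0   # parent(alice, bob)
parent[idx["bob"], idx["carol"]] = 1.0   # parent(bob, carol)

# grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
# Chaining the two body literals is a matrix product, which is differentiable.
grandparent = parent @ parent

# Query grandparent(alice, X)? as one-hot vector times the relation matrix.
q = np.zeros(n)
q[idx["alice"]] = 1.0
scores = q @ grandparent
assert scores[idx["carol"]] == 1.0
```

With real-valued instead of 0/1 entries, the same pipeline yields weighted proof counts, and gradients flow back to the relation weights, which is what makes the logic tunable by deep learning frameworks.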

Logical Neural Networks

A novel framework seamlessly providing key properties of both neural nets (learning) and symbolic logic (knowledge and reasoning); it supports the open-world assumption by maintaining bounds on truth values, which can have probabilistic semantics, yielding resilience to incomplete knowledge.
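The bounded-truth-value mechanism can be illustrated with intervals propagated through a Łukasiewicz conjunction: each proposition carries a (lower, upper) bound, and a fully unknown proposition is (0, 1). This is a sketch of the general idea under those assumptions, not IBM's LNN implementation.

```python
def and_bounds(a, b):
    """Propagate (lower, upper) truth bounds through a Lukasiewicz AND."""
    lo = max(0.0, a[0] + b[0] - 1.0)  # t-norm applied to lower bounds
    hi = max(0.0, a[1] + b[1] - 1.0)  # t-norm applied to upper bounds
    return (lo, hi)

unknown = (0.0, 1.0)   # open world: no commitment either way
true_ish = (0.9, 1.0)  # strong but not certain evidence

print(and_bounds(true_ish, unknown))   # bounds stay wide: conjunct underdetermined
print(and_bounds(true_ish, true_ish))  # bounds tighten when both sides are known
```

Keeping an interval rather than a point truth value is what lets the system represent "unknown" distinctly from "false", which is the open-world behavior the summary describes.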

Augmenting Deep Learning with Relational Knowledge from Markov Logic Networks

This paper develops a novel model that combines the best of both worlds, namely the scalable learning capabilities of DNNs and the symbolic knowledge specified in MLNs, and outperforms purely MLN-based or purely DNN-based models in several different problem domains.

Deep Adaptive Semantic Logic (DASL): Compiling Declarative Knowledge into Deep Neural Networks

Deep Adaptive Semantic Logic is introduced: a novel framework for automating the generation of deep neural networks that incorporate user-provided formal knowledge to improve learning from data. Formal semantics are provided that demonstrate that this representation captures all of first-order logic.

...

References

SHOWING 1-10 OF 38 REFERENCES

Fast relational learning using bottom clause propositionalization with artificial neural networks

A fast method and system for relational learning based on a novel propositionalization called Bottom Clause Propositionalization (BCP) is introduced; it achieves accuracy comparable to Aleph and is extended with a statistical feature selection method, mRMR, with preliminary results indicating that more than 90% of features can be removed with only a small loss of accuracy.

Markov logic networks

Experiments with a real-world database and knowledge base in a university domain illustrate the promise of this approach to combining first-order logic and probabilistic graphical models in a single representation.
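The core of the MLN approach is a log-linear scoring rule: a possible world's unnormalized probability is the exponentiated weighted sum of the true-grounding counts of each first-order formula. A direct sketch follows; the formulas, weights, and counts are hypothetical, and this is not the Alchemy system.

```python
import math

def world_weight(weights, counts):
    """Unnormalized MLN weight of a world: exp(sum_i w_i * n_i),
    where n_i counts the true groundings of formula i in that world."""
    return math.exp(sum(w * n for w, n in zip(weights, counts)))

# Hypothetical weighted formulas:
#   w1 = 1.5 : Smokes(x) => Cancer(x)
#   w2 = 1.1 : Friends(x, y) ^ Smokes(x) => Smokes(y)
weights = [1.5, 1.1]

world_a = [2, 1]  # true-grounding counts in world A
world_b = [1, 1]  # world B violates one more grounding of formula 1

# A world satisfying more weighted groundings gets a higher score.
assert world_weight(weights, world_a) > world_weight(weights, world_b)
```

Normalizing these weights over all possible worlds yields the MLN distribution; the point of the formulation is that violating a formula lowers a world's probability rather than ruling it out, softening hard first-order constraints.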

From machine learning to machine reasoning

  • L. Bottou, Machine Learning, 2013
Instead of trying to bridge the gap between machine learning systems and sophisticated "all-purpose" inference mechanisms, the set of manipulations applicable to trainable systems can be algebraically enriched, building reasoning capabilities from the ground up.

Statistical Relational Artificial Intelligence: Logic, Probability, and Computation

This book focuses on two representations in detail: Markov logic networks, a relational extension of undirected graphical models based on weighted first-order predicate calculus formulas, and ProbLog, a probabilistic extension of logic programs that can also be viewed as a Turing-complete relational extension of Bayesian networks.

Learning Deep Architectures for AI

The motivations and principles regarding learning algorithms for deep architectures are discussed, in particular those exploiting unsupervised learning of single-layer models, such as Restricted Boltzmann Machines, as building blocks to construct deeper models such as Deep Belief Networks.

Learning Relational Sum-Product Networks

This paper introduces Relational Sum-Product Networks (RSPNs), a new tractable first-order probabilistic architecture that generalizes SPNs by modeling a set of instances jointly, allowing them to influence each other's probability distributions, as well as modeling probabilities of relations between objects.

Hybrid Markov Logic Networks

Experiments in a mobile robot mapping domain--involving joint classification, clustering and regression--illustrate the power of hybrid MLNs as a modeling language, and the accuracy and efficiency of the inference algorithms.

Meta-interpretive learning of higher-order dyadic datalog: predicate invention revisited

This paper shows that with an infinite signature the higher-order dyadic datalog class H²₂ has universal Turing expressivity, though H²₂ is decidable given a finite signature, and generalises the approach of meta-interpretive learning (MIL) to learning higher-order dyadic datalog programs.

Neural-Symbolic Learning and Reasoning (Dagstuhl Seminar 14381)

The aim of the seminar was to explore the interface among several fields that contribute to the effective integration of cognitive abilities such as learning, reasoning, vision and language understanding in intelligent and cognitive computational systems.

Robust logics

It is suggested that brittleness can be overcome by using a new kind of logic in which each statement is learnable, which enables the system to acquire a set of statements approximately consistent with each other and with the world, without the need for a globally knowledgeable and consistent programmer.