Semantic Probabilistic Layers for Neuro-Symbolic Learning

@article{Ahmed2022SemanticPL,
  title={Semantic Probabilistic Layers for Neuro-Symbolic Learning},
  author={Kareem Ahmed and Stefano Teso and Kai-Wei Chang and Guy Van den Broeck and Antonio Vergari},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.00426}
}
We design a predictive layer for structured-output prediction (SOP) that can be plugged into any neural network, guaranteeing that its predictions are consistent with a set of predefined symbolic constraints. Our Semantic Probabilistic Layer (SPL) can model intricate correlations, and hard constraints, over a structured output space while being amenable to end-to-end learning via maximum likelihood. SPLs combine exact probabilistic inference with logical reasoning in a clean and modular way…
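To make the idea concrete, below is a minimal, hypothetical sketch (not the paper's implementation) of a predictive layer that assigns probability mass only to constraint-satisfying outputs, using brute-force enumeration of a toy "exactly-one" constraint; SPLs instead rely on compiled circuits for exact and efficient inference.

```python
import torch

# Hypothetical sketch: a predictive layer that renormalizes a factorized
# distribution over only the outputs satisfying a toy "exactly-one" constraint.
# Brute-force enumeration stands in for the circuit-based exact inference
# described in the paper.
def constrained_predict(logits: torch.Tensor) -> torch.Tensor:
    n = logits.shape[-1]
    assignments = torch.cartesian_prod(*([torch.tensor([0.0, 1.0])] * n))  # (2^n, n)
    valid = assignments.sum(dim=-1) == 1                                   # exactly one label on
    log_p = (assignments * torch.nn.functional.logsigmoid(logits)
             + (1 - assignments) * torch.nn.functional.logsigmoid(-logits)).sum(-1)
    log_p = log_p.masked_fill(~valid, float("-inf"))
    return torch.softmax(log_p, dim=-1)   # invalid outputs get probability exactly 0

print(constrained_predict(torch.randn(3)))
```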


References

Showing 1-10 of 86 references

A Semantic Loss Function for Deep Learning with Symbolic Knowledge

A semantic loss function, derived from first principles, bridges neural output vectors and logical constraints and significantly increases a neural network's ability to predict structured objects such as rankings and paths.
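For illustration, here is a hedged sketch of the semantic-loss idea under an assumed toy "exactly-one" constraint: the loss is the negative log of the probability that the constraint holds under the network's factorized output distribution (its weighted model count). Enumeration is used here for clarity; the original work compiles constraints into circuits.

```python
import torch

# Hedged sketch of the semantic-loss idea for a toy "exactly-one" constraint:
# the loss is -log of the probability that the constraint holds under the
# network's factorized output distribution (its weighted model count).
def semantic_loss_exactly_one(probs: torch.Tensor) -> torch.Tensor:
    idx = torch.arange(len(probs))
    # P(exactly one variable is true) = sum_i p_i * prod_{j != i} (1 - p_j)
    wmc = sum(probs[i] * (1 - probs)[idx != i].prod() for i in range(len(probs)))
    return -torch.log(wmc)

p = torch.sigmoid(torch.randn(4, requires_grad=True))
loss = semantic_loss_exactly_one(p)
loss.backward()   # differentiable, so it can be added to the task loss
```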

Neuro-Symbolic Entropy Regularization

A loss is proposed, neuro-symbolic entropy regularization, that encourages the model to confidently predict a valid object and seamlessly integrates with other neuro-symbolic losses that eliminate invalid predictions.
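A rough sketch of this idea, again under an assumed exactly-one constraint: the regularizer is the entropy of the model's distribution restricted to constraint-satisfying outputs, which is minimized so the model commits to a single valid structure.

```python
import torch

# Rough sketch: entropy of the model's distribution restricted to the valid
# (one-hot) outputs of a toy exactly-one constraint; minimizing it pushes the
# model to commit to a single valid structure.
def constrained_entropy(probs: torch.Tensor) -> torch.Tensor:
    n = len(probs)
    idx = torch.arange(n)
    scores = torch.stack([probs[i] * (1 - probs)[idx != i].prod() for i in range(n)])
    q = scores / scores.sum()          # distribution over valid outputs only
    return -(q * q.log()).sum()        # H(q)

print(constrained_entropy(torch.sigmoid(torch.randn(5))))
```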

Learning and Inference for Structured Prediction: A Unifying Perspective

A unifying perspective on the different frameworks that address structured prediction problems is presented, comparing them in terms of their strengths and weaknesses.

MultiplexNet: Towards Fully Satisfied Logical Constraints in Neural Networks

This work proposes a novel way to incorporate expert knowledge into the training of deep neural networks by representing domain knowledge as a quantifier-free logical formula in disjunctive normal form (DNF), which is easy to encode and to elicit from human experts.
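As a toy illustration (not taken from the paper), a DNF formula can be encoded as a list of conjunctive terms, with an output considered valid if it satisfies at least one term:

```python
# Toy illustration (not from the paper): domain knowledge as a quantifier-free
# DNF formula, i.e. a list of conjunctive terms mapping variables to required
# truth values. An output is valid if it satisfies at least one term.
dnf = [
    {"is_cat": True, "has_fur": True},     # term 1: is_cat AND has_fur
    {"is_fish": True, "has_fur": False},   # term 2: is_fish AND NOT has_fur
]

def satisfies(assignment: dict, formula: list) -> bool:
    return any(all(assignment.get(v) == val for v, val in term.items())
               for term in formula)

print(satisfies({"is_cat": True, "has_fur": True, "is_fish": False}, dnf))  # True
```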

PYLON: A PyTorch Framework for Learning with Constraints

This work introduces PYLON, a neuro-symbolic training framework that builds on PyTorch to augment procedurally trained models with declaratively specified knowledge. PYLON lets users programmatically specify constraints as Python functions and compiles them into a differentiable loss, training predictive models that fit the data while satisfying the specified constraints.
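The snippet below is not the actual PYLON API, only an assumed minimal sketch of the underlying idea: a plain Python predicate over a discrete output is turned into a differentiable loss by penalizing the probability mass the model places on violating outputs (enumeration is used here and only scales to small output spaces; the framework itself handles this compilation more generally).

```python
import torch
from itertools import product

# Assumed minimal sketch of the idea, NOT the actual PYLON API: a plain Python
# predicate over a discrete output is compiled into a differentiable loss by
# summing the probability mass assigned to violating outputs.
def constraint_loss(predicate, probs: torch.Tensor) -> torch.Tensor:
    loss = torch.tensor(0.0)
    for assignment in product([0, 1], repeat=probs.shape[-1]):
        x = torch.tensor(assignment, dtype=probs.dtype)
        p_x = (x * probs + (1 - x) * (1 - probs)).prod()
        if not predicate(assignment):
            loss = loss + p_x          # penalize mass on invalid outputs
    return loss

# Example user constraint: at least one of the four labels must be active.
probs = torch.sigmoid(torch.randn(4, requires_grad=True))
constraint_loss(lambda y: sum(y) >= 1, probs).backward()
```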

DeepProbLog: Neural Probabilistic Logic Programming

This work is the first to propose a framework where general-purpose neural networks and expressive probabilistic-logical modeling and reasoning are integrated in a way that exploits the full expressiveness and strengths of both worlds and can be trained end-to-end based on examples.

Semantic-based regularization for learning and inference

Injecting Numerical Reasoning Skills into Language Models

This work shows that numerical reasoning is amenable to automatic data generation, so this skill can be injected into pre-trained LMs by generating large amounts of data and training in a multi-task setup.

Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures

The notion of sum-product networks (SPNs) is extended to conditional distributions, which combine simple conditional models into high-dimensional ones, can naturally be used to impose structure on deep probabilistic models, and allow for mixed data types, while maintaining fast and efficient inference.

Bridging logic and kernel machines

A stage-based learning scheme is proposed, in which the learner first fits the supervised examples until convergence is reached and then continues by enforcing the logic clauses; this is shown to be a viable direction for attacking the optimization of classic numerical schemes.
...