Corpus ID: 229363667

Self-supervised self-supervision by combining deep learning and probabilistic logic

@article{Lang2021SelfsupervisedSB,
  title={Self-supervised self-supervision by combining deep learning and probabilistic logic},
  author={Hunter Lang and Hoifung Poon},
  journal={ArXiv},
  year={2021},
  volume={abs/2012.12474}
}
Labeling training examples at scale is a perennial challenge in machine learning. Self-supervision methods compensate for the lack of direct supervision by leveraging prior knowledge to automatically generate noisy labeled examples. Deep probabilistic logic (DPL) is a unifying framework for self-supervised learning that represents unknown labels as latent variables and incorporates diverse self-supervision using probabilistic logic to train a deep neural network end-to-end using variational EM…
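The abstract's description of DPL-style training (latent labels, a probabilistic-logic supervision module, and a neural prediction module trained jointly with variational EM) can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes binary classification, uses a simple logistic model in place of the deep prediction module, represents the self-supervision as two noisy labeling functions, and all names (rule_1, supervision_logits, predict_proba, etc.) are illustrative.

```python
# Minimal sketch of variational-EM training in the spirit of DPL (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabeled data: 200 points in 2-D; the true label (sign of feature 0)
# is used only to simulate noisy rules and to report accuracy at the end.
X = rng.normal(size=(200, 2))
true_y = (X[:, 0] > 0).astype(int)

# Two noisy labeling functions (self-supervision sources): vote in {+1, -1, 0 = abstain}.
def rule_1(x):
    return 1 if x[0] > 0.2 else (-1 if x[0] < -0.2 else 0)
def rule_2(x):
    return 1 if x[1] > 0.5 else (-1 if x[1] < -0.5 else 0)
votes = np.array([[rule_1(x), rule_2(x)] for x in X])   # shape (N, 2)

theta = np.zeros(3)          # prediction module: logistic weights + bias
w = np.ones(votes.shape[1])  # supervision module: one weight per rule

def predict_proba(X, theta):
    z = X @ theta[:2] + theta[2]
    p1 = 1.0 / (1.0 + np.exp(-z))
    return np.stack([1 - p1, p1], axis=1)   # columns: P(y=0), P(y=1)

def supervision_logits(votes, w):
    # Log-potential for each label: weighted count of rules voting for it.
    agree_1 = (votes == 1) @ w
    agree_0 = (votes == -1) @ w
    return np.stack([agree_0, agree_1], axis=1)

for step in range(100):
    # E-step: variational posterior over the latent labels combines the
    # prediction module and the supervision module, then normalizes.
    log_q = np.log(predict_proba(X, theta) + 1e-12) + supervision_logits(votes, w)
    q = np.exp(log_q - log_q.max(axis=1, keepdims=True))
    q /= q.sum(axis=1, keepdims=True)

    # M-step (prediction module): gradient ascent on E_q[log p_theta(y|x)],
    # i.e. cross-entropy against the soft labels q.
    p = predict_proba(X, theta)
    grad_z = q[:, 1] - p[:, 1]
    grad_theta = np.concatenate([X.T @ grad_z, [grad_z.sum()]]) / len(X)
    theta += 0.5 * grad_theta

    # M-step (supervision module): gradient ascent on E_q[log p_w(y|votes)],
    # which is E_q[agreement features] minus the model's expected features.
    s = supervision_logits(votes, w)
    ps = np.exp(s - s.max(axis=1, keepdims=True))
    ps /= ps.sum(axis=1, keepdims=True)
    feat = (votes == 1) * q[:, [1]] + (votes == -1) * q[:, [0]]
    exp_feat = (votes == 1) * ps[:, [1]] + (votes == -1) * ps[:, [0]]
    w += 0.1 * (feat - exp_feat).mean(axis=0)

print("accuracy:", ((predict_proba(X, theta)[:, 1] > 0.5) == true_y).mean())
```

In this sketch the E-step plays the role of label inference over the latent variables, while the two M-step updates train the neural predictor on the inferred soft labels and re-weight the self-supervision sources; the real DPL framework generalizes this to arbitrary probabilistic-logic factors and deep networks.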

