Random Sum-Product Networks: A Simple and Effective Approach to Probabilistic Deep Learning
This work follows a simple "deep learning" approach: it generates unspecialized random structures, scalable to millions of parameters, and subsequently applies GPU-based optimization. The resulting models yield well-calibrated uncertainties and stand out among most deep generative and discriminative models in being robust to missing features and able to detect anomalies.
SPFlow: An Easy and Extensible Library for Deep Probabilistic Learning using Sum-Product Networks
We introduce SPFlow, an open-source Python library providing a simple interface to inference, learning, and manipulation routines for deep and tractable probabilistic models called Sum-Product Networks.
Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits
This paper proposes EiNets, a novel implementation design for PCs that combines a large number of arithmetic operations in a single monolithic einsum-operation, leading to speedups and memory savings of up to two orders of magnitude, in comparison to previous implementations.
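To illustrate the idea of collapsing a layer's arithmetic into one einsum, here is a minimal sketch (not the paper's implementation): a sum-product layer that multiplies pairs of child values and takes weighted sums, expressed as a single `np.einsum` call instead of a per-node loop. All shapes and names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
batch, num_children, K = 32, 8, 4

# Values of left and right child units (illustrative placeholders).
left = rng.random((batch, num_children))
right = rng.random((batch, num_children))

# Normalized mixture weights for K sum nodes over the child products.
weights = rng.random((K, num_children))
weights /= weights.sum(axis=1, keepdims=True)

# Products of matching left/right children, then a weighted sum per sum
# node, all fused into a single einsum over the child index c:
out = np.einsum('bc,bc,kc->bk', left, right, weights)

# Equivalent (slower) loop-based computation, node by node:
ref = np.stack([(left * right) @ weights[k] for k in range(K)], axis=1)
assert np.allclose(out, ref)
```

The fused form avoids per-node Python overhead and lets the backend dispatch one large contraction, which is the kind of saving the paper reports at scale.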
Structured Object-Aware Physics Prediction for Video Modeling and Planning
STOVE is presented, a novel state-space model for videos which explicitly reasons about objects and their positions, velocities, and interactions. It outperforms previous unsupervised models and even approaches the performance of supervised baselines.
Probabilistic Deep Learning using Random Sum-Product Networks
This paper makes a drastic simplification and uses random SPN structures which are trained in a "classical deep learning manner", i.e., employing automatic differentiation, SGD, and GPU support. This yields prediction results comparable to deep neural networks, while still being interpretable as a generative model and maintaining well-calibrated uncertainties.
Faster Attend-Infer-Repeat with Tractable Probabilistic Models
Excessive overlap between objects is discouraged via an unnormalized penalty term on p(z_where), modelled as a Gamma distribution with α = 1, β = 120 over each object's occlusion ratio, i.e. the fraction of its pixels that are occluded and will thus be marginalized.
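As a hedged sketch of the penalty described above (function name and the clamping guard are my assumptions, not the paper's code): with α = 1 the unnormalized Gamma log-density over the occlusion ratio reduces to a simple linear penalty −β·r, so heavier occlusion is penalized linearly.

```python
import math

# Unnormalized Gamma(alpha=1, beta=120) log-density over the occlusion
# ratio r in [0, 1], as a penalty on p(z_where). With alpha = 1 the
# (alpha - 1) * log(r) term vanishes, leaving -beta * r.
ALPHA, BETA = 1.0, 120.0

def overlap_log_penalty(occlusion_ratio: float) -> float:
    """Unnormalized Gamma log-density: (alpha - 1) * log(r) - beta * r."""
    r = max(occlusion_ratio, 1e-12)  # guard log(0) for fully visible objects
    return (ALPHA - 1.0) * math.log(r) - BETA * r

# Heavier occlusion -> lower log-density -> stronger discouragement.
assert overlap_log_penalty(0.5) < overlap_log_penalty(0.01)
```

The large rate β = 120 makes the density concentrate near r = 0, so even modest overlap incurs a substantial penalty.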
Conditional Sum-Product Networks: Imposing Structure on Deep Probabilistic Architectures
The notion of SPNs is extended towards conditional distributions, which combine simple conditional models into high-dimensional ones. These can naturally be used to impose structure on deep probabilistic models and allow for mixed data types, while maintaining fast and efficient inference.
Decomposing 3D Scenes into Objects via Unsupervised Volume Segmentation
We present ObSuRF, a method which turns a single image of a scene into a 3D model represented as a set of Neural Radiance Fields (NeRFs), with each NeRF corresponding to a different object.
Residual Sum-Product Networks
This paper presents a residual learning approach to ease the learning of SPNs, which are deeper and wider than those used previously, and introduces an iterative pruning technique that compacts models and yields better generalization. Expand
Random Sum-Product Forests with Residual Links
This paper presents random sum-product forests (RSPFs), an ensemble approach for mixing multiple randomly generated SPNs, and introduces residual links, which reference specialized substructures of other component SPNs in order to leverage the context-specific knowledge encoded within them. Expand