Publications
From Variational to Deterministic Autoencoders
TLDR
A rigorous empirical study shows that the proposed regularized deterministic autoencoders generate samples comparable to, or better than, those of VAEs and more powerful alternatives, on images as well as on structured data such as molecules.
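The core idea of the paper is to replace the VAE's stochastic encoder and KL term with a deterministic encoder plus an explicit regularizer on the latent codes. A minimal sketch of such an objective, using toy arrays in place of a real encoder/decoder pass (all variable values here are hypothetical, not the paper's experiments):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for one forward pass: x is a batch of inputs, z the
# deterministic latent codes, x_hat the reconstructions.
x = rng.normal(size=(8, 4))
z = rng.normal(size=(8, 2))
x_hat = x + 0.1 * rng.normal(size=(8, 4))

# RAE-style objective: reconstruction error plus an explicit L2 penalty
# on the latent codes, in place of the VAE's KL divergence term.
beta = 0.5  # regularization weight (a tunable hyperparameter)
recon = np.mean(np.sum((x - x_hat) ** 2, axis=1))
latent_penalty = np.mean(np.sum(z ** 2, axis=1))
loss = recon + beta * latent_penalty
```

The same structure accommodates other decoder regularizers (e.g. weight decay) as further additive terms.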
Simplifying, Regularizing and Strengthening Sum-Product Network Structure Learning
TLDR
This work enhances one of the best structure learners, LearnSPN, aiming to improve both the structural quality of the learned networks and their achieved likelihoods, and supports these claims by empirically evaluating the learned SPNs on several benchmark datasets against other competitive SPN and PGM structure learners.
Mixed Sum-Product Networks: A Deep Architecture for Hybrid Domains
TLDR
This work proposes the first trainable probabilistic deep architecture for hybrid domains that features tractable queries and relieves the user from deciding the parametric form of the random variables a priori, while remaining expressive enough to effectively approximate any distribution and permitting efficient learning and inference.
Random Sum-Product Networks: A Simple and Effective Approach to Probabilistic Deep Learning
TLDR
This work follows a simple “deep learning” approach: generating unspecialized random structures, scalable to millions of parameters, and subsequently applying GPU-based optimization. The resulting models yield well-calibrated uncertainties and stand out among most deep generative and discriminative models in being robust to missing features and able to detect anomalies.
SPFlow: An Easy and Extensible Library for Deep Probabilistic Learning using Sum-Product Networks
We introduce SPFlow, an open-source Python library providing a simple interface to inference, learning and manipulation routines for deep and tractable probabilistic models called Sum-Product Networks (SPNs).
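To make the model class concrete, here is a minimal SPN evaluated bottom-up in plain NumPy: a root sum node mixing two product nodes, each a product of Gaussian leaves over two variables. This is an illustrative sketch, not SPFlow's API; all parameter values are invented:

```python
import numpy as np

def gaussian_pdf(x, mean, std):
    """Univariate Gaussian density, evaluated elementwise."""
    return np.exp(-0.5 * ((x - mean) / std) ** 2) / (std * np.sqrt(2 * np.pi))

# Mixture weights of the root sum node (must sum to 1).
weights = np.array([0.3, 0.7])
# Leaf means: component 0 centers both variables at 0, component 1 at 3.
means = np.array([[0.0, 0.0], [3.0, 3.0]])

def spn_density(x):
    """Density of a 2-variable point x under the toy SPN."""
    leaf = gaussian_pdf(x[None, :], means, 1.0)  # leaves: (2 components, 2 vars)
    prod = leaf.prod(axis=1)                     # product nodes: factorize over vars
    return float(weights @ prod)                 # sum node: weighted mixture
```

Evaluating any marginal or conditional query in such a network reduces to similar single bottom-up passes, which is what makes the model class tractable.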
Probabilistic Deep Learning using Random Sum-Product Networks
TLDR
This paper makes a drastic simplification and uses random SPN structures which are trained in a "classical deep learning manner", i.e., employing automatic differentiation, SGD, and GPU support. This yields prediction results comparable to deep neural networks, while still being interpretable as a generative model and maintaining well-calibrated uncertainties.
Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits
TLDR
This paper proposes EiNets, a novel implementation design for PCs that combines a large number of arithmetic operations in a single monolithic einsum-operation, leading to speedups and memory savings of up to two orders of magnitude, in comparison to previous implementations.
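The speedup comes from fusing many small per-node operations into one batched tensor contraction. A simplified sketch of the idea, assuming K sum nodes each mixing C children over a batch of N samples (illustrative and linear-domain; the paper additionally works in log-space for numerical stability):

```python
import numpy as np

rng = np.random.default_rng(0)

K, C, N = 16, 4, 8
W = rng.random((K, C))
W /= W.sum(axis=1, keepdims=True)   # normalized mixture weights per sum node
children = rng.random((K, C, N))    # values of the C children of each node

# Naive evaluation: one small matrix-vector product per sum node.
loop_out = np.stack([W[k] @ children[k] for k in range(K)])

# EiNet-style evaluation: all K products fused into one einsum call,
# which GPUs execute as a single batched contraction.
einsum_out = np.einsum('kc,kcn->kn', W, children)

assert np.allclose(loop_out, einsum_out)
```

A single large einsum amortizes kernel-launch and memory-access overhead across all nodes in a layer, which is where the reported order-of-magnitude gains come from.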
Sum-Product Autoencoding: Encoding and Decoding Representations Using Sum-Product Networks
TLDR
The experimental results on several multilabel classification problems demonstrate that SPAE is competitive with state-of-the-art autoencoder architectures, even if the SPNs were never trained to reconstruct their inputs.
End-to-end Learning of Deep Spatio-temporal Representations for Satellite Image Time Series Classification
TLDR
This paper describes the first-place solution to the discovery challenge on time series land cover classification (TiSeLaC), organized in conjunction with ECML PKDD 2017, comprising modules using dense multi-layer perceptrons and one-dimensional convolutional neural networks.
Automatic Bayesian Density Analysis
TLDR
Automatic Bayesian Density Analysis (ABDA) allows for automatic and efficient missing value estimation, statistical data type and likelihood discovery, anomaly detection and dependency structure mining, on top of providing accurate density estimation.
...