InClass nets: independent classifier networks for nonparametric estimation of conditional independence mixture models and unsupervised classification

Konstantin T. Matchev and Prasanth Shyamsundar
Machine Learning: Science and Technology
Conditional independence mixture models (CIMMs) are an important class of statistical models used in many fields of science. We introduce a novel unsupervised machine learning technique called the independent classifier networks (InClass nets) technique for the nonparametric estimation of CIMMs. InClass nets consist of multiple independent classifier neural networks (NNs), which are trained simultaneously using suitable cost functions. Leveraging the ability of NNs to handle high-dimensional…
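As a rough illustration of the kind of cost function that can couple independently parametrized classifiers, the sketch below (plain numpy, illustrative names and data, not the paper's exact cost) scores two classifiers' probability outputs by the negative mutual information of their joint output distribution; outputs made dependent by a shared latent component drive the cost below zero, while unrelated outputs leave it near zero.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def neg_mutual_information(u, v, eps=1e-12):
    """Negative mutual information between the categorical outputs of two
    classifiers, estimated from their per-sample probabilities u, v: (N, C)."""
    p_joint = (u[:, :, None] * v[:, None, :]).mean(axis=0)   # (C, C) joint
    p_a, p_b = u.mean(axis=0), v.mean(axis=0)                # marginals
    mi = np.sum(p_joint * np.log((p_joint + eps) / (np.outer(p_a, p_b) + eps)))
    return -mi  # lower cost = stronger agreement between the two networks

rng = np.random.default_rng(0)
n, c = 5000, 2
labels = rng.integers(0, c, n)  # hidden component label, never shown to us
# two "classifiers" acting on different views, both noisily tracking the label
u = softmax(rng.normal(size=(n, c)) + 3.0 * np.eye(c)[labels])
v = softmax(rng.normal(size=(n, c)) + 3.0 * np.eye(c)[labels])
w = softmax(rng.normal(size=(n, c)))  # a classifier carrying no label information

cost_dependent = neg_mutual_information(u, v)    # well below zero
cost_independent = neg_mutual_information(u, w)  # close to zero
```

Minimizing a cost of this flavor simultaneously over all classifier parameters is what allows latent component structure to be recovered without labels.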



Semi-Parametric Estimation for Conditional Independence Multivariate Finite Mixture Models

The conditional independence assumption for nonparametric multivariate finite mixture models, a weaker form of the well-known conditional independence assumption for random effects models for…

Approximating Likelihood Ratios with Calibrated Discriminative Classifiers

It is shown that likelihood ratios are invariant under a specific class of dimensionality reduction maps, and that discriminative classifiers can be used to approximate the generalized likelihood ratio statistic when only a generative model for the data is available.
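The ratio trick behind this line of work can be sketched in a few lines: train a classifier s(x) to separate samples of p from samples of q, then read off p(x)/q(x) as s(x)/(1-s(x)). A minimal numpy version with a hand-rolled logistic regression and two unit-variance Gaussians, chosen so the true log ratio, 2x, is known in closed form (the setup is illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000
xp = rng.normal(+1.0, 1.0, n)   # samples from p = N(+1, 1)
xq = rng.normal(-1.0, 1.0, n)   # samples from q = N(-1, 1)
x = np.concatenate([xp, xq])
y = np.concatenate([np.ones(n), np.zeros(n)])  # label 1 = "drawn from p"

# train a logistic-regression classifier s(x) by plain gradient descent
w, b = 0.0, 0.0
for _ in range(2000):
    s = 1.0 / (1.0 + np.exp(-(w * x + b)))
    g = s - y                        # per-sample gradient of cross-entropy
    w -= 0.1 * np.mean(g * x)
    b -= 0.1 * np.mean(g)

# likelihood-ratio trick: r(x) = s(x) / (1 - s(x)),
# so for this model log r(x) = w*x + b
log_r_hat = lambda t: w * t + b
log_r_true = lambda t: 2.0 * t       # exact log p(x)/q(x) for these Gaussians
```

Because the true log ratio is linear in x, the logistic model is well specified and the learned w, b land near 2 and 0; in general the classifier only needs to be calibrated, not well specified.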

Nonparametric mixture models with conditionally independent multivariate component densities

An EM-Like Algorithm for Semi- and Nonparametric Estimation in Multivariate Mixtures

An algorithm is proposed for the nonparametric estimation of finite mixtures of multivariate random vectors that strongly resembles a true EM algorithm; it is much more flexible and more easily applicable than existing algorithms in the literature, and in a simulation study it yields much smaller mean integrated squared errors than an alternative algorithm.
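A stripped-down sketch of such an EM-like iteration, assuming conditional independence: component densities are products of per-coordinate weighted kernel density estimates, and responsibilities and mixing weights are re-estimated in turn. Bandwidth, initialization, and the toy data are illustrative choices, not the published algorithm's.

```python
import numpy as np

def gauss_kde(eval_x, data, weights, bw):
    # weighted Gaussian kernel density estimate evaluated at eval_x
    w = weights / weights.sum()
    z = (eval_x[:, None] - data[None, :]) / bw
    return (w[None, :] * np.exp(-0.5 * z**2)).sum(axis=1) / (bw * np.sqrt(2 * np.pi))

def np_em(x, post, n_iter=50, bw=0.3):
    """EM-like loop: each component density is a product of per-coordinate
    weighted KDEs (conditional independence); posteriors and mixing weights
    are updated in alternation."""
    n, k = x.shape
    n_comp = post.shape[1]
    lam = post.mean(axis=0)
    for _ in range(n_iter):
        dens = np.ones((n, n_comp))
        for j in range(n_comp):
            for c in range(k):
                dens[:, j] *= gauss_kde(x[:, c], x[:, c], post[:, j], bw)
        num = lam[None, :] * dens
        post = num / num.sum(axis=1, keepdims=True)   # E-step responsibilities
        lam = post.mean(axis=0)                       # mixing-weight update
    return lam, post

# toy data: two well-separated bivariate components with independent
# coordinates and mixing weights 0.3 / 0.7
rng = np.random.default_rng(2)
n = 600
z = rng.random(n) < 0.7
x = np.where(z[:, None], rng.normal(2.0, 0.7, (n, 2)), rng.normal(-2.0, 0.7, (n, 2)))

# crude initial split on the first coordinate seeds the responsibilities
p1 = np.where(x[:, 0] > 0, 0.8, 0.2)
lam, post = np_em(x, np.column_stack([1.0 - p1, p1]))
```

On this toy problem the recovered mixing weight for the second component lands near 0.7 and the posteriors classify nearly all points correctly; the real algorithm adds data-driven bandwidths and other refinements.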

Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning)

The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics, and includes detailed algorithms for supervised-learning problems in both regression and classification.

Estimation of the number of components of nonparametric multivariate finite mixture models

We propose a novel estimator for the number of components (denoted by $M$) in a K-variate nonparametric finite mixture model, where the analyst has repeated observations of $K\geq2$ variables that…

Identifiability of parameters in latent structure models with many observed variables

A general approach for establishing identifiability using algebraic arguments is demonstrated; it sheds light on the properties of finite mixtures of Bernoulli products, which have been used for decades despite being known to have nonidentifiable parameters.


Nonparametric estimation of component distributions in a multivariate mixture

Suppose k-variate data are drawn from a mixture of two distributions, each having independent components. It is desired to estimate the univariate marginal distributions in each of the products, as…

Experiments using machine learning to approximate likelihood ratios for mixture models

This work demonstrates how the results can be considerably improved by decomposing the ratio and using a set of classifiers in a pairwise manner on the components of the mixture model, and how this approach can be used to estimate unknown coefficients of the model, such as the signal contribution.
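The coefficient-estimation step can be sketched numerically. In the toy below, an exact pairwise signal-to-background density ratio stands in for the output of a calibrated pairwise classifier, and the unknown signal fraction is recovered by scanning a mixture log-likelihood that depends on the data only through that ratio (Gaussian components and all values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
# "signal" N(2, 1) present with unknown fraction w_true in a N(0, 1) "background"
w_true, n = 0.25, 20000
sig = rng.random(n) < w_true
x = np.where(sig, rng.normal(2.0, 1.0, n), rng.normal(0.0, 1.0, n))

# pairwise signal-vs-background density ratio; exact here, but in practice
# this is where a calibrated pairwise classifier's output would enter
r = np.exp(2.0 * x - 2.0)   # log p_sig(x)/p_bkg(x) = 2x - 2 for these Gaussians

# mixture log-likelihood up to a w-independent constant:
# log p(x|w) = log p_bkg(x) + log(1 - w + w * r(x))
ws = np.linspace(0.0, 1.0, 1001)
ll = np.array([np.log(1.0 - w + w * r).sum() for w in ws])
w_hat = ws[ll.argmax()]     # recovered signal fraction
```

The background density cancels from the scan entirely, which is why pairwise ratios are all that is needed to fit the mixture coefficients.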

Theoretical grounding for estimation in conditional independence multivariate finite mixture models

For the nonparametric estimation of multivariate finite mixture models with the conditional independence assumption, we propose a new formulation of the objective function in terms of…