• Corpus ID: 4864964

Sparse Linear Isotonic Models

@inproceedings{Chen2018SparseLI,
  title={Sparse Linear Isotonic Models},
  author={Sheng Chen and Arindam Banerjee},
  booktitle={AISTATS},
  year={2018}
}
In machine learning and data mining, linear models have been widely used to model the response as parametric linear functions of the predictors. To relax such stringent assumptions made by parametric linear models, additive models consider the response to be a summation of unknown transformations applied on the predictors; in particular, additive isotonic models (AIMs) assume the unknown transformations to be monotone. In this paper, we introduce sparse linear isotonic models (SLIMs) for… 
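The monotone transformations at the heart of additive isotonic models are typically fit by isotonic regression. As an illustrative sketch (not the paper's algorithm), the classic pool-adjacent-violators algorithm (PAVA) computes the least-squares nondecreasing fit to a sequence; the function name below is an assumption for illustration:

```python
import numpy as np

def isotonic_fit(y):
    """Pool Adjacent Violators: least-squares nondecreasing fit to y.

    Maintains a stack of blocks (mean, weight); whenever a new value
    violates monotonicity, adjacent blocks are pooled into their
    weighted mean until the block means are nondecreasing.
    """
    means, weights = [], []
    for v in y:
        means.append(float(v))
        weights.append(1)
        # pool adjacent blocks while monotonicity is violated
        while len(means) > 1 and means[-2] > means[-1]:
            w = weights[-2] + weights[-1]
            means[-2] = (weights[-2] * means[-2] + weights[-1] * means[-1]) / w
            weights[-2] = w
            del means[-1], weights[-1]
    # expand pooled block means back to the original length
    return np.repeat(means, weights)
```

For example, `isotonic_fit([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean, yielding `[1, 2.5, 2.5, 4]`.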

Learning Nonlinear Mixtures: Identifiability and Algorithm

TLDR
This work proposes an identification criterion for a nonlinear mixture model that is well grounded in many real-world applications and offers identifiability guarantees; a practical implementation based on a judiciously designed neural network is proposed to realize the criterion.

Nonsmooth Sparsity Constrained Optimization via Penalty Alternating Direction Methods

TLDR
This paper revisits the Penalty Alternating Direction Method (PADM) for nonsmooth sparsity constrained optimization problems and shows that the PADM-BCD algorithm finds stronger stationary points of the optimization problem than previous methods.

Unsupervised Learning of Latent Structure from Linear and Nonlinear Measurements

TLDR
This dissertation presents a meta-modelling system that automates the labor-intensive, time-consuming, and expensive process of systematically cataloging individual components of a system.

References

SHOWING 1-10 OF 38 REFERENCES

Sparse additive models

TLDR
An algorithm for fitting the models is derived that is practical and effective even when the number of covariates is larger than the sample size, and empirical results show that sparse additive models can be effective for fitting sparse non-parametric models to high-dimensional data.

Semiparametric additive isotonic regression

LASSO Isotone for High-Dimensional Additive Isotonic Regression

TLDR
A new method for additive isotonic regression called LASSO Isotone (LISO) is presented, which adapts ideas from sparse linear modeling to additive isotonic regression; a numerical convergence result is given, and some of the method's properties are examined through simulations.

Additive Isotonic Models

TLDR
The isotonic transformations are chosen to minimize an explicit criterion, such as the negative log-likelihood, by an algorithm that optimizes one transformation at a time while adjusting for the current fitted values of the others, cycling until the criterion converges.
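The cyclic procedure described above can be sketched as a backfitting loop: each monotone component is refit by isotonic regression on the partial residuals while the others are held fixed. The sketch below is a generic illustration under a least-squares criterion, not the authors' implementation; the `pava` helper and all names are assumptions:

```python
import numpy as np

def pava(y):
    # Pool Adjacent Violators: least-squares nondecreasing fit
    means, weights = [], []
    for v in y:
        means.append(float(v))
        weights.append(1)
        while len(means) > 1 and means[-2] > means[-1]:
            w = weights[-2] + weights[-1]
            means[-2] = (weights[-2] * means[-2] + weights[-1] * means[-1]) / w
            weights[-2] = w
            del means[-1], weights[-1]
    return np.repeat(means, weights)

def backfit_isotonic(X, y, n_cycles=20):
    """Cycle over predictors, refitting each monotone component on the
    partial residual while the other components are held fixed."""
    n, p = X.shape
    fits = np.zeros((n, p))        # f_j(x_ij), stored per sample
    intercept = y.mean()
    for _ in range(n_cycles):
        for j in range(p):
            # partial residual: remove all components except the j-th
            resid = y - intercept - fits.sum(axis=1) + fits[:, j]
            order = np.argsort(X[:, j])
            g = pava(resid[order])
            fits[order, j] = g - g.mean()   # center for identifiability
    return intercept, fits
```

Centering each component after its update keeps the decomposition identifiable, since any constant can otherwise be shuffled between the intercept and the components.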

Regularized rank-based estimation of high-dimensional nonparanormal graphical models

TLDR
It is shown that the nonparanormal graphical model can be efficiently estimated by using a rank-based estimation scheme which does not require estimating these unknown transformation functions.

Additive isotone regression

This paper is about optimal estimation of the additive components of a nonparametric, additive isotone regression model. It is shown that, asymptotically up to first order, each additive component…

The Nonparanormal: Semiparametric Estimation of High Dimensional Undirected Graphs

TLDR
A method is derived for estimating the nonparanormal, the method's theoretical properties are studied, and it is shown that it works well in many examples.

Estimation with Norm Regularization

TLDR
This paper characterizes the restricted error set, establishes relations between error sets for the constrained and regularized problems, and presents an estimation error bound applicable to any norm.

High Dimensional Semiparametric Gaussian Copula Graphical Models

TLDR
It is proved that the nonparanormal SKEPTIC achieves the optimal parametric rates of convergence for both graph recovery and parameter estimation; this result suggests that nonparanormal graphical models can be used as a safe replacement for the popular Gaussian graphical models, even when the data are truly Gaussian.

Transelliptical Graphical Models

TLDR
A nonparametric rank-based regularization estimator is proposed which achieves the parametric rates of convergence for both graph recovery and parameter estimation and suggests that the extra robustness and flexibility obtained by the semiparametric transelliptical modeling incurs almost no efficiency loss.