Corpus ID: 52034442

Penalized polytomous ordinal logistic regression using cumulative logits. Application to network inference of zero-inflated variables

Aurélie Deveau, Anne Gégout-Petit, Clémence Karmann
arXiv: Applications
We consider the problem of variable selection when the response is ordinal, that is, an ordered categorical variable. In particular, we are interested in selecting the quantitative explanatory variables linked with the ordinal response, and we want to determine which predictors are relevant. In this framework, we use the polytomous ordinal logistic regression model with cumulative logits, which generalizes logistic regression. We then introduce the Lasso estimation of the…
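The two ingredients the abstract names, cumulative-logit (proportional-odds) probabilities and an L1-penalized likelihood, can be sketched in a few lines. This is not the authors' implementation; the function names are illustrative, and leaving the thresholds unpenalized is an assumption about the setup:

```python
import math

def cumulative_logit_probs(x, beta, thetas):
    """Category probabilities under the cumulative-logit model:
    P(Y <= k | x) = sigmoid(theta_k - x . beta), with theta_1 < ... < theta_{K-1}.
    Returns the K category probabilities for one observation x."""
    eta = sum(xj * bj for xj, bj in zip(x, beta))
    cdf = [1.0 / (1.0 + math.exp(-(t - eta))) for t in thetas] + [1.0]
    return [cdf[0]] + [cdf[k] - cdf[k - 1] for k in range(1, len(cdf))]

def penalized_nll(X, y, beta, thetas, lam):
    """Negative log-likelihood plus a Lasso (L1) penalty on the slope
    coefficients beta; the thresholds thetas are left unpenalized
    (an assumption for this sketch)."""
    nll = -sum(math.log(cumulative_logit_probs(x, beta, thetas)[yi])
               for x, yi in zip(X, y))
    return nll + lam * sum(abs(b) for b in beta)
```

Because the same slope vector `beta` appears in every cumulative logit, a variable selected by the penalty is selected for the whole ordinal response at once, which is what makes this model convenient for variable selection.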



Genome-wide association analysis by lasso penalized logistic regression

The performance of lasso penalized logistic regression in case-control disease gene mapping with a large number of SNP (single nucleotide polymorphism) predictors is evaluated; the coeliac disease results replicate previous SNP findings and shed light on possible interactions among the SNPs.

The analysis of ordered categorical data: An overview and a survey of recent developments

This article reviews methodologies used for analyzing ordered categorical (ordinal) response variables. We begin by surveying models for data with a single ordinal response variable. We also survey

Regression Models for Ordinal Data

A general class of regression models for ordinal data is developed and discussed. These models utilize the ordinal nature of the data by describing various modes of stochastic ordering and

Regression Shrinkage and Selection via the Lasso

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute value of the coefficients being less than a constant, is proposed.
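The constrained formulation quoted above is equivalent to the familiar penalized form, which admits a simple cyclic coordinate-descent solver built on soft-thresholding. A minimal sketch, not the paper's algorithm, assuming centred, standardized predictors:

```python
def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent:
    minimize (1/(2n)) * ||y - X beta||^2 + lam * ||beta||_1."""
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # partial residual excluding predictor j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            z = sum(X[i][j] * r[i] for i in range(n)) / n
            beta[j] = soft_threshold(z, lam) / (sum(X[i][j] ** 2 for i in range(n)) / n)
    return beta
```

When `lam` exceeds the largest (scaled) correlation between a predictor and the response, its coefficient is set exactly to zero, which is the mechanism behind the lasso's variable-selection behaviour.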


This paper looks at the error rates and power of some multi-stage regression methods and considers three screening methods: the lasso, marginal regression, and forward stepwise regression.

A knockoff filter for high-dimensional selective inference

It is proved that the high-dimensional knockoff procedure 'discovers' important variables as well as the directions (signs) of their effects, in such a way that the expected proportion of wrongly chosen signs is below the user-specified level.

Ordinal Graphical Models: A Tale of Two Approaches

The theoretical developments allow us to provide correspondingly two classes of estimators that are not only computationally efficient but also have strong statistical guarantees.

High-dimensional graphs and variable selection with the Lasso

It is shown that neighborhood selection with the Lasso is a computationally attractive alternative to standard covariance selection for sparse high-dimensional graphs and is hence equivalent to variable selection for Gaussian linear models.
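The neighbourhood-selection idea can be sketched as follows. This is not the authors' code: `_lasso` is a bare-bones coordinate-descent Lasso included only to make the sketch self-contained, and the OR rule for combining the two directed selections into an edge is one of the two variants discussed by Meinshausen and Bühlmann:

```python
def _lasso(X, y, lam, n_iter=200):
    """Bare-bones cyclic coordinate descent for
    (1/(2n)) * ||y - X b||^2 + lam * ||b||_1 (illustrative helper)."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            r = [y[i] - sum(X[i][k] * b[k] for k in range(p) if k != j)
                 for i in range(n)]
            z = sum(X[i][j] * r[i] for i in range(n)) / n
            s = max(abs(z) - lam, 0.0) * (1.0 if z >= 0 else -1.0)
            b[j] = s / (sum(X[i][j] ** 2 for i in range(n)) / n)
    return b

def neighborhood_selection(X, lam):
    """Regress each variable on all the others with the Lasso; a nonzero
    coefficient proposes an edge (combined here with the OR rule)."""
    n, p = len(X), len(X[0])
    edges = set()
    for j in range(p):
        y = [row[j] for row in X]                              # node j as response
        Z = [[row[k] for k in range(p) if k != j] for row in X]  # all other nodes
        others = [k for k in range(p) if k != j]
        for idx, coef in enumerate(_lasso(Z, y, lam)):
            if coef != 0.0:
                edges.add(frozenset((j, others[idx])))
    return edges
```

The attraction noted in the blurb is computational: instead of one large covariance-selection problem, the graph is recovered from p independent Lasso regressions that can be run in parallel.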

Regression, Discrimination and Measurement Models for Ordered Categorical Variables

Regression models for the analysis of ordered categorical variables are discussed with particular reference to the logistic model. Maximum likelihood estimation procedures are established for three

Stability Approach to Regularization Selection (StARS) for High Dimensional Graphical Models

The method has a clear interpretation: the authors use the least amount of regularization that simultaneously makes a graph sparse and replicable under random sampling, which requires essentially no conditions.