Holistic Generalized Linear Models

@article{Schwendinger2022HolisticGL,
  title={Holistic Generalized Linear Models},
  author={Benjamin Schwendinger and Florian Schwendinger and Laura Vana},
  journal={ArXiv},
  year={2022},
  volume={abs/2205.15447}
}
Holistic linear regression extends the classical best subset selection problem by adding constraints designed to improve the model quality. These constraints include sparsity-inducing constraints, sign-coherence constraints, and linear constraints. The R package holiglm provides functionality to model and fit holistic generalized linear models. By making use of state-of-the-art conic mixed-integer solvers, the package can reliably solve GLMs for Gaussian, binomial and Poisson…
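
As a quick illustration of the modeling workflow described in the abstract, here is a minimal sketch that fits a sparsity-constrained logistic regression, assuming the hglm() fitting function and the k_max() constraint constructor exported by holiglm (a conic mixed-integer solver must be available through an ROI plugin):

    ## Minimal sketch, assuming holiglm's hglm() and k_max(); simulated data.
    library(holiglm)
    set.seed(1)
    d <- data.frame(matrix(rnorm(100 * 5), 100, 5))   # covariates X1..X5
    d$y <- rbinom(100, 1, plogis(d$X1 - 2 * d$X2))    # binary response
    ## Logistic regression under a best-subset constraint: at most 2 of
    ## the 5 coefficients may be nonzero.
    fit <- hglm(y ~ ., family = binomial(), data = d,
                constraints = k_max(2))
    coef(fit)

Other constraint constructors, e.g. for sign coherence or general linear constraints on the coefficients, can be supplied through the same constraints argument according to the package documentation.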


References

Showing 1-10 of 35 references.

Integer constraints for enhancing interpretability in linear regression

The numerical experiments carried out on real and simulated datasets show that tightening the search space of some standard linear regression models by adding the constraints modelling (i) and/or (ii) helps to improve the sparsity and interpretability of the solutions while maintaining competitive predictive quality.

Scalable holistic linear regression

An Algorithmic Approach to Linear Regression

This work presents an algorithmic approach in which the desirable properties are modeled as constraints and as penalties in the objective function of a Mixed Integer Quadratic Optimization (MIQO) model.
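
For orientation, a standard big-M formulation of best subset selection as an MIQO, representative of this line of work though not necessarily the exact model used in the paper, is

    \min_{\beta \in \mathbb{R}^p,\ z \in \{0,1\}^p} \; \lVert y - X\beta \rVert_2^2
    \quad \text{s.t.} \quad -M z_j \le \beta_j \le M z_j, \qquad \sum_{j=1}^{p} z_j \le k.

Here z_j = 0 forces \beta_j = 0, so at most k covariates enter the model; the constant M must be chosen large enough not to cut off the optimal coefficients.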

On the fitting of generalized linear models with nonnegativity parameter constraints

We consider the problem of finding maximum likelihood estimates of a generalized linear model when some or all of the regression parameters are constrained to be nonnegative.
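
One way to compute such constrained maximum likelihood estimates, sketched here with the CVXR package cited below rather than the algorithm of this reference, is to hand the negative log-likelihood and the nonnegativity constraints directly to a convex solver:

    ## Sketch: Poisson GLM maximum likelihood with beta >= 0, via CVXR.
    library(CVXR)
    set.seed(1)
    X <- matrix(rnorm(200 * 3), 200, 3)
    y <- rpois(200, lambda = exp(drop(X %*% c(0.5, 0, 1))))
    beta <- Variable(3)
    ## Poisson negative log-likelihood under a log link, dropping the
    ## constant sum(lfactorial(y)) term.
    negloglik <- sum(exp(X %*% beta)) - t(y) %*% (X %*% beta)
    fit <- solve(Problem(Minimize(negloglik), list(beta >= 0)))
    fit$getValue(beta)   # MLE restricted to the nonnegative orthant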

Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization

It is argued further that in specific cases NNLS may have a better $\ell_{\infty}$-rate in estimation, and hence also advantages with respect to support recovery when combined with thresholding; from a practical point of view, NNLS does not depend on a regularization parameter and is hence easier to use.
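
For context, NNLS is available in R through the nnls package, an implementation of the Lawson-Hanson algorithm from the book referenced next; a small sketch:

    ## Sketch: non-negative least squares with the CRAN package "nnls".
    library(nnls)
    set.seed(1)
    X <- matrix(rnorm(100 * 5), 100, 5)
    y <- drop(X %*% c(2, 0, 1, 0, 3)) + rnorm(100, sd = 0.1)
    fit <- nnls(X, y)
    fit$x   # coefficient estimates, all constrained to be >= 0

Note that no regularization parameter appears in the call, which is exactly the practical advantage highlighted above.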

Solving least squares problems

Since the lm function provides many features, it is rather complicated, so the simpler lsfit function, which computes only the coefficient estimates and the residuals, is used as a model instead.
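
Both functions ship with base R, so the contrast is easy to reproduce:

    ## lsfit() returns a plain list with coefficients and residuals,
    ## while lm() builds a full model object.
    set.seed(1)
    x <- 1:10
    y <- 2 + 3 * x + rnorm(10)
    ls_out <- lsfit(x, y)   # adds an intercept by default
    ls_out$coefficients
    coef(lm(y ~ x))         # same estimates via the richer lm() interface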

Regression Shrinkage and Selection via the Lasso

A new method for estimation in linear models, called the lasso, is proposed; it minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant.
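
The lasso is not part of holiglm itself; in R it is commonly fit with the glmnet package, used here only to make the constrained formulation above concrete:

    ## Sketch: lasso via glmnet (alpha = 1 selects the pure lasso penalty).
    library(glmnet)
    set.seed(1)
    X <- matrix(rnorm(50 * 10), 50, 10)
    y <- X[, 1] - 2 * X[, 2] + rnorm(50)
    fit <- cv.glmnet(X, y, alpha = 1)   # cross-validated penalty level
    coef(fit, s = "lambda.min")         # sparse coefficient vector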

A comparison of optimization solvers for log binomial regression including conic programming

Conic optimizers emerge as the preferred choice due to their reliability, speed, and freedom from hyperparameter tuning, which allows problematic cases to be identified and excluded.

CVXR: An R Package for Disciplined Convex Optimization

This work provides an object-oriented modeling language for convex optimization, similar to CVX, CVXPY, YALMIP, and Convex.jl, and applies signed disciplined convex programming (DCP) to verify the problem's convexity.
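
A short sketch of that modeling style, restating the NNLS problem from above in CVXR's standard interface (any installed conic solver is used under the hood):

    ## Sketch: non-negative least squares written declaratively in CVXR.
    library(CVXR)
    set.seed(1)
    X <- matrix(rnorm(100 * 5), 100, 5)
    y <- drop(X %*% c(2, 0, 1, 0, 3)) + rnorm(100, sd = 0.1)
    beta <- Variable(5)
    prob <- Problem(Minimize(sum_squares(y - X %*% beta)),
                    constraints = list(beta >= 0))
    result <- solve(prob)
    result$getValue(beta)   # matches nnls() up to solver tolerance

DCP verifies convexity symbolically before the problem reaches the solver, which is what makes this declarative style safe to use.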

ROI: An Extensible R Optimization Infrastructure

The R Optimization Infrastructure is introduced, which provides an extensible infrastructure to model linear, quadratic, conic and general nonlinear optimization problems in a consistent way and administers many different solvers, reformulations, problem collections and functions to read and write optimization problems in various formats.
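
A minimal sketch of the ROI interface, assuming the ROI.plugin.glpk solver plugin is installed and registered:

    ## Sketch: a tiny linear program in ROI.
    library(ROI)
    lp <- OP(objective   = L_objective(c(3, 1)),   # maximize 3x + y
             constraints = L_constraint(L   = rbind(c(1, 2),
                                                    c(3, 1)),
                                        dir = c("<=", "<="),
                                        rhs = c(4, 6)),
             maximum = TRUE)
    sol <- ROI_solve(lp, solver = "glpk")
    solution(sol)   # optimal decision variables

Quadratic (Q_objective) and conic (C_constraint) components can be supplied through the same OP() constructor, which is how conic problems such as the GLM formulations discussed in this paper can be passed to mixed-integer conic solvers.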