• Corpus ID: 51918599

A Cluster Elastic Net for Multivariate Regression

@article{PriceSherwoodClusterElasticNet,
  title={A Cluster Elastic Net for Multivariate Regression},
  author={Bradley S. Price and Ben Sherwood},
  journal={J. Mach. Learn. Res.},
}
We propose a method for estimating coefficients in multivariate regression when there is a clustering structure to the response variables. The proposed method includes a fusion penalty, to shrink the difference in fitted values from responses in the same cluster, and an L1 penalty for simultaneous variable selection and estimation. The method can be used when the grouping structure of the response variables is known or unknown. When the clustering structure is unknown the method will… 
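The abstract describes an objective with three pieces: a squared-error loss, a fusion penalty shrinking differences in fitted values for responses in the same cluster, and an L1 penalty for sparsity. A minimal numpy sketch of that kind of objective is below; the function name, argument layout, and equal weighting of cluster pairs are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cluster_elastic_net_objective(B, X, Y, clusters, lam1, lam2):
    """Hypothetical sketch of a cluster-elastic-net-style objective.

    B        : (p, q) coefficient matrix
    X        : (n, p) predictors; Y : (n, q) responses
    clusters : length-q array of cluster labels for the responses
    lam1     : weight on the L1 (sparsity) penalty
    lam2     : weight on the fusion penalty on fitted values
    """
    fitted = X @ B                              # (n, q) fitted values
    loss = 0.5 * np.sum((Y - fitted) ** 2)      # squared-error loss
    l1 = lam1 * np.sum(np.abs(B))               # sparsity penalty
    fusion = 0.0
    # shrink differences in fitted values for responses sharing a cluster
    for c in np.unique(clusters):
        idx = np.where(clusters == c)[0]
        for pos, i in enumerate(idx):
            for j in idx[pos + 1:]:
                fusion += np.sum((fitted[:, i] - fitted[:, j]) ** 2)
    return loss + l1 + lam2 * fusion
```

Evaluating the objective like this is useful for checking a solver's progress; the paper's actual estimator minimizes a penalty of this general form over B.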

Wilcoxon-type Multivariate Cluster Elastic Net

We propose a method for high dimensional multivariate regression that is robust to random error distributions that are heavy-tailed or contain outliers, while preserving estimation accuracy in normal

On the Use of Minimum Penalties in Statistical Learning

Modern multivariate machine learning and statistical methodologies estimate parameters of interest while leveraging prior knowledge of the association between outcome variables. The methods that do

New insights for the multivariate square-root lasso

It is shown that the multivariate square-root lasso can outperform more computationally intensive methods that require explicit estimation of the error precision matrix and is the solution to a convex optimization problem.

Mixtures of multivariate generalized linear models with overlapping clusters

This work aims to define a mixture of generalized linear models with overlapping clusters of units, which crucially involves an overlap function that maps the coefficients of the parent clusters into the coefficients of the multiple-allocation units.

Estimating Multiple Precision Matrices with Cluster Fusion Regularization

This article proposes a penalized likelihood framework for estimating multiple precision matrices from different classes and proposes sparse and non-sparse estimators, both of which require solving a non-convex optimization problem.

Detecting clusters in multivariate response regression

Current state-of-the-art methods are explored, a framework to better understand methods that utilize or detect clusters of responses is presented, and insights on the computational challenges associated with this framework are provided.

Smooth and locally sparse estimation for multiple-output functional linear regression

A functional linear regression model that explicitly incorporates the interconnections among the responses is considered; it shows excellent numerical performance in the estimation of coefficient functions, especially when the coefficient functions are the same for all responses.

Measurement of Economics to Scale in Corporates of Tehran Stock Exchange

One of the most important factors in economic growth is the increased efficiency of manufacturing sectors. Thus, it is necessary to review and measure the efficiency of business units



The Cluster Elastic Net for High-Dimensional Regression With Unknown Variable Grouping

This work proposes the cluster elastic net, which selectively shrinks the coefficients for such variables toward each other, rather than toward the origin, in the high-dimensional regression setting.

Model selection and estimation in regression with grouped variables

Summary.  We consider the problem of selecting grouped variables (factors) for accurate prediction in regression. Such a problem arises naturally in many practical situations with the multifactor

Sparse Multivariate Regression With Covariance Estimation

  • Adam J. Rothman, E. Levina, Ji Zhu
  • Computer Science
  • Journal of Computational and Graphical Statistics
  • 2010
It is shown that the proposed method outperforms relevant competitors when the responses are highly correlated, and an efficient optimization algorithm and a fast approximation are developed for computing MRCE.

The Sparse Laplacian Shrinkage Estimator for High-Dimensional Regression.

It is shown that the SLS possesses an oracle property in the sense that it is selection consistent and equal to the oracle Laplacian shrinkage estimator with high probability in sparse, high-dimensional settings with p ≫ n under reasonable conditions.

Regression Shrinkage and Selection via the Lasso

A new method for estimation in linear models called the lasso, which minimizes the residual sum of squares subject to the sum of the absolute values of the coefficients being less than a constant, is proposed.
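In the special case of an orthonormal design the lasso has a closed form: soft-threshold the OLS coefficients. The short numpy sketch below shows that special case only; general designs require an iterative solver such as coordinate descent or LARS, and the function names here are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the building block of lasso solvers."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_orthonormal(X, y, lam):
    """Lasso solution assuming an orthonormal design (X.T @ X = I),
    where minimizing 0.5*||y - X b||^2 + lam*||b||_1 reduces to
    soft-thresholding the OLS coefficients. Illustrative sketch only."""
    ols = X.T @ y        # OLS estimate when X.T @ X = I
    return soft_threshold(ols, lam)
```

For example, with an identity design and OLS coefficients (3, -0.5, 2), a threshold of 1 zeroes out the small coefficient and shrinks the others toward zero.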

A significance test for graph‐constrained estimation

A new inference framework is presented, called the Grace test, which produces coefficient estimates and corresponding p‐values by incorporating the external graph information and is shown to be asymptotically more powerful than similar tests that ignore the external information.

Indirect multivariate response linear regression

We propose a class of estimators of the multivariate response linear regression coefficient matrix that exploits the assumption that the response and predictors have a joint multivariate normal

Regularization and variable selection via the elastic net

It is shown that the elastic net often outperforms the lasso, while enjoying a similar sparsity of representation, and an algorithm called LARS-EN is proposed for computing elastic net regularization paths efficiently, much like algorithm LARS does for the lasso.
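The elastic net combines the lasso's L1 penalty with an L2 ridge penalty. As with the lasso, the orthonormal-design case has a closed form, which the sketch below illustrates under the parameterization 0.5*||y - Xb||^2 + lam1*||b||_1 + 0.5*lam2*||b||^2 (an assumption for illustration; real fits use LARS-EN or coordinate descent).

```python
import numpy as np

def elastic_net_orthonormal(X, y, lam1, lam2):
    """Elastic-net solution assuming an orthonormal design (X.T @ X = I):
    soft-threshold the OLS coefficients by lam1, then shrink the result
    by 1 / (1 + lam2). Illustrative sketch, not a general solver."""
    ols = X.T @ y
    thresholded = np.sign(ols) * np.maximum(np.abs(ols) - lam1, 0.0)
    return thresholded / (1.0 + lam2)
```

The extra 1/(1 + lam2) shrinkage is what lets the elastic net keep groups of correlated predictors together instead of arbitrarily selecting one, while the thresholding step preserves lasso-like sparsity.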