Tree-Guided Group Lasso for Multi-Task Regression with Structured Sparsity

Abstract

We consider the problem of learning a sparse multi-task regression in which the structure of the outputs can be represented as a tree, with leaf nodes as outputs and internal nodes as clusters of the outputs at multiple granularities. Our goal is to recover the common set of relevant inputs for each output cluster. Assuming that the tree structure is available as prior knowledge, we formulate this problem as a new multi-task regularized regression called tree-guided group lasso. Our structured regularization is based on a group-lasso penalty, where groups are defined with respect to the tree structure. We describe a systematic weighting scheme for the groups in the penalty such that each output variable is penalized in a balanced manner even when the groups overlap. We present an efficient optimization method that can handle large-scale problems. Using simulated and yeast datasets, we demonstrate that our method achieves superior performance in terms of both prediction error and recovery of true sparsity patterns compared to other methods for multi-task learning.
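The penalty described above can be sketched in a few lines of code. The following is a minimal illustration, not the paper's implementation: each node v of the output tree contributes a group G_v of task indices with a weight w_v, and the penalty sums, over every input j and every group, the weighted L2 norm of the coefficients of input j restricted to that group. The function name, the toy tree, and the weights below are hypothetical; the paper derives a specific weighting scheme that balances overlapping groups, which is not reproduced here.

```python
import numpy as np

def tree_group_lasso_penalty(B, groups):
    """Illustrative tree-guided group-lasso penalty (hypothetical helper).

    B      : (num_inputs x num_tasks) coefficient matrix.
    groups : list of (weight, task_indices) pairs, one per tree node.
    Returns sum_j sum_v  w_v * || B[j, G_v] ||_2.
    """
    penalty = 0.0
    for w, idx in groups:
        for j in range(B.shape[0]):        # one group norm per input row j
            penalty += w * np.linalg.norm(B[j, idx])
    return penalty

# Toy tree over 3 outputs: leaves {0}, {1}, {2} and an internal node {0, 1}
# grouping a correlated pair of tasks. Weights are illustrative only.
groups = [(0.5, [0]), (0.5, [1]), (1.0, [2]), (0.5, [0, 1])]
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])
print(tree_group_lasso_penalty(B, groups))  # -> 3.0
```

Because the internal-node group {0, 1} shares a single L2 norm across both tasks, inputs relevant to the whole cluster are selected jointly, while the leaf groups still allow task-specific sparsity.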

7 Figures and Tables

341 Citations

Semantic Scholar estimates that this publication has 341 citations based on the available data.

See our FAQ for additional information.

Cite this paper

@inproceedings{Kim2010TreeGuidedGL,
  title     = {Tree-Guided Group Lasso for Multi-Task Regression with Structured Sparsity},
  author    = {Seyoung Kim and Eric P. Xing},
  booktitle = {ICML},
  year      = {2010}
}