• Corpus ID: 244102876

# Multi-task Learning for Compositional Data via Sparse Network Lasso

@inproceedings{Okazaki2021MultitaskLF,
title={Multi-task Learning for Compositional Data via Sparse Network Lasso},
author={Akira Okazaki and Shuichi Kawano},
year={2021}
}
• Published 12 November 2021
• Computer Science
The network lasso enables us to construct a model for each sample, a setting known as multi-task learning. Existing multi-task learning methods cannot be applied to compositional data because of its intrinsic properties. In this paper, we propose a multi-task learning method for compositional data using a sparse network lasso. We focus on a symmetric form of the log-contrast model, a regression model with compositional covariates. The effectiveness of the proposed method is shown…
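The abstract refers to the symmetric form of the log-contrast model. For context, a standard formulation of linear log-contrast regression (a textbook sketch, not quoted from this paper's truncated abstract) log-transforms all parts of the composition and constrains the coefficients to sum to zero:

```latex
% Linear log-contrast model for a composition x_i in the simplex:
% the zero-sum constraint makes the model invariant to rescaling
% of the compositional covariates.
y_i = \sum_{j=1}^{p} \beta_j \log x_{ij} + \varepsilon_i,
\qquad \text{subject to} \quad \sum_{j=1}^{p} \beta_j = 0 .
```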
## 1 Citation

### Multi-Task Personalized Learning with Sparse Network Lasso

• Computer Science
IJCAI
• 2022
This paper proposes a novel multi-task personalized learning method, develops an alternating algorithm to solve the resulting optimization problem, and demonstrates the method's robustness and effectiveness through extensive experiments on synthetic and real-world datasets.

## References

SHOWING 1-10 OF 30 REFERENCES

### Variable selection in regression with compositional covariates

• Computer Science
• 2014
An l1 regularization method for the linear log-contrast model that respects the unique features of compositional data is proposed and its usefulness is illustrated by an application to a microbiome study relating human body mass index to gut microbiome composition.
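The method above combines an ℓ1 penalty with the zero-sum constraint of the log-contrast model. As a minimal sketch of how that constraint can be enforced, here is a plain least-squares version (the ℓ1 penalty is deliberately dropped so the solution stays closed-form via the KKT system; this is an illustration, not the referenced method):

```python
import numpy as np

def log_contrast_ls(X, y):
    """Least-squares fit of the linear log-contrast model.

    Solves min_b ||y - log(X) b||^2 subject to sum(b) = 0 by
    solving the KKT system of the equality-constrained problem.
    """
    Z = np.log(X)                      # log-transformed compositions
    n, p = Z.shape
    # KKT system: [2 Z'Z  1; 1' 0] [b; nu] = [2 Z'y; 0]
    A = np.zeros((p + 1, p + 1))
    A[:p, :p] = 2.0 * Z.T @ Z
    A[:p, p] = 1.0
    A[p, :p] = 1.0
    rhs = np.concatenate([2.0 * Z.T @ y, [0.0]])
    sol = np.linalg.solve(A, rhs)
    return sol[:p]

# Toy compositional design: strictly positive rows summing to one.
rng = np.random.default_rng(0)
W = rng.uniform(0.1, 1.0, size=(20, 4))
X = W / W.sum(axis=1, keepdims=True)
y = rng.normal(size=20)
b = log_contrast_ls(X, y)   # b satisfies sum(b) = 0 up to numerical precision
```

Adding the ℓ1 penalty, as the referenced paper does, requires an iterative solver (e.g. a proximal or ADMM scheme) rather than this closed form.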

### Regression Models for Compositional Data: General Log-Contrast Formulations, Proximal Optimization, and Microbiome Data Applications

• Computer Science
Statistics in Biosciences
• 2020
A general convex optimization model for linear log-contrast regression which includes many previous proposals as special cases is proposed and a proximal algorithm is introduced that solves the resulting constrained optimization problem exactly with rigorous convergence guarantees.

### FORMULA: FactORized MUlti-task LeArning for task discovery in personalized medical models

• Computer Science
SDM
• 2015
A novel approach called the FactORized MUlti-task LeArning model (Formula) learns a personalized model for each patient via sparse multi-task learning; it delivered superior predictive performance while the personalized models offered many useful medical insights.

### Localized Lasso for High-Dimensional Regression

• Computer Science
AISTATS
• 2017
The localized Lasso is introduced, which is suited for learning models that are both interpretable and have a high predictive power in problems with high dimensionality and small sample size, and a simple yet efficient iterative least-squares based optimization procedure is proposed.

### Exclusive Feature Learning on Arbitrary Structures via ℓ1,2-norm

• Computer Science
NIPS
• 2014
This paper proposes a new formulation of exclusive group LASSO, which brings out sparsity at intra-group level in the context of feature selection, and proposes an effective iteratively re-weighted algorithm to solve the corresponding optimization problem with rigorous convergence analysis.
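The exclusive group penalty described here sums, over groups, the squared intra-group ℓ1 norm, which encourages sparsity *within* each group rather than across groups. A minimal sketch (the group structure and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def exclusive_l12(w, groups):
    """Exclusive group-lasso penalty: sum over groups of the squared
    intra-group l1 norm, sum_g (sum_{j in g} |w_j|)^2."""
    return sum(np.abs(w[list(g)]).sum() ** 2 for g in groups)

w = np.array([1.0, -2.0, 0.0, 3.0])
groups = [[0, 1], [2, 3]]
val = exclusive_l12(w, groups)   # (1 + 2)^2 + (0 + 3)^2 = 18
```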

### Network Lasso: Clustering and Optimization in Large Graphs

• Computer Science
KDD
• 2015
The network lasso is introduced, a generalization of the group lasso to a network setting that allows for simultaneous clustering and optimization on graphs and an algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve this problem in a distributed and scalable manner.
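The network lasso penalty, λ Σ_{(j,k)∈E} ‖w_j − w_k‖₂, is typically handled inside ADMM through a group (block) soft-thresholding step applied to edge differences. A hedged sketch of that building block (a simplified stand-in for the full edge update, which couples both endpoints of an edge):

```python
import numpy as np

def block_soft_threshold(v, t):
    """Proximal operator of t * ||.||_2: shrinks the vector toward
    zero, returning exactly zero when ||v|| <= t. Zeroed edge
    differences are what fuse neighboring models into clusters."""
    norm = np.linalg.norm(v)
    if norm <= t:
        return np.zeros_like(v)
    return (1.0 - t / norm) * v

d = block_soft_threshold(np.array([3.0, 4.0]), 2.5)  # norm 5 shrunk by factor 0.5
```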

### Personalized regression enables sample-specific pan-cancer analysis

• Computer Science
bioRxiv
• 2018
A novel regularizer is proposed for achieving patient-specific personalized estimation that uncovers sample-specific aberrations that are overlooked by population level methods, suggesting a promising new path for precision analysis of complex diseases such as cancer.

### Convex multi-task feature learning

• Computer Science
Machine Learning
• 2007
It is proved that the method for learning sparse representations shared across multiple tasks is equivalent to solving a convex optimization problem for which there is an iterative algorithm which converges to an optimal solution.

### Tree-aggregated predictive modeling of microbiome data

• Biology, Computer Science
bioRxiv
• 2020
A data-driven, parameter-free, and scalable tree-guided aggregation framework is proposed to associate microbial subcompositions with response variables of interest; the authors posit that the inferred aggregation levels provide highly interpretable taxon groupings that can help microbial ecologists gain insights into the structure and functioning of the underlying ecosystem of interest.

### Regression Analysis for Microbiome Compositional Data

• Mathematics
• 2016
One important problem in microbiome analysis is to identify the bacterial taxa that are associated with a response, where the microbiome data are summarized as the composition of the bacterial taxa…