
- Jelle J. Goeman, Sara A. van de Geer, Floor de Kort, Hans C. van Houwelingen
- Bioinformatics
- 2004

MOTIVATION
This paper presents a global test to be used for the analysis of microarray data. Using this test it can be determined whether the global expression pattern of a group of genes is significantly related to some clinical outcome of interest. Groups of genes may be any size from a single gene to all genes on the chip (e.g. known pathways, specific…
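The idea of testing whether a group of genes is jointly associated with an outcome can be illustrated with a generic permutation test (this is a sketch of the idea only, not the paper's actual score test; the sample sizes, seed, and signal structure below are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5                                  # 100 samples, a group of 5 genes
Xg = rng.standard_normal((n, p))               # expression of the gene group
y = Xg[:, 0] + 0.1 * rng.standard_normal(n)    # outcome driven by gene 0

def global_group_test(Xg, y, n_perm=2000, seed=1):
    """Permutation sketch of a global test: is the joint expression
    of the gene group associated with the (centered) outcome?"""
    prng = np.random.default_rng(seed)
    yc = y - y.mean()
    stat = float(np.sum((Xg.T @ yc) ** 2))     # squared group-outcome covariance
    more_extreme = sum(
        np.sum((Xg.T @ prng.permutation(yc)) ** 2) >= stat
        for _ in range(n_perm)
    )
    return (1 + more_extreme) / (1 + n_perm)

p_value = global_group_test(Xg, y)             # small: the group is associated
```

Permuting the outcome breaks any association with the group while preserving its marginal distribution, so the statistic's permutation distribution serves as the null reference.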

The group lasso is an extension of the lasso for variable selection on (predefined) groups of variables in linear regression models. The estimates have the attractive property of being invariant under groupwise orthogonal reparameterizations. We extend the group lasso to logistic regression models and present an efficient algorithm that is especially…
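The groupwise selection at the heart of the group lasso can be illustrated with its proximal operator: each group's coefficient block is shrunk toward zero as a unit and dropped entirely when its norm falls below the threshold. A minimal numpy sketch (the group sizes and threshold are illustrative, not from the paper):

```python
import numpy as np

def group_soft_threshold(beta, groups, lam):
    """Prox of the group-lasso penalty: shrink each group's coefficient
    block toward zero; zero it out entirely if its norm is <= lam."""
    out = np.zeros_like(beta, dtype=float)
    for g in groups:                       # g is an index array for one group
        norm = np.linalg.norm(beta[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * beta[g]
    return out

beta = np.array([3.0, 4.0, 0.3, 0.4])
groups = [np.array([0, 1]), np.array([2, 3])]
shrunk = group_soft_threshold(beta, groups, lam=1.0)
# first group (norm 5) is shrunk; second group (norm 0.5) is zeroed as a block
```

Because whole groups enter or leave together, the selected model respects the predefined grouping, which is what distinguishes this penalty from the ordinary lasso.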

We study the problem of estimating multiple linear regression equations for the purpose of both prediction and variable selection. Following recent work on multi-task learning (Argyriou et al., 2008), we assume that the regression vectors share the same sparsity pattern. This means that the set of relevant predictor variables is the same across the different…
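The shared-sparsity assumption can be pictured by stacking the tasks' coefficient vectors as columns of a matrix: predictor j is relevant only if row j is nonzero, and an ℓ2,1-type penalty keeps or drops each row jointly. A small numpy sketch of that row-wise proximal step (the matrix and threshold are illustrative):

```python
import numpy as np

def row_soft_threshold(B, lam):
    """Prox of the l2,1 penalty: each row of B (one predictor's
    coefficients across all tasks) is shrunk or dropped jointly."""
    norms = np.linalg.norm(B, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - lam / np.maximum(norms, 1e-12))
    return scale * B

# 3 predictors, 2 tasks: predictor 2 is weak in both tasks
B = np.array([[2.0, -2.0],
              [0.0,  3.0],
              [0.1,  0.1]])
S = row_soft_threshold(B, lam=0.5)
# row 2 is zeroed for every task, so the support is shared across tasks
```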

We propose a new sparsity-smoothness penalty for high-dimensional generalized additive models. The combination of sparsity and smoothness is crucial both for the mathematical theory and for finite-sample performance. We present a computationally efficient algorithm, with provable numerical convergence properties, for optimizing the penalized likelihood…

We consider the problem of estimating a sparse linear regression vector β∗ under a Gaussian noise model, for the purpose of both prediction and model selection. We assume that prior knowledge is available on the sparsity pattern: the set of variables is partitioned into prescribed groups, only a few of which are relevant in the estimation process. This…

- Sara van de Geer
- 2007

We study high-dimensional generalized linear models and empirical risk minimization using the Lasso. An oracle inequality is presented under a so-called compatibility condition. Our aim is threefold: to prove a result announced in van de Geer (2007), to provide a simple proof with simple constants, and to separate the stochastic problem from the…
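The estimator studied here minimizes an empirical risk plus an ℓ1 penalty; for the squared-error special case this can be sketched with plain iterative soft-thresholding (ISTA) in numpy. The problem sizes, penalty level, and iteration count below are illustrative choices, not values from the paper:

```python
import numpy as np

def lasso_ista(X, y, lam, n_iter=500):
    """Minimize (1/2n)||y - Xb||^2 + lam*||b||_1 by iterative
    soft-thresholding (ISTA)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n          # Lipschitz constant of the gradient
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y) / n           # gradient of the empirical risk
        z = b - grad / L                       # gradient step
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # prox step
    return b

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
beta_true = np.zeros(10)
beta_true[:2] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.standard_normal(200)
b_hat = lasso_ista(X, y, lam=0.1)
# b_hat is sparse: the eight irrelevant coefficients are driven to zero
```

Oracle inequalities of the kind described in the abstract bound how far such a penalized estimate can be from the best sparse approximation, under conditions on the design like the compatibility condition.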

We would like to begin by congratulating the authors on their fine paper. Handling highly correlated variables is one of the most important issues facing practitioners in high-dimensional regression problems, and in some ways it is surprising that it has not received more attention up to this point. The authors have made substantial progress towards…

We propose an ℓ1-penalized estimation procedure for high-dimensional linear mixed-effects models. The models are useful whenever there is a grouping structure among high-dimensional observations, that is, for clustered data. We prove a consistency and an oracle optimality result and we develop an algorithm with provable numerical convergence. Furthermore, we…

We show that the two-stage adaptive Lasso procedure (Zou, 2006) is consistent for high-dimensional model selection in linear and Gaussian graphical models. Our conditions for consistency cover more general situations than those in previous work: we prove that restricted eigenvalue conditions (Bickel et al., 2008) are also sufficient for sparse…
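The two-stage procedure can be sketched as: fit an initial estimator, form weights w_j = 1/|β̂_j|, then solve a weighted Lasso, so that small initial estimates receive heavy penalties and are removed while large ones are barely shrunk. For an orthonormal design both stages have closed forms, which keeps the sketch short; the design, noise level, and penalty below are illustrative assumptions, and using OLS as the initial estimator is one common low-dimensional choice:

```python
import numpy as np

rng = np.random.default_rng(1)
# Orthonormal design: X.T @ X = I, so both stages have closed forms
X, _ = np.linalg.qr(rng.standard_normal((100, 5)))
beta_true = np.array([5.0, 0.0, 0.0, 2.0, 0.0])
y = X @ beta_true + 0.05 * rng.standard_normal(100)

# Stage 1: initial (OLS) estimate -> adaptive weights
b_ols = X.T @ y                     # OLS under orthonormal columns
w = 1.0 / np.abs(b_ols)            # heavy penalty where the pilot fit is small

# Stage 2: weighted Lasso; closed form for orthonormal X is
# coefficient-wise soft-thresholding at lam * w_j
lam = 0.1
b_ada = np.sign(b_ols) * np.maximum(np.abs(b_ols) - lam * w, 0.0)
# noise coefficients vanish; large true coefficients survive nearly unshrunk
```

The data-driven weights are what give the two-stage procedure its improved selection behavior relative to a single Lasso fit with one global penalty.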

I would like to congratulate the authors for this very interesting contribution. The generalization of ℓ1-penalized linear regression to the “mixture-of-Gaussian-regressions” model raises very interesting questions from both theoretical and algorithmic points of view, and the paper offers a variety of powerful tools to attack both problems. In this…