
The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a computationally attractive alternative to standard…
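The idea described in this abstract can be sketched in a few lines: regress each variable on all others with the Lasso and read candidate graph edges off the nonzero coefficients. This is only an illustration of the approach, not the paper's implementation; the function name, penalty value, and the AND rule for symmetrising edges are all illustrative choices.

```python
import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.05):
    """Estimate graph edges by per-node Lasso regressions (sketch)."""
    n, p = X.shape
    adj = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        adj[j, others] = coef != 0
    # AND rule: keep an edge only if both node-wise regressions select it
    return adj & adj.T

# Toy chain graph X0 - X1 - X2, plus an independent X3
rng = np.random.default_rng(0)
x0 = rng.normal(size=500)
x1 = x0 + 0.3 * rng.normal(size=500)
x2 = x1 + 0.3 * rng.normal(size=500)
x3 = rng.normal(size=500)
X = np.column_stack([x0, x1, x2, x3])
A = neighborhood_selection(X)
```

On the chain example, the strong edges (0, 1) and (1, 2) are recovered; the resulting adjacency matrix is symmetric by construction.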

The Lasso (Tibshirani, 1996) is an attractive technique for regularization and variable selection for high-dimensional data, where the number of predictor variables p is potentially much larger than the number of samples n. However, it was recently discovered (Zhao and Yu, 2006; Zou, 2005; Meinshausen and Bühlmann, 2006) that the sparsity pattern of the…

- Malte Meinshausen, Nicolai Meinshausen, +5 authors Myles R Allen
- Nature
- 2009

More than 100 countries have adopted a global warming limit of 2 degrees C or below (relative to pre-industrial levels) as a guiding principle for mitigation efforts to reduce climate change risks, impacts and damages. However, the greenhouse gas (GHG) emissions corresponding to a specified maximum warming are poorly known owing to uncertainties in the…

- Nicolai Meinshausen
- Journal of Machine Learning Research
- 2006

Random Forests were introduced as a Machine Learning tool in Breiman (2001) and have since proven to be very popular and powerful for high-dimensional regression and classification. For regression, Random Forests give an accurate approximation of the conditional mean of a response variable. It is shown here that Random Forests provide information…
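The gist — that a forest carries distributional information beyond the conditional mean — can be illustrated with a rough pooling scheme: for a query point, collect the training responses that land in the same leaves across all trees and take empirical quantiles of that pool. This is a simplified sketch of the idea, not the paper's weighted estimator; all names and parameter values are illustrative.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(2000, 1))
y = X[:, 0] + rng.normal(scale=0.5, size=2000)

forest = RandomForestRegressor(n_estimators=50, min_samples_leaf=20,
                               random_state=0).fit(X, y)
train_leaves = forest.apply(X)          # leaf index per (sample, tree)

def conditional_quantile(x, q):
    """Pool training responses sharing the query's leaves; take a quantile."""
    leaves = forest.apply(x.reshape(1, -1))[0]
    pooled = np.concatenate([y[train_leaves[:, t] == leaves[t]]
                             for t in range(len(leaves))])
    return np.quantile(pooled, q)

x_query = np.array([0.0])
q10 = conditional_quantile(x_query, 0.1)
q90 = conditional_quantile(x_query, 0.9)
```

At x = 0 the conditional distribution of y is roughly N(0, 0.5²), so the pooled 10% and 90% quantiles straddle zero, which the forest mean alone could not reveal.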

- Myles R Allen, David J Frame, +4 authors Nicolai Meinshausen
- Nature
- 2009

Global efforts to mitigate climate change are guided by projections of future temperatures. But the eventual equilibrium global mean temperature associated with a given stabilization level of atmospheric greenhouse gas concentrations remains uncertain, complicating the setting of stabilization targets to avoid potentially dangerous levels of global warming.…

- Nicolai Meinshausen
- Computational Statistics & Data Analysis
- 2007

The Lasso is an attractive regularisation method for high dimensional regression. It combines variable selection with an efficient computational procedure. However, the rate of convergence of the Lasso is slow for some sparse high dimensional data, where the number of predictor variables is growing fast with the number of observations. Moreover, many noise…
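One remedy in this spirit is a two-stage fit that decouples selection from shrinkage: select variables with one penalty, then re-estimate only the selected variables under a much weaker penalty. The sketch below is an illustration of that general idea under an assumed toy design, not a faithful reproduction of the paper's estimator; penalty values are arbitrary.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(4)
n, p = 100, 200                       # more predictors than samples
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:2] = 3.0                        # two strong signals
y = X @ beta + rng.normal(size=n)

# Stage 1: heavy penalty for variable selection (biases coefficients down)
stage1 = Lasso(alpha=0.5).fit(X, y)
sel = np.flatnonzero(stage1.coef_)

# Stage 2: refit the selected variables with much weaker shrinkage
stage2 = Lasso(alpha=0.01).fit(X[:, sel], y)
relaxed = np.zeros(p)
relaxed[sel] = stage2.coef_
```

The stage-1 estimates of the true signals are noticeably shrunk toward zero; the stage-2 refit recovers coefficients much closer to the true value of 3 while keeping the sparse support.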

Assigning significance in high-dimensional regression is challenging. Most computationally efficient selection algorithms cannot guard against inclusion of noise variables. Asymptotically valid p-values are not available. An exception is a recent proposal by Wasserman and Roeder (2008) which splits the data into two parts. The number of variables is then…
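The single-split scheme the abstract refers to can be sketched as follows: screen variables with the Lasso on one half of the data, then compute classical OLS p-values for the selected set on the held-out half, where they are valid because selection and inference use disjoint samples. This is a minimal sketch under an assumed toy design; the penalty and the Bonferroni correction over all p variables are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n, p = 200, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:3] = 2.0                        # three true signals
y = X @ beta + rng.normal(size=n)

# Screening half: select variables with the Lasso
half = n // 2
sel = np.flatnonzero(Lasso(alpha=0.2).fit(X[:half], y[:half]).coef_)

# Inference half: ordinary least squares restricted to the selected set
Xs, yh = X[half:, sel], y[half:]
coef, *_ = np.linalg.lstsq(Xs, yh, rcond=None)
resid = yh - Xs @ coef
dof = Xs.shape[0] - Xs.shape[1]
sigma2 = resid @ resid / dof
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xs.T @ Xs)))
pvals = 2 * stats.t.sf(np.abs(coef / se), dof)
pvals_adj = np.minimum(pvals * p, 1.0)   # Bonferroni over all p variables
```

With strong signals the three true variables survive screening and their adjusted p-values on the held-out half are far below any conventional level.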

We discuss Monte Carlo methods for valuing options with multiple exercise features in discrete time. By extending the recently developed duality ideas for American option pricing we show how to obtain estimates on the prices of such options using Monte Carlo techniques. We prove convergence of our approach and estimate the error. The methods are applied to…

Inspired by the success of the Lasso for regression analysis (Tibshirani, 1996), it seems attractive to estimate the graph of a multivariate normal distribution by ℓ1-norm penalised likelihood maximisation. The objective function is convex and the graph estimator can thus be computed efficiently, even for very large graphs. However, we show in this note…

A frequently encountered challenge in high-dimensional regression is the detection of relevant variables. Variable selection suffers from instability and the power to detect relevant variables is typically low if predictor variables are highly correlated. When taking the multiplicity of the testing problem into account, the power diminishes even further. To…