
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a…
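The greedy criterion behind Forward Selection can be sketched in a few lines. The residual-sum-of-squares scoring, the function name, and the synthetic data below are illustrative assumptions, not the paper's own setup:

```python
import numpy as np

def forward_selection(X, y, k_max):
    """Greedily add, at each step, the covariate that most reduces the
    residual sum of squares of the least-squares fit."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    for _ in range(k_max):
        best_j, best_rss = None, np.inf
        for j in remaining:
            A = X[:, selected + [j]]
            beta = np.linalg.lstsq(A, y, rcond=None)[0]
            rss = np.sum((y - A @ beta) ** 2)
            if rss < best_rss:
                best_j, best_rss = j, rss
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

# Synthetic data: only columns 2 and 7 actually influence y
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
y = 3.0 * X[:, 2] - 2.0 * X[:, 7] + rng.standard_normal(100)
print(forward_selection(X, y, 2))
```

With a strong signal the greedy search recovers the two active covariates; the abstract's point is that selecting and then fitting on the same data biases downstream inference, which this sketch does nothing to correct.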

Microarrays are a novel technology that facilitates the simultaneous measurement of thousands of gene expression levels. A typical microarray experiment can produce millions of data points, raising serious problems of data reduction, and simultaneous inference. We consider one such experiment in which oligonucleotide arrays were employed to assess the…

Current scientific techniques in genomics and image processing routinely produce hypothesis testing problems with hundreds or thousands of cases to consider simultaneously. This poses new difficulties for the statistician, but also opens new opportunities. In particular, it allows empirical estimation of an appropriate null hypothesis. The empirical null…
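The empirical-null idea can be illustrated with a crude stand-in for central matching: fit a quadratic to the log histogram counts of the central z-values and read off the implied null mean and standard deviation. The simulated z-values, bin range, and mixture proportions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
# 95% "null" cases whose spread is wider than the theoretical N(0, 1),
# plus 5% genuinely non-null cases shifted to the right
z = np.concatenate([rng.normal(0.0, 1.5, 9500),
                    rng.normal(3.5, 1.0, 500)])

# Fit a quadratic to log counts over the central part of the histogram;
# for a normal density, log f(z) is exactly quadratic in z
counts, edges = np.histogram(z, bins=np.linspace(-2, 2, 21))
mids = 0.5 * (edges[:-1] + edges[1:])
mask = counts > 0
b2, b1, b0 = np.polyfit(mids[mask], np.log(counts[mask]), 2)
sigma0 = np.sqrt(-1.0 / (2.0 * b2))   # curvature gives the null sd
mu0 = -b1 / (2.0 * b2)                # vertex gives the null mean
print(round(mu0, 2), round(sigma0, 2))
```

The fitted null spread comes out near 1.5 rather than the theoretical 1.0, which is exactly the situation where plugging N(0, 1) into a simultaneous-testing procedure would overstate significance.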

This paper discusses the problem of identifying differentially expressed groups of genes from a microarray experiment. The groups of genes are externally defined, for example, sets of gene pathways derived from biological databases. Our starting point is the interesting Gene Set Enrichment Analysis (GSEA) procedure of Subramanian et al. (2005). We study the…
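Scoring an externally defined gene set can be sketched as averaging per-gene scores over the set and comparing against random sets of the same size. This is a generic permutation illustration, not the GSEA statistic or the paper's own proposal; the scores, set size, and shift are invented:

```python
import numpy as np

rng = np.random.default_rng(4)
gene_scores = rng.standard_normal(2000)   # e.g. per-gene test statistics
gene_scores[:20] += 1.5                   # one genuinely enriched pathway
pathway = np.arange(20)                   # indices of the externally defined set

# Observed set score: mean score over the pathway's genes
observed = gene_scores[pathway].mean()

# Null: mean scores of random gene sets of the same size
null = np.array([gene_scores[rng.choice(2000, 20, replace=False)].mean()
                 for _ in range(5000)])
p_value = (1 + np.sum(null >= observed)) / (1 + len(null))
print(round(p_value, 4))
```

Randomizing over genes (rather than over samples) answers a different question than a sample-label permutation test, a distinction the gene-set literature treats carefully.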

- Bradley Efron, Robert Tibshirani
- 2007


- Bradley Efron, Robert Tibshirani
- Genetic epidemiology
- 2002

In a classic two-sample problem, one might use Wilcoxon's statistic to test for a difference between treatment and control subjects. The analogous microarray experiment yields thousands of Wilcoxon statistics, one for each gene on the array, and confronts the statistician with a difficult simultaneous inference situation. We will discuss two inferential…
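Computing one Wilcoxon rank-sum statistic per gene is a one-pass loop over the expression matrix. The array shapes, group sizes, and effect size below are invented, and ties are ignored for simplicity:

```python
import numpy as np

def wilcoxon_rank_sum(treat, control):
    """Sum of the ranks of the treatment values in the pooled sample
    (no tie correction; fine for continuous expression data)."""
    pooled = np.concatenate([treat, control])
    ranks = pooled.argsort().argsort() + 1   # 1-based ranks
    return ranks[: len(treat)].sum()

rng = np.random.default_rng(1)
n_genes, n_t, n_c = 1000, 8, 8
treat = rng.standard_normal((n_genes, n_t))
control = rng.standard_normal((n_genes, n_c))
treat[:50] += 2.0   # first 50 genes differentially expressed

stats = np.array([wilcoxon_rank_sum(treat[g], control[g])
                  for g in range(n_genes)])
print(stats.shape)   # one statistic per gene
```

The simultaneous-inference difficulty the abstract refers to starts here: with 1,000 such statistics, even a perfect per-gene test yields many extreme values by chance alone.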

This article surveys bootstrap methods for producing good approximate confidence intervals. The goal is to improve by an order of magnitude upon the accuracy of the standard intervals θ̂ ± zα σ̂, in a way that allows routine application even to very complicated problems. Both theory and examples are used to show how this is done. The first seven sections…
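The baseline being improved upon can be shown next to the simplest bootstrap alternative: the standard interval θ̂ ± zα σ̂ versus a percentile bootstrap interval. This sketch shows only the crude percentile method, not the article's more accurate constructions; the data and sample size are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=1.0, size=50)   # skewed data, where the
theta_hat = x.mean()                      # symmetric standard interval suffers
se = x.std(ddof=1) / np.sqrt(len(x))

# Standard 95% interval: theta_hat +/- 1.96 * se
standard = (theta_hat - 1.96 * se, theta_hat + 1.96 * se)

# Percentile bootstrap: resample, recompute, take empirical quantiles
boot = np.array([rng.choice(x, size=len(x), replace=True).mean()
                 for _ in range(2000)])
percentile = (np.quantile(boot, 0.025), np.quantile(boot, 0.975))

print(standard, percentile)
```

For skewed data the percentile interval is asymmetric around θ̂, a first step toward the second-order accuracy the article pursues.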

- Bradley Efron
- 2006

Large-scale hypothesis testing problems, with hundreds or thousands of test statistics “zi” to consider at once, have become familiar in current practice. Applications of popular analysis methods such as false discovery rate techniques do not require independence of the zi’s, but their accuracy can be compromised in high-correlation situations. This paper…
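The destabilizing effect of correlation is easy to simulate: under a common correlation ρ, the count of null z-values exceeding a threshold varies far more from experiment to experiment than under independence, even though its expectation is the same. The one-factor correlation model and all parameters below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)
m, reps = 5000, 200   # m test statistics per experiment, reps experiments

def exceed_count_sd(rho):
    """Std dev, across experiments, of the number of |z_i| > 2 exceedances
    when every pair of null z's has correlation rho (one-factor model)."""
    g = rng.standard_normal((reps, 1))            # shared factor
    e = rng.standard_normal((reps, m))            # idiosyncratic noise
    z = np.sqrt(rho) * g + np.sqrt(1 - rho) * e   # corr(z_i, z_j) = rho
    return np.sum(np.abs(z) > 2, axis=1).std()

sd_indep = exceed_count_sd(0.0)
sd_corr = exceed_count_sd(0.5)
print(sd_indep, sd_corr)
```

An FDR estimate built from the exceedance count inherits this extra variability, which is the accuracy problem the abstract flags.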

- Gail D Gong, Bradley Efron
- 2010

- Bradley Efron
- 2006

Modern scientific technology has provided a new class of large-scale simultaneous inference problems, with thousands of hypothesis tests to consider at the same time. Microarrays epitomize this type of technology, but similar situations arise in proteomics, spectroscopy, imaging, and social science surveys. This paper uses false discovery rate methods to…
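False discovery rate control is most often implemented via the Benjamini–Hochberg step-up rule, which this abstract builds on. A minimal sketch, with invented p-values:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.1):
    """Boolean mask of rejected hypotheses at FDR level q (BH step-up):
    reject the k hypotheses with smallest p-values, where k is the largest
    index such that p_(k) <= q * k / m."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m
    below = p[order] <= thresh
    reject = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True
    return reject

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205, 0.212, 0.36]
print(benjamini_hochberg(pvals, q=0.1))
```

Note that 0.039 and 0.041 fall above their own step-up thresholds yet are still rejected, because a later p-value (0.06 ≤ 0.1 · 6/10) clears its threshold; the rule rejects everything up to the largest passing index.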