# Weijie Su

• NIPS
• 2014
We derive a second-order ordinary differential equation (ODE), which is the limit of Nesterov's accelerated gradient method. This ODE exhibits approximate equivalence to Nesterov's scheme and thus can serve as a tool for analysis. We show that the continuous-time ODE allows for a better understanding of Nesterov's scheme. As a byproduct, we …
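The limiting ODE in this line of work is Ẍ(t) + (3/t)Ẋ(t) + ∇f(X(t)) = 0, with the correspondence X(k√s) ≈ x_k for step size s. As a minimal sketch (not the paper's code), the snippet below runs Nesterov's scheme on a toy quadratic and integrates the ODE with forward Euler; the quadratic, step size, and time horizon are illustrative choices.

```python
import numpy as np

A = np.diag([1.0, 10.0])   # toy strongly convex quadratic f(x) = 0.5 * x^T A x

def grad(x):
    return A @ x

s = 0.05                   # step size s <= 1/L, with L = 10 here
x_prev = np.array([1.0, 1.0])
y = x_prev.copy()

# Nesterov's accelerated gradient method:
#   x_k = y_{k-1} - s * grad(y_{k-1})
#   y_k = x_k + (k - 1) / (k + 2) * (x_k - x_{k-1})
for k in range(1, 2001):
    x_new = y - s * grad(y)
    y = x_new + (k - 1) / (k + 2) * (x_new - x_prev)
    x_prev = x_new

# Forward-Euler integration of the limiting ODE
#   X'' + (3/t) X' + grad(f)(X) = 0,
# started slightly after t = 0 to avoid the singular damping term.
dt, t = 1e-3, 0.1
X = np.array([1.0, 1.0])
V = np.zeros(2)
while t < 20.0:
    X, V = X + dt * V, V + dt * (-(3.0 / t) * V - grad(X))
    t += dt
```

Both trajectories settle near the minimizer at the origin, which is the sense in which the ODE tracks the discrete scheme.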
• The Annals of Applied Statistics
• 2015
We introduce a new estimator for the vector of coefficients β in the linear model y = Xβ + z, where X has dimensions n × p with p possibly larger than n. SLOPE, short for Sorted L-One Penalized Estimation, is the solution to minimize_b ½‖y − Xb‖₂² + λ₁|b|₍₁₎ + λ₂|b|₍₂₎ + ⋯ + λ_p|b|₍ₚ₎, where λ₁ ≥ λ₂ ≥ ⋯ ≥ λ_p ≥ 0 and |b|₍₁₎ ≥ |b|₍₂₎ ≥ ⋯ ≥ |b|₍ₚ₎ are the decreasing absolute values of the entries of b. This is …
We introduce a novel method for sparse regression and variable selection, which is inspired by modern ideas in multiple testing. Imagine we have observations from the linear model y = Xβ + z. We then suggest estimating the regression coefficients by means of a new estimator called SLOPE, which is the solution to minimize_b ½‖y − Xb‖₂² + λ₁|b|₍₁₎ + ⋯
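SLOPE is typically computed by proximal gradient methods, whose key step is the prox of the sorted-ℓ1 penalty. A hedged sketch, assuming the standard reduction to isotonic regression via a pool-adjacent-violators pass (in the spirit of the FastProxSL1 algorithm from the SLOPE literature); the function name is illustrative.

```python
import numpy as np

def prox_sorted_l1(v, lam):
    """Prox of b -> sum_i lam_i * |b|_(i), for lam sorted nonincreasing.

    Steps: sort |v| decreasing, subtract lam, project onto the
    nonincreasing cone (pool adjacent violators), clip at zero,
    then restore the original order and signs.
    """
    sign = np.sign(v)
    u = np.abs(v)
    order = np.argsort(u)[::-1]      # indices sorting |v| in decreasing order
    w = u[order] - lam               # shifted values, to be made nonincreasing
    blocks = []                      # each block: [mean, count]
    for x in w:
        blocks.append([x, 1])
        # merge while the nonincreasing constraint is violated
        while len(blocks) > 1 and blocks[-2][0] <= blocks[-1][0]:
            m2, c2 = blocks.pop()
            m1, c1 = blocks.pop()
            blocks.append([(m1 * c1 + m2 * c2) / (c1 + c2), c1 + c2])
    x_sorted = np.concatenate([np.full(c, max(m, 0.0)) for m, c in blocks])
    out = np.empty_like(x_sorted)
    out[order] = x_sorted
    return sign * out

# when all lam_i are equal, this reduces to ordinary soft-thresholding
b = prox_sorted_l1(np.array([3.0, -1.0, 0.5]), np.array([1.0, 1.0, 1.0]))
```

Plugging this prox into any proximal-gradient loop gives a basic SLOPE solver.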
• ArXiv
• 2015
We consider high-dimensional sparse regression problems in which we observe y = Xβ + z, where X is an n × p design matrix and z is an n-dimensional vector of independent Gaussian errors, each with variance σ². Our focus is on the recently introduced SLOPE estimator [15], which regularizes the least-squares estimates with the rank-dependent …
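The rank-dependent penalty is truncated in the snippet above; in this line of work the weights are often taken as the Benjamini–Hochberg-inspired sequence λ(i) = σ · Φ⁻¹(1 − iq/(2p)). A small sketch assuming that form, using the standard-library normal quantile function:

```python
from statistics import NormalDist
import numpy as np

def bh_lambdas(p, q, sigma=1.0):
    # lambda_BH(i) = sigma * Phi^{-1}(1 - i*q / (2p)),  i = 1, ..., p
    nd = NormalDist()
    return np.array([sigma * nd.inv_cdf(1 - i * q / (2 * p))
                     for i in range(1, p + 1)])

lam = bh_lambdas(p=10, q=0.1)   # a decreasing sequence of penalty weights
```

Larger ranks get smaller penalties, mirroring the step-down thresholds of the BH procedure.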
We introduce a novel method for sparse regression and variable selection, which is inspired by modern ideas in multiple testing. Imagine we have observations from the linear model y = Xβ + z. We then suggest estimating the regression coefficients by means of a new estimator called the ordered lasso, which is the solution to minimize_b ½‖y − Xb‖₂² + ⋯
• 2011 IEEE Consumer Communications and Networking…
• 2011
Because deploying Vehicular Ad Hoc NETworks (VANETs) consumes large amounts of resources, it is crucial that governments and companies make a thorough estimation and comparison of the benefits and the costs. Network connectivity is an important factor to account for, because it can greatly affect the performance of VANETs and, in turn, how much we …
• 2015
We present a novel method for controlling the k-familywise error rate (k-FWER) in the linear regression setting using the knockoffs framework first introduced by Barber and Candès. Our procedure, which we also refer to as knockoffs, can be applied with any design matrix with at least as many observations as variables, and does not require knowing the noise …
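The k-FWER threshold itself is truncated above; as a reference point, here is a sketch of the original Barber–Candès knockoff filter that this procedure builds on (the FDR-controlling baseline, not the k-FWER variant). Variables are selected through signed statistics W_j and a data-dependent threshold; the example statistics are illustrative.

```python
import numpy as np

def knockoff_threshold(W, q, offset=1):
    # Barber-Candes knockoff(+) threshold:
    #   T = min{ t : (offset + #{W_j <= -t}) / max(1, #{W_j >= t}) <= q }
    # offset=1 gives the knockoff+ variant with exact FDR control.
    ts = np.sort(np.abs(W[W != 0]))
    for t in ts:
        fdp_hat = (offset + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if fdp_hat <= q:
            return t
    return np.inf   # no feasible threshold: select nothing

W = np.array([5.0, 4.0, 3.0, 2.5, 2.0, -1.0, 0.5, -0.2])
T = knockoff_threshold(W, q=0.2)
selected = np.where(W >= T)[0]   # indices of selected variables
```

Negative statistics act as an internal estimate of false selections, which is what the numerator counts.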
• ArXiv
• 2015
We provide the first differentially private algorithms for controlling the false discovery rate (FDR) in multiple hypothesis testing, with essentially no loss in power under certain conditions. Our general approach is to adapt a well-known variant of the Benjamini-Hochberg procedure (BHq), making each step differentially private. This destroys the classical …
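For reference, the classical (non-private) BH step-up procedure that the private algorithm adapts can be sketched as follows; the private version is not reproduced here.

```python
import numpy as np

def benjamini_hochberg(pvals, q):
    # BH step-up: reject the k smallest p-values, where
    #   k = max{ i : p_(i) <= q * i / m }
    m = len(pvals)
    order = np.argsort(pvals)
    sorted_p = np.array(pvals)[order]
    below = np.nonzero(sorted_p <= q * np.arange(1, m + 1) / m)[0]
    if below.size == 0:
        return np.array([], dtype=int)
    k = below[-1] + 1
    return np.sort(order[:k])   # indices of rejected hypotheses

rejected = benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.6], q=0.05)
```

Note the step-up character: an intermediate p-value above its threshold can still be rejected if a later one falls below its own.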
• ArXiv
• 2015
In regression settings where explanatory variables have very low correlations and where there are relatively few effects, each of large magnitude, it is commonly believed that the Lasso will be able to find the important variables with few errors, if any. In contrast, this paper shows that this is not the case even when the design variables are …
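Experimenting with this phenomenon requires a Lasso solver; below is a minimal ISTA (proximal gradient) sketch, checked on an orthogonal design where the Lasso has a closed form. It is an illustration, not the paper's methodology.

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    # ISTA: proximal gradient descent on 0.5*||y - Xb||^2 + lam*||b||_1
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        b = soft_threshold(b + X.T @ (y - X @ b) / L, lam / L)
    return b

# sanity check: for X = I the Lasso solution is soft_threshold(y, lam)
X = np.eye(3)
y = np.array([3.0, 0.5, -2.0])
b = lasso_ista(X, y, lam=1.0)
```

Running such a solver on random designs with sparse strong signals is one way to count the Lasso's false selections empirically.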