Publications
Learning with Noisy Labels
TLDR
The problem of binary classification under random classification noise is studied theoretically: the learner sees labels that have each been independently flipped with some small probability, and methods used in practice, such as biased SVM and weighted logistic regression, are shown to be provably noise-tolerant.
PAC Subset Selection in Stochastic Multi-armed Bandits
TLDR
The expected sample complexity bound for LUCB is novel even for single-arm selection, and a lower bound on the worst case sample complexity of PAC algorithms for Explore-m is given.
On Iterative Hard Thresholding Methods for High-dimensional M-Estimation
TLDR
This work provides the first analysis for IHT-style methods in the high dimensional statistical setting with bounds that match known minimax lower bounds and extends the analysis to the problem of low-rank matrix recovery.
Composite Objective Mirror Descent
TLDR
This work presents a new method for regularized convex optimization that unifies previously known first-order algorithms, such as the projected gradient method, mirror descent, and forward-backward splitting, and derives specific instantiations of this method for commonly used regularization functions, such as l1, mixed norm, and trace norm.
On the Complexity of Linear Prediction: Risk Bounds, Margin Bounds, and Regularization
TLDR
This work characterizes the generalization ability of algorithms whose predictions are linear in the input vector. To this end, sharp bounds are provided for the Rademacher and Gaussian complexities of such linear function classes.
REGAL: A Regularization based Algorithm for Reinforcement Learning in Weakly Communicating MDPs
TLDR
An algorithm is provided that achieves the optimal regret rate in an unknown weakly communicating Markov Decision Process (MDP) where, in each episode, it picks a policy using regularization based on the span of the optimal bias vector.
Smoothness, Low Noise and Fast Rates
TLDR
We establish an excess risk bound of O(H·Rn² + √(H·L*)·Rn) for ERM with an H-smooth loss function and a hypothesis class with Rademacher complexity Rn, where L* is the best risk achievable by the hypothesis class.
On the Consistency of Multiclass Classification Methods
TLDR
It turns out that one can lose consistency when generalizing a binary classification method to multiple classes, so a rich family of multiclass methods is studied to provide a necessary and sufficient condition for their consistency.
Just-in-Time Adaptive Interventions (JITAIs) in Mobile Health: Key Components and Design Principles for Ongoing Health Behavior Support
TLDR
It is critical that researchers develop sophisticated and nuanced health behavior theories capable of guiding the construction of JITAIs, and particular attention must be given to better understanding the implications of providing timely and ecologically sound support for intervention adherence and retention.
Stochastic methods for l1 regularized loss minimization
TLDR
The theoretical runtime analysis suggests that the stochastic methods should outperform state-of-the-art deterministic approaches, including their own deterministic counterparts, when the problem size is large.