We present new results for the Frank-Wolfe method (also known as the conditional gradient method). We derive computational guarantees for arbitrary step-size sequences, which are then applied to various step-size rules, including simple averaging and constant step-sizes. We also develop step-size rules and computational guarantees that depend naturally on …
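To make the setting concrete, here is a minimal sketch of the generic Frank-Wolfe loop with a pluggable step-size rule, assuming a least-squares objective over the unit simplex; the problem data, the `step_size` callable, and the two rules shown (the classic 2/(k+2) rule and simple averaging) are illustrative choices, not the paper's exact setup.

```python
import numpy as np

def frank_wolfe(grad, x0, step_size, iters=200):
    """Frank-Wolfe over the unit simplex with an arbitrary step-size rule."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        # Linear minimization oracle over the simplex: the minimizing
        # vertex is the coordinate with the smallest gradient entry.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = step_size(k)                 # arbitrary step-size sequence
        x = (1.0 - gamma) * x + gamma * s    # convex combination stays feasible
    return x

# Hypothetical problem: least squares over the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
grad = lambda x: A.T @ (A @ x - b)
x0 = np.ones(5) / 5

x_fw = frank_wolfe(grad, x0, step_size=lambda k: 2.0 / (k + 2.0))   # classic rule
x_avg = frank_wolfe(grad, x0, step_size=lambda k: 1.0 / (k + 1.0))  # simple averaging
print(x_fw.round(3), x_avg.round(3))
```

Constant step-sizes plug into the same interface, e.g. `step_size=lambda k: 0.1`.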
Motivated principally by the low-rank matrix completion problem, we present an extension of the Frank-Wolfe method that is designed to induce near-optimal solutions on low-dimensional faces of the feasible region. This is accomplished by a new approach to generating "in-face" directions at each iteration, as well as through new choice rules for selecting …
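For context, the sketch below shows the plain Frank-Wolfe iteration for nuclear-norm-constrained matrix completion, the setting this abstract targets; it is the building block only, since the in-face extension replaces the direction selection with steps that remain on a low-dimensional face, which this plain version does not do. The data, the radius `delta`, and the step-size rule are hypothetical.

```python
import numpy as np

def fw_matrix_completion(M_obs, mask, delta, iters=50):
    """Plain Frank-Wolfe for min 0.5*||mask*(X - M_obs)||_F^2
    subject to ||X||_* <= delta (nuclear-norm ball)."""
    X = np.zeros_like(M_obs)
    for k in range(iters):
        G = mask * (X - M_obs)            # gradient (observed entries only)
        # LMO over the nuclear-norm ball: a rank-one atom from the top
        # singular vector pair of -G; this is what keeps iterates low-rank.
        # (A full SVD is used here for brevity; in practice a power method
        # computing only the top pair suffices.)
        u, s, vt = np.linalg.svd(G, full_matrices=False)
        S = -delta * np.outer(u[:, 0], vt[0, :])
        gamma = 2.0 / (k + 2.0)
        X = (1.0 - gamma) * X + gamma * S
    return X

# Hypothetical 30%-observed low-rank matrix.
rng = np.random.default_rng(1)
M = rng.standard_normal((30, 3)) @ rng.standard_normal((3, 20))
mask = (rng.random(M.shape) < 0.3).astype(float)
X_hat = fw_matrix_completion(mask * M, mask, delta=np.linalg.norm(M, 'nuc'))
```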
Boosting methods are highly popular and effective supervised learning methods which combine weak learners into a single accurate model with good statistical performance. In this paper, we analyze two well-known boosting methods, AdaBoost and Incremental Forward Stagewise Regression (FSε), by establishing their precise connections to the Mirror Descent …
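The multiplicative weight update in the sketch below is the step this abstract connects to mirror descent: an exponential reweighting of the sample distribution, i.e. a mirror descent step with the entropy mirror map on the probability simplex. The fixed weak-learner response matrix `H` and the edge-based coefficient formula follow the textbook AdaBoost presentation, not necessarily the paper's exact notation.

```python
import numpy as np

def adaboost(H, y, T=50):
    """AdaBoost over a fixed pool of weak learners.

    H : (n, m) matrix, H[i, j] = prediction of learner j on sample i, in {-1, +1}
    y : (n,) labels in {-1, +1}
    """
    n, m = H.shape
    w = np.ones(n) / n                # distribution over samples
    alpha = np.zeros(m)               # combined-model coefficients
    margins = y[:, None] * H          # y_i * h_j(x_i)
    for _ in range(T):
        edges = margins.T @ w         # weighted edge of each learner
        j = int(np.argmax(np.abs(edges)))
        r = np.clip(edges[j], -0.999, 0.999)      # guard against |r| = 1
        a = 0.5 * np.log((1 + r) / (1 - r))
        alpha[j] += a
        # Entropy mirror descent step: exponential reweighting + renormalize.
        w = w * np.exp(-a * margins[:, j])
        w /= w.sum()
    return alpha

# Hypothetical pool of sign-valued weak learners.
rng = np.random.default_rng(2)
H = np.sign(rng.standard_normal((100, 10)))
y = np.sign(H[:, 0] + 0.5 * rng.standard_normal(100))
coef = adaboost(H, y)
```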
In this paper we analyze boosting algorithms [15, 21, 24] in linear regression from a new perspective: that of modern first-order methods in convex optimization. We show that classic boosting algorithms in linear regression, namely the incremental forward stagewise algorithm (FSε) and least squares boosting (LS-Boost(ε)), can be viewed as subgradient …
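The FSε update itself is simple to state, and the sketch below shows it, assuming a design matrix `X` with standardized columns: at each step the coefficient of the feature most correlated with the current residual moves by a fixed amount ±ε. The data and the specific `eps` and iteration count are illustrative.

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, iters=2000):
    """Incremental forward stagewise regression (FS_eps)."""
    n, p = X.shape
    beta = np.zeros(p)
    r = y.copy()                       # current residual
    for _ in range(iters):
        corr = X.T @ r
        j = int(np.argmax(np.abs(corr)))   # feature most correlated with residual
        delta = eps * np.sign(corr[j])
        beta[j] += delta                   # move that coefficient by +/- eps
        r -= delta * X[:, j]               # update residual incrementally
    return beta

# Hypothetical sparse regression problem.
rng = np.random.default_rng(3)
X = rng.standard_normal((100, 8))
X /= np.linalg.norm(X, axis=0)             # standardize columns
beta_true = np.array([2.0, -1.0, 0, 0, 0, 0, 0, 0])
y = X @ beta_true + 0.1 * rng.standard_normal(100)
beta_hat = forward_stagewise(X, y)
```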
This thesis studies contemporary challenges arising at the market, system, and organization levels in the healthcare industry, and develops novel frameworks that allow us to better understand cost and resource allocation for strategic decision making in healthcare settings. The U.S. healthcare industry is going through a massive transformation process due …