
2019

In this paper, we study accelerated Regularized Newton Methods for minimizing objectives formed as a sum of two functions: one is…

2019

For a symmetric positive semidefinite linear system of equations $\mathcal{Q}\boldsymbol{x} = \boldsymbol{b}$, where…
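For symmetric positive (semi)definite systems like the $\mathcal{Q}\boldsymbol{x} = \boldsymbol{b}$ above, the conjugate gradient method is the standard iterative solver. A minimal sketch (illustrative only; not the method of the cited paper):

```python
import numpy as np

def conjugate_gradient(Q, b, tol=1e-10, max_iter=1000):
    """Solve Q x = b for symmetric positive semidefinite Q (assumes b lies in range(Q))."""
    x = np.zeros_like(b, dtype=float)
    r = b - Q @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Qp = Q @ p
        alpha = rs / (p @ Qp)        # exact line search along p
        x += alpha * p
        r -= alpha * Qp
        rs_new = r @ r
        p = r + (rs_new / rs) * p    # keeps directions Q-conjugate
        rs = rs_new
    return x

Q = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(Q, b)
```

In exact arithmetic CG terminates in at most n steps for an n×n system; in floating point it is run to a residual tolerance, as here.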

Highly Cited

2018

We consider a class of difference-of-convex (DC) optimization problems whose objective is level-bounded and is the sum of a…
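The classical DC algorithm (DCA) for $f = g - h$ with $g, h$ convex linearizes $h$ at the current iterate and minimizes the resulting convex surrogate. A minimal sketch with the illustrative choice $g(x) = x^2$, $h(x) = |x|$ (these functions are an assumption for the example, not from the cited paper):

```python
import numpy as np

def dca(x0, iters=50):
    """DCA for f(x) = g(x) - h(x) with g(x) = x**2 and h(x) = |x|."""
    x = x0
    for _ in range(iters):
        s = np.sign(x) if x != 0 else 1.0  # subgradient of h(x) = |x| at x
        # Convex subproblem: minimize g(x) - s*x; first-order condition 2x - s = 0.
        x = s / 2.0
    return x

x_star = dca(x0=0.3)  # converges to the critical point x = 1/2
```

Here $f(x) = x^2 - |x|$ has minimizers at $\pm 1/2$, and DCA reaches one of them depending on the sign of the starting point.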

2016

In this paper we study distributionally robust constraints on risk measures (such as standard deviation less the mean…

Highly Cited

2015

We study accelerated mirror descent dynamics in continuous and discrete time. Combining the original continuous-time motivation…
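A discrete-time mirror descent step with the negative-entropy mirror map on the probability simplex reduces to the exponentiated-gradient update. A minimal sketch of the plain (non-accelerated) dynamics, as an illustration of the setting rather than the paper's exact scheme:

```python
import numpy as np

def mirror_descent(grad, x0, step=0.1, iters=500):
    """Entropic mirror descent on the simplex (multiplicative-weights update)."""
    x = x0 / x0.sum()
    for _ in range(iters):
        x = x * np.exp(-step * grad(x))  # gradient step in the dual space
        x = x / x.sum()                  # Bregman projection back onto the simplex
    return x

# Minimize a linear objective <c, x> over the simplex; the optimum puts all
# mass on the smallest coordinate of c.
c = np.array([3.0, 1.0, 2.0])
x = mirror_descent(lambda x: c, np.ones(3))
```

The mirror map replaces the Euclidean projection of projected gradient descent with one adapted to the simplex geometry, which is what makes the continuous-time analysis in the snippet natural.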

Review

2009

Computational convex analysis algorithms have been rediscovered several times in the past by researchers from different fields…

Review

2005

Due to their axiomatic foundation and their favorable computational properties, convex risk measures are becoming a powerful tool…

2005

In order to minimize a closed convex function that is approximated by a sequence of better behaved functions, we investigate the…

1996

Let X be a real Hilbert space endowed with inner product ⟨·, ·⟩ and associated norm ‖·‖, and let f be a proper closed convex…
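The standard object in this Hilbert-space setting is the proximal operator $\operatorname{prox}_{tf}(v) = \arg\min_x f(x) + \tfrac{1}{2t}\|x - v\|^2$. A minimal sketch for the illustrative choice $f(x) = \|x\|_1$ (an assumption for the example; the snippet's $f$ is a general proper closed convex function), whose prox is soft-thresholding:

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of f(x) = ||x||_1: componentwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

v = np.array([3.0, -0.5, 1.2])
p = prox_l1(v, t=1.0)  # shrinks each entry toward 0 by t, zeroing small ones
```

The prox is well defined for any proper closed convex f because the strongly convex quadratic term guarantees a unique minimizer, which is the fact the snippet's Hilbert-space setup provides.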

Highly Cited

1974

Kuhn–Tucker condition $(0, 0) \in \partial K(x, y)$ into a more explicit and familiar form. Writing where $-k(y) = f_0(x) + y_1 f_1(x$…