
- Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford
- ArXiv
- 2016

We present an accelerated gradient method for non-convex optimization problems with Lipschitz continuous first and second derivatives. The method requires time O(ε^{-7/4} log(1/ε)) to find an ε-stationary point, meaning a point x such that ‖∇f(x)‖ ≤ ε. The method improves upon the O(ε^{-2}) complexity of gradient descent and provides the additional second-order… (More)
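For context, the O(ε^{-2}) gradient-descent baseline mentioned in the abstract can be sketched as follows. This is a minimal illustration of what "ε-stationary" means, not the paper's accelerated method; the test function, starting point, and smoothness constant below are illustrative assumptions.

```python
import numpy as np

def gradient_descent_to_stationary(grad, x0, lipschitz, eps, max_iters=100_000):
    """Run gradient descent with step size 1/L until ||grad f(x)|| <= eps.

    For an L-smooth objective this takes O(eps^-2) iterations in the
    worst case -- the baseline rate the accelerated method improves on.
    """
    x = np.asarray(x0, dtype=float)
    for t in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) <= eps:
            return x, t  # x is an eps-stationary point
        x = x - g / lipschitz
    return x, max_iters

# Example on a simple non-convex function f(x, y) = x^2 + cos(y),
# whose gradient is (2x, -sin(y)) and whose smoothness constant is 2.
grad_f = lambda x: np.array([2 * x[0], -np.sin(x[1])])
x_star, iters = gradient_descent_to_stationary(grad_f, [1.0, 1.0],
                                               lipschitz=2.0, eps=1e-6)
```

Note that ε-stationarity only certifies a small gradient: the returned point may be a local minimum or a saddle, which is why the paper's second-order guarantees matter.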

- Daniel Soudry, Yair Carmon
- ArXiv
- 2016

We use smoothed analysis techniques to provide guarantees on the training loss of Multilayer Neural Networks (MNNs) at differentiable local minima. Specifically, we examine MNNs with piecewise linear activation functions, quadratic loss and a single output, under mild over-parametrization. We prove that for an MNN with one hidden layer, the training error is… (More)

- Yair Carmon, John C. Duchi, Oliver Hinder, Aaron Sidford
- ICML
- 2017

- Alexander M. Bronstein, Michael M. Bronstein, Yair Carmon, Ron Kimmel
- IPSJ Trans. Computer Vision and Applications
- 2009

Partial matching of geometric structures is important in computer vision, pattern recognition and shape analysis applications. The problem consists of matching similar parts of shapes that may be dissimilar as a whole. Recently, it was proposed to consider partial similarity as a multi-criterion optimization problem trying to simultaneously maximize the… (More)

- Yair Carmon, Shlomo Shamai, Tsachy Weissman
- IEEE Transactions on Information Theory
- 2015

We compare the maximum achievable rates in single-carrier (SC) and orthogonal frequency-division multiplexing (OFDM) modulation schemes, under the practical assumptions of independent identically distributed finite alphabet inputs and linear intersymbol interference with additive Gaussian noise. We show that the Shamai-Laroia approximation serves as a… (More)

- Yair Carmon, Adam Shwartz
- Oper. Res. Lett.
- 2009

We generalize the geometric discount of finite discounted cost Markov Decision Processes to "exponentially representable" discount functions, prove existence of optimal policies which are stationary from some time N onward, and provide an algorithm for their computation. Outside this class, optimal "N-stationary" policies in general do not exist.
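For context, the classical geometric discount that this paper generalizes can be sketched as value iteration on a small discounted-cost MDP, where a stationary optimal policy always exists. The two-state transition matrix, costs, and discount factor below are illustrative assumptions, not from the paper.

```python
import numpy as np

# Classical geometrically-discounted finite MDP: stage-t cost is
# weighted by beta**t. "Exponentially representable" discounts
# generalize this constant-beta setting.
P = np.array([  # P[a, s, s']: transition probabilities under action a
    [[0.9, 0.1], [0.2, 0.8]],   # action 0
    [[0.5, 0.5], [0.6, 0.4]],   # action 1
])
c = np.array([[1.0, 2.0], [1.5, 0.5]])  # c[a, s]: per-stage cost
beta = 0.9                              # geometric discount factor

V = np.zeros(2)
for _ in range(1000):          # value iteration (a beta-contraction)
    Q = c + beta * (P @ V)     # Q[a, s] = c(a, s) + beta * E[V(s')]
    V = Q.min(axis=0)          # minimize expected discounted cost

policy = Q.argmin(axis=0)      # stationary optimal policy
```

With a constant discount the optimal policy is stationary from time 0; the paper's point is that for more general discount functions optimality may only be achievable from some time N onward.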

- Yair Carmon, John C. Duchi
- ArXiv
- 2016

We consider the minimization of non-convex quadratic forms regularized by a cubic term, which exhibit multiple saddle points and poor local minima. Nonetheless, we prove that, under mild assumptions, gradient descent approximates the global minimum to within ε accuracy in O(ε^{-1} log(1/ε)) steps for large ε and O(log(1/ε)) steps for small ε (compared to a… (More)
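The problem class in the abstract, f(x) = ½ xᵀAx + bᵀx + (ρ/3)‖x‖³ with indefinite A, is concrete enough to sketch. Below, gradient descent is run on one such instance; the particular A, b, ρ, step size, and iteration count are illustrative assumptions, not the paper's analysis.

```python
import numpy as np

def cubic_reg_grad(A, b, rho, x):
    """Gradient of f(x) = 0.5 x^T A x + b^T x + (rho/3) ||x||^3."""
    return A @ x + b + rho * np.linalg.norm(x) * x

# An indefinite A (one negative eigenvalue) makes the quadratic part
# non-convex; the cubic term keeps f bounded below.
A = np.diag([1.0, -0.5])
b = np.array([0.5, 0.5])
rho = 1.0

x = np.zeros(2)   # note: x = 0 is a saddle when b = 0; here b != 0
eta = 0.1         # fixed step size
for _ in range(2000):
    x = x - eta * cubic_reg_grad(A, b, rho, x)

grad_norm = np.linalg.norm(cubic_reg_grad(A, b, rho, x))
```

At a global minimizer the first-order condition (A + ρ‖x‖I)x = −b holds with ρ‖x‖ at least as large as the most negative eigenvalue of A, which is what the iterate above converges to in this instance.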

- Yair Carmon, Adam Shwartz
- VALUETOOLS
- 2008

We investigate the existence of simple policies in finite discounted cost Markov Decision Processes, when the discount factor is not constant. We introduce a class called "exponentially representable" discount functions. Within this class we prove existence of optimal policies which are eventually stationary, i.e., stationary from some time N onward, and provide an… (More)

- Kartik Venkat, Tsachy Weissman, Yair Carmon, Shlomo Shamai
- IEEE Transactions on Signal Processing
- 2016

We consider mean squared estimation with lookahead of a continuous-time signal corrupted by additive white Gaussian noise. We show that the mutual information rate function, i.e., the mutual information rate as a function of the signal-to-noise ratio (SNR), does not, in general, determine the minimum mean squared error (MMSE) with fixed finite lookahead, in… (More)

- Yair Carmon, Shlomo Shamai
- IEEE Transactions on Information Theory
- 2015

We consider the discrete-time intersymbol interference (ISI) channel model, with additive Gaussian noise and fixed independent identically distributed inputs. In this setting, we investigate the expression put forth by Shamai and Laroia as a conjectured lower bound for the input-output mutual information after application of a minimum mean-square error… (More)