Semantic Scholar uses AI to extract papers important to this topic.

Highly Cited · 2019
Gradient descent finds a global minimum in training deep neural networks despite the objective function being non-convex. The…
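The phenomenon the snippet describes can be observed directly on a small over-parameterized network. The sketch below is illustrative only (not the paper's proof setting): the data, hidden width, and learning rate are arbitrary choices, and plain gradient descent drives the non-convex training loss toward zero.

```python
import numpy as np

# Illustrative sketch: gradient descent on a small over-parameterized
# two-layer ReLU network. All sizes and rates here are made-up choices.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))            # 20 samples, 5 features
y = rng.normal(size=(20, 1))            # random regression targets

m = 200                                 # wide hidden layer (m >> n)
W = rng.normal(size=(5, m)) / np.sqrt(5)
a = rng.normal(size=(m, 1)) / np.sqrt(m)

def loss(W, a):
    H = np.maximum(X @ W, 0.0)          # ReLU features
    return 0.5 * np.mean((H @ a - y) ** 2)

start = loss(W, a)
lr = 0.02
for _ in range(2000):
    H = np.maximum(X @ W, 0.0)
    r = (H @ a - y) / len(X)            # scaled residual
    gW = X.T @ ((r @ a.T) * (H > 0))    # backprop through the ReLU mask
    ga = H.T @ r                        # gradient w.r.t. output weights
    W -= lr * gW
    a -= lr * ga

print(start, loss(W, a))                # training loss drops sharply
```

Because the hidden layer is much wider than the number of samples, the feature matrix has full row rank and the loss can be driven essentially to zero, matching the flavor of the claim.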

Highly Cited · 2014
Modeling the target appearance is critical in many modern visual tracking algorithms. Many tracking-by-detection algorithms…

Highly Cited · 2007
This paper presents a semi-supervised graph-based method for the classification of hyperspectral images. The method is designed…
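As a rough illustration of graph-based semi-supervised classification in general (generic label propagation, not necessarily this paper's method), the sketch below spreads two seed labels over a Gaussian affinity graph; the data and kernel width are made-up.

```python
import numpy as np

# Hedged sketch of graph-based semi-supervised classification via label
# propagation: two blobs, one labeled seed each; labels diffuse along edges.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (15, 2)), rng.normal(3, 0.3, (15, 2))])
labels = -np.ones(30, dtype=int)        # -1 marks unlabeled points
labels[0], labels[15] = 0, 1            # one seed per class

# Gaussian affinity graph, row-normalized into a transition matrix
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
A = np.exp(-d2 / 0.5)
np.fill_diagonal(A, 0.0)
P = A / A.sum(1, keepdims=True)

F = np.zeros((30, 2))
F[labels >= 0, labels[labels >= 0]] = 1.0
for _ in range(100):
    F = P @ F                           # propagate label mass
    F[labels >= 0] = 0.0                # re-clamp the labeled seeds
    F[labels >= 0, labels[labels >= 0]] = 1.0

pred = F.argmax(1)                      # each blob inherits its seed's label
```

Cross-blob affinities are essentially zero here, so each blob converges to its own seed's class.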

Highly Cited · 2005
A problem for many kernel-based methods is that the amount of computation required to find the solution scales as O(n³), where…
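The O(n³) cost the snippet refers to shows up concretely when a kernel method solves an n × n linear system. A minimal sketch, using kernel ridge regression as the example (the kernel, data, and regularizer below are arbitrary choices):

```python
import numpy as np

# Sketch of why exact kernel methods scale cubically: kernel ridge
# regression solves an n x n system over the full Gram matrix.
rng = np.random.default_rng(0)
n = 300
X = rng.normal(size=(n, 3))
y = np.sin(X[:, 0])

def rbf(A, B, gamma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

K = rbf(X, X)                                       # n x n Gram matrix: O(n^2) memory
alpha = np.linalg.solve(K + 1e-3 * np.eye(n), y)    # direct solve: O(n^3) time
pred = K @ alpha                                    # in-sample predictions
```

Low-rank approximations (e.g. sampling a subset of columns) are the usual way such papers reduce this cost, trading exactness for an O(nm²) solve with m ≪ n.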

Highly Cited · 2005
Many of the tools of dynamical systems and control theory have gone largely unused for fluids, because the governing equations…

Highly Cited · 2004
Kernel-based learning algorithms work by embedding the data into a Euclidean space, and then searching for linear relations among…
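The embedding idea can be made concrete: the algorithm only ever needs inner products between embedded points, collected in the Gram matrix K[i, j] = ⟨φ(xᵢ), φ(xⱼ)⟩. For a degree-2 polynomial kernel the implicit and explicit computations agree exactly (a standard identity, shown here as a sketch):

```python
import numpy as np

# The kernel trick in miniature: the implicitly computed Gram matrix for
# the degree-2 polynomial kernel equals the inner products of an explicit
# feature map phi(x) = (x1^2, x2^2, sqrt(2)*x1*x2).
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))

K = (X @ X.T) ** 2                      # implicit: never forms phi(x)

Phi = np.column_stack([X[:, 0] ** 2,
                       X[:, 1] ** 2,
                       np.sqrt(2) * X[:, 0] * X[:, 1]])
print(np.allclose(K, Phi @ Phi.T))      # True
```

Linear relations found in the embedded space (via K alone) therefore correspond to nonlinear relations in the original space.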

Highly Cited · 2002
In this paper, we introduce a new method of model reduction for nonlinear control systems. Our approach is to construct an…

Highly Cited · 2001
The popular K-means clustering partitions a data set by minimizing a sum-of-squares cost function. A coordinate descent method is…
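The coordinate-descent view of K-means is easy to exhibit: Lloyd's algorithm alternates two steps, each of which can only lower the sum-of-squares cost. A minimal sketch (the data and the deterministic one-seed-per-blob initialization are made-up choices):

```python
import numpy as np

# K-means as coordinate descent on the sum-of-squares cost: alternately
# reassign points to the nearest center, then move each center to the
# mean of its points. Each step is non-increasing in the cost.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(5, 0.5, (20, 2))])
C = X[[0, 20]].copy()                   # one initial center in each blob

for _ in range(10):
    d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    z = d.argmin(1)                     # assignment step
    for k in range(2):
        if (z == k).any():
            C[k] = X[z == k].mean(0)    # center-update step

cost = ((X - C[z]) ** 2).sum()          # final sum-of-squares cost
```

Because each half-step minimizes the cost over one block of variables with the other fixed, the cost is monotone non-increasing and the iteration terminates at a local minimum.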

Highly Cited · 1984
An error bound for reduced order models obtained from internally balanced realizations is derived. The bound is that the infinity…
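If this is the standard balanced-truncation result, the bound in question is usually stated as twice the tail sum of the neglected Hankel singular values (a sketch of the standard statement, not a quotation from the paper):

```latex
\left\lVert G - G_r \right\rVert_{\infty} \;\le\; 2 \sum_{i=r+1}^{n} \sigma_i
```

Here $G$ is the full order-$n$ system, $G_r$ the order-$r$ truncation of its internally balanced realization, and $\sigma_{r+1}, \dots, \sigma_n$ the discarded Hankel singular values.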

Highly Cited · 1960
This is one of the two ground-breaking papers by Kalman that appeared in 1960, with the other one (discussed next) being the…
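The filter introduced in that line of work reduces, in the scalar case, to a two-line predict/update recursion. A minimal sketch (estimating a constant from noisy measurements; the noise levels and priors are made-up):

```python
import numpy as np

# One-dimensional Kalman filter sketch: estimate a constant state from
# noisy measurements. With a static state, there is no prediction step;
# each update blends the estimate with the new measurement via the gain.
rng = np.random.default_rng(0)
true_x = 1.0
zs = true_x + rng.normal(0, 0.5, 100)   # measurements with variance R

x, P = 0.0, 1.0                         # prior estimate and its variance
R = 0.25                                # measurement noise variance
for z in zs:
    K = P / (P + R)                     # Kalman gain
    x = x + K * (z - x)                 # correct with the innovation
    P = (1 - K) * P                     # posterior variance shrinks

print(x, P)                             # estimate near 1.0, small variance
```

The gain automatically weights each measurement by its informativeness relative to the current uncertainty, which is the core idea the paper formalized for general linear-Gaussian systems.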