
Hand amputees would benefit greatly from a robotic prosthesis that allows the movement of several fingers. In this paper we propose operating such a prosthesis with the electromyographic signals recorded by two pairs of electrodes placed over the arm. Multiple features are extracted from these signals, from which the most relevant features are…
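The abstract mentions extracting multiple features from the recorded EMG signals but does not list them; a minimal sketch using three time-domain features that are standard in the EMG pattern-recognition literature (mean absolute value, zero crossings, waveform length — assumed here, not taken from the paper itself) might look like:

```python
import numpy as np

def emg_features(signal, threshold=0.01):
    """Extract three standard time-domain EMG features.
    These feature choices are illustrative, not the paper's exact set."""
    x = np.asarray(signal, dtype=float)
    mav = np.mean(np.abs(x))                      # mean absolute value
    # zero crossings, counted only when the jump exceeds a noise threshold
    zc = int(np.sum((x[:-1] * x[1:] < 0) &
                    (np.abs(x[:-1] - x[1:]) > threshold)))
    wl = np.sum(np.abs(np.diff(x)))               # waveform length
    return mav, zc, wl

# toy signal: one period of a sine wave
t = np.linspace(0, 1, 200, endpoint=False)
feats = emg_features(np.sin(2 * np.pi * t))
```

In practice each electrode pair would yield such a feature vector per analysis window, and the concatenated vectors would feed the subsequent feature-selection stage.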

We consider support vector machines for binary classification. As opposed to most approaches we use the number of support vectors (the "*L*₀ norm") as a regularizing term instead of the *L*₁ or *L*₂ norms. In order to solve the optimization problem we use the cross entropy method to search over the possible…
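The cross-entropy method mentioned in the abstract searches over discrete structures by sampling from a parametric distribution and refitting it to the best samples. A generic sketch for minimizing an objective over binary inclusion masks (the population size, elite fraction, and smoothing factor are assumptions, not the paper's settings) is:

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_entropy_search(objective, n_bits, pop=200, elite=20, iters=30):
    """Minimize `objective` over binary vectors with the cross-entropy
    method: sample masks from independent Bernoulli(p), then refit p
    to the elite (lowest-scoring) samples.  A generic sketch, not the
    paper's exact procedure."""
    p = np.full(n_bits, 0.5)                 # inclusion probabilities
    for _ in range(iters):
        masks = rng.random((pop, n_bits)) < p
        scores = np.array([objective(m) for m in masks])
        elite_masks = masks[np.argsort(scores)[:elite]]
        p = 0.7 * elite_masks.mean(axis=0) + 0.3 * p  # smoothed update
    return (p > 0.5).astype(int)

# toy objective: count mismatches against a hidden target mask
target = np.array([1, 0, 1, 1, 0, 0, 1, 0])
best = cross_entropy_search(lambda m: np.sum(m != target), len(target))
```

For the *L*₀-regularized SVM the objective would instead score a candidate set of support vectors by training error plus the size of the set.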

Sparsity plays an important role in many fields of engineering. The cardinality penalty function, often used as a measure of sparsity, is neither continuous nor differentiable, so smooth optimization algorithms cannot be applied directly. In this paper we present a continuous yet non-differentiable sparsity function which constitutes a tight lower…
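The abstract's distinction can be made concrete with a simple surrogate. The capped-*L*₁ function min(|x|/ε, 1), summed over coordinates, is continuous, non-differentiable (at 0 and at |x| = ε), and lower-bounds the cardinality, being exact whenever |xᵢ| ≥ ε. This particular surrogate is an illustration of the idea, not necessarily the function derived in the paper:

```python
import numpy as np

def cardinality(x):
    """The (discontinuous) count of nonzero entries."""
    return np.count_nonzero(x)

def capped_l1(x, eps=0.1):
    """Continuous, non-differentiable lower bound on cardinality:
    each coordinate contributes min(|x_i| / eps, 1).  Illustrative
    surrogate only."""
    return np.sum(np.minimum(np.abs(x) / eps, 1.0))

x = np.array([0.0, 0.5, -0.02, 2.0])
# cardinality(x) counts 3 nonzeros; capped_l1 lower-bounds it,
# matching exactly on the entries with |x_i| >= eps
```

Because the surrogate is continuous, it can be attacked with continuous (if non-smooth) optimization machinery that the raw cardinality penalty does not admit.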

A new sparsity-driven kernel classifier is presented, based on the minimization of a recently derived data-dependent generalization error bound. The objective function consists of the usual hinge loss function penalizing training errors and a concave penalty function of the expansion coefficients. The problem of minimizing the non-convex bound is addressed…
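The objective described above (hinge loss plus a concave penalty on the kernel expansion coefficients) can be evaluated directly. The log-type penalty below is a common concave sparsity-promoting choice, used here as a stand-in since the paper's exact penalty is not shown in this excerpt:

```python
import numpy as np

def objective(alpha, K, y, C=1.0, eps=0.1):
    """Hinge loss on training points plus a concave penalty on the
    kernel expansion coefficients.  The penalty sum(log(1 + |a|/eps))
    is an assumed stand-in for the paper's concave penalty."""
    margins = y * (K @ alpha)                     # y_i * f(x_i)
    hinge = np.maximum(0.0, 1.0 - margins).sum()  # training-error term
    penalty = np.log1p(np.abs(alpha) / eps).sum() # concave sparsity term
    return C * hinge + penalty
```

The concave penalty grows ever more slowly for large coefficients, so, unlike the *L*₁ norm, it does not over-shrink the coefficients it keeps, but it makes the overall objective non-convex, hence the need for the specialized minimization procedure the abstract alludes to.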

- Dori Peleg, Ron Meir
- NIPS
- 2004

A novel linear feature selection algorithm is presented, based on the global minimization of a data-dependent generalization error bound. Feature selection and scaling algorithms often lead to non-convex optimization problems, which in many previous approaches were addressed through gradient descent procedures that can only guarantee convergence to a local…
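The local-versus-global distinction the abstract draws can be seen on a one-dimensional toy objective with two basins (the function below is purely illustrative, not the paper's bound): gradient descent converges to whichever minimum its starting point falls toward, while even a coarse global search over a grid finds the better one.

```python
import numpy as np

def f(w):
    """A simple non-convex objective with two local minima."""
    return (w**2 - 1)**2 + 0.3 * w

def grad_descent(w, lr=0.01, steps=2000):
    """Plain gradient descent on f, using its analytic gradient."""
    for _ in range(steps):
        g = 4 * w * (w**2 - 1) + 0.3
        w -= lr * g
    return w

w_bad = grad_descent(1.0)    # settles in the basin near w ≈ +1
w_good = grad_descent(-1.0)  # settles in the deeper basin near w ≈ -1
grid = np.linspace(-2, 2, 4001)
w_star = grid[np.argmin(f(grid))]   # coarse global search
```

Here `f(w_star) < f(w_bad)`: the descent started at +1 is trapped in the shallower basin. Global minimization of the (non-convex) error bound avoids exactly this failure mode.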
