Fabio Aiolli

The development of neural network (NN) models able to encode structured input, and the more recent definition of kernels for structures, make it possible to apply machine learning approaches directly to generic structured data. However, the effectiveness of a kernel can depend on its sparsity with respect to a specific data set. In fact, the accuracy of a …
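As a minimal illustration of the sparsity this abstract refers to, the sketch below computes the fraction of (near-)zero off-diagonal entries in a Gram matrix. The kernel choice, data, and threshold `eps` are hypothetical placeholders, not the paper's construction.

```python
import numpy as np

def gram_sparsity(K, eps=1e-12):
    """Fraction of (near-)zero off-diagonal entries in a Gram matrix K."""
    off_diag = K[~np.eye(K.shape[0], dtype=bool)]
    return np.mean(np.abs(off_diag) < eps)

# Hypothetical example: a polynomial kernel on random binary patterns.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(100, 50)).astype(float)
K = (X @ X.T) ** 3          # degree-3 polynomial kernel, no bias term
print(f"sparsity: {gram_sparsity(K):.2f}")
```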
Winner-take-all multiclass classifiers are built on top of a set of prototypes, each representing one of the available classes. A pattern is then classified with the label associated with the most 'similar' prototype. Recent proposals of SVM extensions to the multiclass setting can be considered instances of the same strategy with one prototype per class. The …
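The winner-take-all rule described here is easy to state in code: keep one prototype per class and predict the class of the most similar one. The sketch below uses a dot-product similarity; the prototypes and data are hypothetical, and the similarity choice is an assumption, not something prescribed by the abstract.

```python
import numpy as np

def wta_predict(x, prototypes, labels):
    """Winner-take-all: return the label of the most similar prototype.

    prototypes: (m, d) array, one row per prototype
    labels:     length-m array of class labels, one per prototype
    """
    scores = prototypes @ x          # dot-product similarity (an assumption)
    return labels[np.argmax(scores)]

# Hypothetical 3-class example with one prototype per class.
prototypes = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
labels = np.array([0, 1, 2])
print(wta_predict(np.array([0.9, 0.2]), prototypes, labels))  # -> 0
```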
Recent results in theoretical machine learning suggest that nice properties of the margin distribution over a training set translate into good classifier performance. The same principle has already been used in SVMs and other kernel-based methods, since the associated optimization problems try to maximize the minimum of these margins. In this paper, …
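To make the notion concrete: for a binary classifier f, the margin of a labeled example (x_i, y_i) is y_i f(x_i), and SVM training maximizes the smallest such value, while the margin distribution view considers all of them. A minimal sketch for a linear classifier, with an illustrative weight vector and data:

```python
import numpy as np

def margins(w, b, X, y):
    """Functional margins y_i * (w . x_i + b) of a linear classifier.

    SVM-style training maximizes min(margins); the margin-distribution
    view looks at the whole set of these values, not just the minimum.
    """
    return y * (X @ w + b)

# Hypothetical separable data and a candidate hyperplane.
X = np.array([[2.0, 1.0], [1.0, 2.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
m = margins(np.array([1.0, 1.0]), 0.0, X, y)
print("min margin:", m.min(), " mean margin:", m.mean())
```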
A crucial issue in machine learning is how to learn appropriate representations for data. Recently, much work has been devoted to kernel learning, that is, the problem of finding a good kernel matrix for a given task. This can be done in a semi-supervised learning setting by using a large set of unlabeled data and a (typically small) set of i.i.d. labeled …
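One common way to score a candidate kernel matrix against a small labeled set, and a plausible reading of "finding a good kernel matrix", is kernel-target alignment: the normalized Frobenius inner product between the kernel restricted to the labeled points and the ideal label kernel yy^T. This criterion is chosen here only for illustration; the abstract does not commit to it.

```python
import numpy as np

def alignment(K, y):
    """Kernel-target alignment <K, yy^T>_F / (||K||_F * ||yy^T||_F)."""
    Y = np.outer(y, y)                      # ideal kernel on the labeled set
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

# Hypothetical semi-supervised setup: a kernel over all points, labels for a few.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 5))
K = X @ X.T                                 # linear kernel on all (mostly unlabeled) data
labeled = np.arange(10)                     # indices of the labeled subset
y = np.sign(X[labeled, 0])                  # toy labels in {-1, +1}
print(f"alignment on labeled subset: {alignment(K[np.ix_(labeled, labeled)], y):.3f}")
```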
We extend multiclass SVM to multiple prototypes per class. For this framework, we give a compact constrained quadratic problem and we suggest an efficient algorithm for its optimization that guarantees a local minimum of the objective function. An annealing process is also proposed that helps to escape from local minima. Finally, we report experiments where …
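The decision rule with multiple prototypes per class generalizes the winner-take-all scheme above: each class is scored by its best prototype, and the top-scoring class wins. A minimal sketch with hypothetical prototypes and dot-product scores; the optimization itself (the constrained quadratic problem and the annealing process) is not reproduced here.

```python
import numpy as np

def multi_prototype_predict(x, prototypes, proto_class):
    """Predict the class whose best prototype scores highest.

    prototypes:  (m, d) array, several rows may share a class
    proto_class: length-m array mapping each prototype to its class
    """
    scores = prototypes @ x
    return proto_class[np.argmax(scores)]   # WTA over all prototypes

# Hypothetical setup: class 0 has two prototypes, class 1 has one.
prototypes = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
proto_class = np.array([0, 0, 1])
print(multi_prototype_predict(np.array([0.1, 0.8]), prototypes, proto_class))  # -> 0
```

Taking the argmax over all prototypes at once is equivalent to first taking each class's best prototype and then comparing classes, which is why a single winner-take-all pass suffices.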