
- Fabio Aiolli
- RecSys
- 2013

We present a simple and scalable algorithm for top-N recommendation that can deal with very large datasets and (binary-rated) implicit feedback. We focus on memory-based collaborative filtering algorithms similar to the well-known neighbor-based technique for explicit feedback. The major difference, which makes the algorithm particularly scalable, is that it…
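The abstract does not spell out the algorithm, but the general idea of memory-based, item-based top-N recommendation over binary implicit feedback can be sketched as follows. This is a generic illustration (cosine item similarity, summed-similarity scoring), not the paper's specific method; the toy matrix `R` is invented for the example.

```python
import numpy as np

# Toy binary user-item interaction matrix: rows are users, columns are items.
R = np.array([
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 1, 1],
], dtype=float)

def cosine_item_similarity(R):
    """Cosine similarity between the item columns of a binary matrix."""
    norms = np.linalg.norm(R, axis=0)
    norms[norms == 0] = 1.0          # guard against empty items
    Rn = R / norms
    return Rn.T @ Rn

def top_n(R, user, n=2):
    """Score each item by its summed similarity to the user's items."""
    S = cosine_item_similarity(R)
    scores = S @ R[user]
    scores[R[user] > 0] = -np.inf    # never re-recommend seen items
    return np.argsort(scores)[::-1][:n]

print(top_n(R, user=0, n=2))  # → [2 3]
```

Memory-based methods like this scale well because scoring is a single sparse matrix product over precomputed similarities, with no iterative model fitting.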

- Fabio Aiolli
- RecSys
- 2014

In this paper, an effective collaborative filtering algorithm for top-N item recommendation with implicit feedback is proposed. The task of top-N item recommendation is to predict a ranking of items (movies, books, songs, or products in general) that can be of interest for a user based on earlier preferences of the user. We focus on implicit feedback where…

- Fabio Aiolli, Giovanni Da San Martino, Markus Hagenbuchner, Alessandro Sperduti
- IEEE Transactions on Neural Networks
- 2009

The development of neural network (NN) models able to encode structured input, and the more recent definition of kernels for structures, makes it possible to directly apply machine learning approaches to generic structured data. However, the effectiveness of a kernel can depend on its sparsity with respect to a specific data set. In fact, the accuracy of a…

- Fabio Aiolli
- IIR
- 2013

In this paper, the preliminary study we are conducting on the Million Songs Dataset (MSD) challenge is described. The task of the competition is to suggest a set of songs to a user given half of the user's listening history and the complete listening histories of one million other users. We focus on memory-based collaborative filtering approaches since they are able to…

- Fabio Aiolli, Alessandro Sperduti
- Journal of Machine Learning Research
- 2005

Winner-take-all multiclass classifiers are built on top of a set of prototypes, each representing one of the available classes. A pattern is then classified with the label associated with the most 'similar' prototype. Recent proposals of SVM extensions to the multiclass setting can be considered instances of the same strategy with one prototype per class. The…
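The winner-take-all rule described in the abstract can be sketched in a few lines. This is a minimal illustration using Euclidean distance as the (inverse) similarity; the prototype vectors are made up for the example, and the paper's actual similarity measure and prototype-learning procedure are not shown.

```python
import numpy as np

# Hypothetical prototypes, one per class (row i represents class i).
prototypes = np.array([
    [0.0, 0.0],   # class 0
    [1.0, 1.0],   # class 1
    [2.0, 0.0],   # class 2
])

def winner_take_all(x, prototypes):
    """Assign x the label of the most similar (here: closest) prototype."""
    dists = np.linalg.norm(prototypes - x, axis=1)
    return int(np.argmin(dists))

print(winner_take_all(np.array([0.9, 0.8]), prototypes))  # → 1
```

With one prototype per class this reduces to a nearest-class-center classifier; the paper's multi-prototype extension would keep several rows per class and take the argmax over all of them.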

- Fabio Aiolli, Michele Donini
- Neurocomputing
- 2015

The goal of Multiple Kernel Learning (MKL) is to combine kernels derived from multiple sources in a data-driven way with the aim to enhance the accuracy of a target kernel machine. State-of-the-art methods of MKL have the drawback that the time required to solve the associated optimization problem grows (typically more than linearly) with the number of…
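The core object in MKL is a combined kernel built from several base kernels. A minimal sketch, assuming the common convex-combination form (nonnegative weights summing to one) with hand-picked weights rather than the learned weights an MKL solver would produce:

```python
import numpy as np

def linear_kernel(X):
    return X @ X.T

def rbf_kernel(X, gamma):
    """Gaussian RBF kernel via pairwise squared distances."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))

# Base kernels from different "sources" (here: different kernel functions).
kernels = [linear_kernel(X), rbf_kernel(X, 0.5), rbf_kernel(X, 2.0)]
weights = np.array([0.5, 0.3, 0.2])   # in real MKL, learned from data

# Convex combination of PSD kernels is again a valid (PSD) kernel.
K = sum(w * Km for w, Km in zip(weights, kernels))
print(np.linalg.eigvalsh(K).min() >= -1e-9)  # → True
```

The optimization cost the abstract mentions comes from learning `weights` jointly with the target kernel machine; the combination step itself is cheap.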

Almost all tree kernels proposed in the literature match substructures without taking into account their relative positioning with respect to one another. In this paper, we propose a novel family of kernels which explicitly focus on this type of information. Specifically, after defining a family of tree kernels based on routes between nodes, we present an…

Recent results in theoretical machine learning suggest that nice properties of the margin distribution over a training set translate into good classifier performance. The same principle is already used in SVMs and other kernel-based methods, whose associated optimization problems try to maximize the minimum of these margins. In this paper,…

- Fabio Aiolli
- ICML Unsupervised and Transfer Learning
- 2012

A crucial issue in machine learning is how to learn appropriate representations for data. Recently, much work has been devoted to kernel learning, that is, the problem of finding a good kernel matrix for a given task. This can be done in a semi-supervised learning setting by using a large set of unlabeled data and a (typically small) set of i.i.d. labeled…

- Fabio Aiolli, Alessandro Sperduti
- IJCAI
- 2003

We extend multiclass SVM to multiple prototypes per class. For this framework, we give a compact constrained quadratic problem and we suggest an efficient algorithm for its optimization that guarantees a local minimum of the objective function. An annealing process is also proposed that helps escape from local minima. Finally, we report experiments where…