Nonparametric Bayesian multitask collaborative filtering

@article{Chatzis2013NonparametricBM,
  title={Nonparametric Bayesian multitask collaborative filtering},
  author={Sotirios P. Chatzis},
  journal={Proceedings of the 22nd ACM international conference on Information \& Knowledge Management},
  year={2013}
}
  • S. Chatzis
  • Published 27 October 2013
  • Computer Science
  • Proceedings of the 22nd ACM international conference on Information & Knowledge Management
The dramatic rate at which new digital content becomes available has brought collaborative filtering systems to the epicenter of computer science research over the last decade. One of the greatest challenges collaborative filtering systems are confronted with is the data sparsity problem: users typically rate only a handful of items, so the available historical data are inadequate for effective prediction. To alleviate these issues, in this paper we propose a novel multitask collaborative… 

Robust Representations for Collaborative Filtering
Cross-domain recommender system through tag-based models
TLDR
This thesis aims to develop novel tag-based cross-domain recommendation models in disjoint domains by exploiting domain-specific tags, tag-inferred structural knowledge, and tag semantics, respectively, and provides practical guidance for handling unstructured tag data in recommendation tasks.
Deep Collaborative Filtering via Marginalized Denoising Auto-encoder
TLDR
A general deep architecture for CF is proposed by integrating matrix factorization with deep feature learning, which leads to a parsimonious fit over the latent features as indicated by its improved performance in comparison to prior state-of-art models over four large datasets for the tasks of movie/book recommendation and response prediction.
Recurrent Latent Variable Networks for Session-Based Recommendation
TLDR
This work seeks to devise a machine learning mechanism capable of extracting subtle and complex underlying temporal dynamics in the observed session data, so as to inform the recommendation algorithm, by adopting concepts from the field of Bayesian statistics, namely variational inference.
A time-sensitive personalized recommendation method based on probabilistic matrix factorization technique
TLDR
A novel time-sensitive personalized recommendation method called TSPR is proposed for movie recommendation, together with a novel context-dependent similarity measure that mines the implicit relationships among users from the user–context rating matrix.
Algorithms of Unconstrained Non-Negative Latent Factor Analysis for Recommender Systems
TLDR
This work innovatively transfers the non-negativity constraints from the decision parameters to the output LFs and connects them through a single-element-dependent mapping function; it is theoretically proven that, by making the mapping function fulfill specific conditions, the resultant model can represent the original one precisely.
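A minimal numpy sketch of the idea summarized above: the trainable decision parameters stay unconstrained, and non-negativity is enforced only on the output latent factors through an element-wise mapping, so no projection or multiplicative update is needed. The softplus mapping, learning rate, and regularization weight are illustrative assumptions; the paper's own single-element-dependent mapping and training scheme may differ.

```python
import numpy as np

def softplus(x):
    """Illustrative non-negative mapping; the paper's own mapping may differ."""
    return np.log1p(np.exp(x))

def d_softplus(x):
    """Derivative of softplus (the logistic function), needed for the chain rule."""
    return 1.0 / (1.0 + np.exp(-x))

def unconstrained_step(X, Y, u, i, r, lr=0.01, reg=0.05):
    """One SGD step on the *unconstrained* decision parameters X, Y.

    The output latent factors P = softplus(X), Q = softplus(Y) are always
    non-negative by construction.
    """
    P_u, Q_i = softplus(X[u]), softplus(Y[i])
    err = r - P_u @ Q_i                            # error on one observed rating
    # Chain rule: gradient w.r.t. the output factor, times the mapping's slope.
    X[u] -= lr * (-err * Q_i + reg * P_u) * d_softplus(X[u])
    Y[i] -= lr * (-err * P_u + reg * Q_i) * d_softplus(Y[i])
    return X, Y

# Toy usage: 3 users, 5 items, rank-4 factors; one observed rating (u=0, i=2).
rng = np.random.default_rng(1)
X = 0.1 * rng.standard_normal((3, 4))
Y = 0.1 * rng.standard_normal((5, 4))
X, Y = unconstrained_step(X, Y, u=0, i=2, r=4.0)
print(softplus(X[0]) @ softplus(Y[2]))             # estimate from non-negative factors
```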
An Efficient Second-Order Approach to Factorize Sparse Matrices in Recommender Systems
TLDR
This work proposes the Hessian-free optimization-based LF model, which is able to extract latent factors from the given incomplete matrices via a second-order optimization process, and is a promising model for implementing high-performance recommenders.
A Nonnegative Latent Factor Model for Large-Scale Sparse Matrices in Recommender Systems via Alternating Direction Method
TLDR
An alternating direction method (ADM)-based nonnegative latent factor (ANLF) model is proposed, which ensures fast convergence and high prediction accuracy, as well as the maintenance of nonnegativity constraints.
Latent Factor-Based Recommenders Relying on Extended Stochastic Gradient Descent Algorithms
TLDR
Experimental results on two HiDS matrices generated by real recommender systems show that, compared with an LF model using a standard SGD algorithm, LF models with the extended algorithms achieve: 1) higher prediction accuracy for missing data; 2) a faster convergence rate; and 3) model diversity.
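The summary above contrasts a latent factor (LF) model trained with standard SGD against versions trained with "extended" SGD algorithms, without spelling out the extensions. The sketch below shows the standard per-rating SGD update and, purely as a hypothetical example of an extension, a momentum variant; all hyperparameter values are assumptions.

```python
import numpy as np

def train_lf_sgd(ratings, n_users, n_items, k=10, lr=0.01, reg=0.05,
                 beta=0.9, epochs=20, use_momentum=False, seed=0):
    """Latent factor model fitted to observed (user, item, rating) triples.

    The plain branch is the standard SGD update; the momentum branch is only
    an illustrative "extended" update rule, not necessarily the paper's.
    """
    rng = np.random.default_rng(seed)
    P = 0.1 * rng.standard_normal((n_users, k))    # user latent factors
    Q = 0.1 * rng.standard_normal((n_items, k))    # item latent factors
    vP, vQ = np.zeros_like(P), np.zeros_like(Q)    # momentum buffers

    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - P[u] @ Q[i]                  # error on this observed rating
            gP = -err * Q[i] + reg * P[u]          # gradient w.r.t. the user factor
            gQ = -err * P[u] + reg * Q[i]          # gradient w.r.t. the item factor
            if use_momentum:                       # "extended" rule: velocity term
                vP[u] = beta * vP[u] - lr * gP
                vQ[i] = beta * vQ[i] - lr * gQ
                P[u] += vP[u]
                Q[i] += vQ[i]
            else:                                  # standard SGD step
                P[u] -= lr * gP
                Q[i] -= lr * gQ
    return P, Q

# Toy usage on a tiny sparse rating set.
toy = [(0, 0, 5.0), (0, 2, 3.0), (1, 1, 4.0), (2, 0, 1.0)]
P, Q = train_lf_sgd(toy, n_users=3, n_items=3, k=4, use_momentum=True)
print(P @ Q.T)   # full matrix of rating estimates, including the missing entries
```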

References

SHOWING 1-10 OF 34 REFERENCES
Multi-Domain Collaborative Filtering
TLDR
A probabilistic framework is proposed which uses probabilistic matrix factorization to model the rating problem in each domain and allows knowledge to be adaptively transferred across different domains by automatically learning the correlation between domains.
Fast nonparametric matrix factorization for large-scale collaborative filtering
TLDR
These experiments on EachMovie and Netflix, the two largest public benchmarks to date, demonstrate that the nonparametric models make more accurate predictions of user ratings, and are computationally comparable or sometimes even faster in training, in comparison with previous state-of-the-art parametric matrix factorization models.
Transfer learning for collaborative filtering via a rating-matrix generative model
TLDR
The proposed rating-matrix generative model (RMGM) can share the knowledge by pooling the rating data from multiple tasks even when the users and items of these tasks do not overlap, and can outperform the individual models trained separately.
Probabilistic Matrix Factorization
TLDR
The Probabilistic Matrix Factorization (PMF) model is presented, which scales linearly with the number of observations and performs well on the large, sparse, and very imbalanced Netflix dataset and is extended to include an adaptive prior on the model parameters.
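As a rough illustration of the PMF formulation summarized above: with a Gaussian likelihood on the observed ratings and zero-mean Gaussian priors on the factors, MAP estimation amounts to minimizing a squared reconstruction error over observed entries plus L2 penalties, which is why the cost scales linearly with the number of observations. The variance hyperparameters below are placeholders, and the adaptive-prior extension mentioned in the summary is not shown.

```python
import numpy as np

def pmf_neg_log_posterior(observed, U, V, sigma2=1.0, sigma2_u=1.0, sigma2_v=1.0):
    """Negative log-posterior of PMF, up to additive constants.

    observed           : list of (user, item, rating) triples
    U, V               : user / item latent factor matrices
    sigma2             : rating noise variance (Gaussian likelihood)
    sigma2_u, sigma2_v : prior variances on the factors (Gaussian priors)

    The data term sums over observed entries only, and the prior terms act as
    L2 penalties on the factors; SGD on this objective gives the usual
    regularized matrix factorization updates.
    """
    sq_err = sum((r - U[u] @ V[i]) ** 2 for u, i, r in observed)
    return (sq_err / (2.0 * sigma2)
            + np.sum(U ** 2) / (2.0 * sigma2_u)
            + np.sum(V ** 2) / (2.0 * sigma2_v))

# Toy check: three observed ratings in a 2-user x 2-item problem, rank 3.
rng = np.random.default_rng(0)
U = 0.1 * rng.standard_normal((2, 3))
V = 0.1 * rng.standard_normal((2, 3))
print(pmf_neg_log_posterior([(0, 0, 4.0), (0, 1, 2.0), (1, 1, 5.0)], U, V))
```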
Bayesian latent variable models for collaborative item rating prediction
TLDR
A Bayesian latent variable model for rating prediction that models ratings over each user's latent interests and also each item's latent topics and is able to outperform 3 common baselines, including a very competitive and modern SVD-based model.
Latent semantic models for collaborative filtering
TLDR
A new family of model-based algorithms designed for collaborative filtering is presented; these algorithms rely on a statistical modelling technique that introduces latent class variables in a mixture model setting to discover user communities and prototypical interest profiles.
Transfer Learning in Collaborative Filtering for Sparsity Reduction
TLDR
This paper discovers the principle coordinates of both users and items in the auxiliary data matrices, and transfers them to the target domain in order to reduce the effect of data sparsity.
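A toy sketch of the transfer idea described above, under two simplifying assumptions not taken from the paper: the auxiliary matrix is dense enough for a plain SVD, and the auxiliary and target domains share the same users and items, so transfer can be realized as a quadratic penalty (weight tau, an invented parameter) pulling the target factors toward the auxiliary coordinates.

```python
import numpy as np

def principal_coordinates(R_aux, k):
    """Top-k left/right singular vectors of the auxiliary rating matrix.

    These play the role of the coordinates transferred to the target domain;
    a real auxiliary matrix would be sparse and need a sparse/partial SVD.
    """
    U, s, Vt = np.linalg.svd(R_aux, full_matrices=False)
    return U[:, :k], Vt[:k, :].T

def target_objective(observed, U, V, U0, V0, reg=0.1, tau=1.0):
    """Squared error on observed target ratings, plus an L2 penalty, plus a
    transfer penalty pulling the target factors toward the transferred
    coordinates U0, V0."""
    sq_err = sum((r - U[u] @ V[i]) ** 2 for u, i, r in observed)
    return (sq_err
            + reg * (np.sum(U ** 2) + np.sum(V ** 2))
            + tau * (np.sum((U - U0) ** 2) + np.sum((V - V0) ** 2)))

# Toy usage: dense 3x3 auxiliary matrix, two observed target ratings.
R_aux = np.array([[5.0, 4.0, 1.0], [4.0, 5.0, 1.0], [1.0, 1.0, 5.0]])
U0, V0 = principal_coordinates(R_aux, k=2)
print(target_objective([(0, 1, 4.0), (2, 2, 5.0)], U0.copy(), V0.copy(), U0, V0))
```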
Multi-task Learning for Bayesian Matrix Factorization
  • Chao Yuan
  • Computer Science
  • 2011 IEEE 11th International Conference on Data Mining
  • 2011
TLDR
This work extends the previous work of Bayesian matrix factorization with Dirichlet process mixture into a multi-task learning approach by sharing latent parameters among different tasks, which does not require object identities and thus is more widely applicable.
Factorization meets the neighborhood: a multifaceted collaborative filtering model
TLDR
The factor and neighborhood models can now be smoothly merged, thereby building a more accurate combined model; a new evaluation metric is also suggested, which highlights the differences among methods based on their performance at a top-K recommendation task.
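A pared-down sketch of what a merged factor-plus-neighborhood prediction can look like, assuming a baseline-with-biases setup; all names below are illustrative, and the paper's full model also includes implicit-feedback terms that are omitted here.

```python
import numpy as np

def merged_prediction(u, i, mu, b_user, b_item, P, Q, W, rated, ratings):
    """Simplified merged factor + neighborhood prediction for user u, item i.

    mu, b_user, b_item : global mean and user / item rating biases
    P, Q               : user / item latent factors (the "factor" part)
    W                  : learned item-item weights, a 2-D array (the "neighborhood" part)
    rated[u]           : items user u has rated; ratings[(u, j)] their values
    """
    baseline = mu + b_user[u] + b_item[i]
    factor_term = P[u] @ Q[i]
    items = rated[u]
    if not items:
        return baseline + factor_term
    # Weighted residuals of the user's other ratings relative to their baselines.
    resid = sum(W[i, j] * (ratings[(u, j)] - (mu + b_user[u] + b_item[j]))
                for j in items)
    return baseline + factor_term + resid / np.sqrt(len(items))
```

In a merged model of this kind, both parts share a single squared-error objective over the observed ratings and are trained jointly, rather than being fitted separately and blended afterwards.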
Bayesian Matrix Factorization with Side Information and Dirichlet Process Mixtures
TLDR
A Bayesian matrix factorization model that performs regression against side information known about the data in addition to the observations is introduced and applied to the Netflix Prize problem of predicting movie ratings given an incomplete user-movie ratings matrix.
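A minimal sketch of the "regression against side information" idea in the summary above: the usual inner-product prediction is augmented with a linear term over pair-specific side features. The feature lookup and weight vector are assumptions, and the Dirichlet process mixture prior used in the paper is not modelled here.

```python
import numpy as np

def predict_with_side_info(u, i, P, Q, w, side_features):
    """Combine a latent-factor prediction with a regression on side
    information (e.g. user demographics or item metadata).

    side_features[(u, i)] is an assumed per-pair feature vector.
    """
    return P[u] @ Q[i] + w @ side_features[(u, i)]

# Toy usage with invented shapes: 2 users, 2 items, rank 3, 2 side features.
rng = np.random.default_rng(0)
P, Q = rng.standard_normal((2, 3)), rng.standard_normal((2, 3))
w = rng.standard_normal(2)
side = {(0, 1): np.array([1.0, 0.0])}      # e.g. hypothetical [is_weekend, is_sequel]
print(predict_with_side_info(0, 1, P, Q, w, side))
```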