Enhancing Top-N Item Recommendations by Peer Collaboration
@article{Sun2022EnhancingTI,
  title   = {Enhancing Top-N Item Recommendations by Peer Collaboration},
  author  = {Yang Sun and Fajie Yuan and Min Yang and Alexandros Karatzoglou and Shen Li and Xiaoyan Zhao},
  journal = {Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year    = {2022}
}
Deep neural network (DNN)-based recommender models often require numerous parameters to achieve remarkable performance. However, this inevitably brings redundant neurons, a phenomenon referred to as over-parameterization. In this paper, we exploit this redundancy phenomenon for recommender systems (RS), and propose a top-N item recommendation framework called PCRec that leverages collaborative training of two recommender models of the same network structure, termed peer collaboration…
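The abstract describes training two structurally identical recommenders that collaborate by exploiting each other's redundant (over-parameterized) weights. The sketch below is only an illustration of that general idea on a toy linear scorer, not the paper's actual algorithm: the `peer_refresh` criterion (replacing one's lowest-magnitude parameters with the peer's values) is a hypothetical stand-in for whatever importance measure PCRec uses.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: user features X and ideal item scores Y = X @ W_true.
# Each "peer" is just a weight matrix of the same shape, scoring items.
n_users, n_items, dim = 8, 20, 4
X = rng.normal(size=(n_users, dim))       # user features
W_true = rng.normal(size=(dim, n_items))  # ground-truth scorer
Y = X @ W_true                            # target item scores

def sgd_step(W, lr=0.1):
    """One gradient step on mean squared score error."""
    grad = X.T @ (X @ W - Y) / n_users
    return W - lr * grad

def peer_refresh(W_self, W_peer, frac=0.25):
    """Hypothetical peer collaboration: treat the lowest-magnitude
    fraction of one's own parameters as redundant and refresh them
    with the peer's values at the same positions."""
    k = int(frac * W_self.size)
    idx = np.argsort(np.abs(W_self), axis=None)[:k]
    W_new = W_self.copy()
    W_new.flat[idx] = W_peer.flat[idx]
    return W_new

# Two peers with identical structure, trained in parallel with
# occasional parameter exchange.
W_a = rng.normal(size=(dim, n_items))
W_b = rng.normal(size=(dim, n_items))
for step in range(500):
    W_a, W_b = sgd_step(W_a), sgd_step(W_b)
    if step % 50 == 49:  # exchange periodically
        W_a, W_b = peer_refresh(W_a, W_b), peer_refresh(W_b, W_a)

err = float(np.mean((X @ W_a - Y) ** 2))
print(round(err, 4))
```

Because both peers descend toward the same optimum, the exchanged parameters act as a mild regularizing refresh rather than a disruption; the actual framework would apply this idea to full DNN recommenders with a principled importance criterion.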
References
Showing 1-10 of 46 references
A Generic Network Compression Framework for Sequential Recommender Systems
- SIGIR, 2020
A compressed sequential recommendation framework, termed CpRec, in which two generic model-shrinking techniques are employed, including a block-wise adaptive decomposition that approximates the input and softmax matrices by exploiting the fact that items in SRS obey a long-tailed distribution.
Session-based Recommendations with Recurrent Neural Networks
- ICLR, 2016
It is argued that by modeling the whole session, more accurate recommendations can be provided by an RNN-based approach for session-based recommendations, and introduced several modifications to classic RNNs such as a ranking loss function that make it more viable for this specific problem.
StackRec: Efficient Training of Very Deep Sequential Recommender Models by Layer Stacking
- arXiv, 2020
This work presents StackRec, a simple but very efficient training framework for deep SR models by layer stacking, and proposes progressively stacking such pre-trained residual layers/blocks so as to yield a deeper but easier-to-train SR model.
StackRec: Efficient Training of Very Deep Sequential Recommender Models by Iterative Stacking
- SIGIR, 2021
This work proposes the stacking operation on the pre-trained layers/blocks to transfer knowledge from a shallower model to a deep model, then performs iterative stacking so as to yield a much deeper but easier-to-train SR model.
A Simple Convolutional Generative Network for Next Item Recommendation
- WSDM, 2019
A simple, but very effective generative model that is capable of learning high-level representation from both short- and long-range item dependencies is introduced that attains state-of-the-art accuracy with less training time in the next item recommendation task.
Future Data Helps Training: Modeling Future Contexts for Session-based Recommendation
- WWW, 2020
A new encoder-decoder framework named Gap-filling based Recommender (GRec), which trains the encoder and decoder by a gap-filling mechanism and significantly outperforms the state-of-the-art sequential recommendation methods.
Parameter-Efficient Transfer from Sequential Behaviors for User Modeling and Recommendation
- SIGIR, 2020
This paper develops a parameter-efficient transfer learning architecture, termed PeterRec, which can be configured on-the-fly for various downstream tasks; PeterRec performs efficient transfer learning in multiple domains, achieving comparable or sometimes better performance relative to fine-tuning all model parameters.
DeepFM: A Factorization-Machine based Neural Network for CTR Prediction
- IJCAI, 2017
This paper shows that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions, and combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture.
Self-Attentive Sequential Recommendation
- 2018 IEEE International Conference on Data Mining (ICDM), 2018
Extensive empirical studies show that the proposed self-attention based sequential model (SASRec) outperforms various state-of-the-art sequential models (including MC/CNN/RNN-based approaches) on both sparse and dense datasets.
DAML: Dual Attention Mutual Learning between Ratings and Reviews for Item Recommendation
- KDD, 2019
Experiments show that DAML achieves significantly better rating prediction accuracy compared to the state-of-the-art methods, and the attention mechanism can highlight the relevant information in reviews to increase the interpretability of rating prediction.