Towards Neural Mixture Recommender for Long Range Dependent User Sequences

@article{Tang2019TowardsNM,
  title={Towards Neural Mixture Recommender for Long Range Dependent User Sequences},
  author={Jiaxi Tang and F. Belletti and S. Jain and Minmin Chen and Alex Beutel and Can Xu and Ed Huai-hsin Chi},
  journal={The World Wide Web Conference},
  year={2019}
}
Understanding temporal dynamics has proved highly valuable for accurate recommendation. Sequential recommenders have been successful in modeling the dynamics of users and items over time. However, different model architectures excel at capturing different temporal ranges and dynamics, and distinct application contexts require adapting to diverse behaviors. In this paper we examine how to build a model that can make use of different temporal ranges and dynamics depending on the request…
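The mixture idea described in the abstract can be illustrated with a small sketch: two sequence encoders with different effective ranges (here a 1-D convolution for short-range patterns and a GRU for long-range dependence) are combined through a learned softmax gate that weighs the experts per request. This is a minimal, hypothetical PyTorch illustration under those assumptions, not the paper's exact architecture; all module names and dimensions are illustrative.

# Hypothetical sketch of a gated mixture over sequence encoders (not the paper's exact code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureSequenceRecommender(nn.Module):
    """Combine a short-range and a long-range encoder with a learned gate."""

    def __init__(self, num_items, embed_dim=64, hidden_dim=64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, embed_dim, padding_idx=0)
        # Long-range expert: GRU over the full user history.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        # Short-range expert: 1-D convolution over recent items.
        self.conv = nn.Conv1d(embed_dim, hidden_dim, kernel_size=3, padding=1)
        # Gate: produces per-request mixture weights over the two experts.
        self.gate = nn.Linear(2 * hidden_dim, 2)
        # Output layer scores all items from the mixed representation.
        self.out = nn.Linear(hidden_dim, num_items)

    def forward(self, item_seq):
        # item_seq: (batch, seq_len) of item ids, 0 = padding
        x = self.item_emb(item_seq)                       # (B, T, E)
        _, h_long = self.gru(x)                           # (1, B, H)
        h_long = h_long.squeeze(0)                        # (B, H)
        h_short = self.conv(x.transpose(1, 2)).relu()     # (B, H, T)
        h_short = h_short.max(dim=2).values               # (B, H)
        # Softmax gate mixes the experts per request.
        w = F.softmax(self.gate(torch.cat([h_long, h_short], dim=1)), dim=1)
        mixed = w[:, :1] * h_long + w[:, 1:] * h_short    # (B, H)
        return self.out(mixed)                            # logits over next item

# Usage: score the next item for a toy batch of user sequences.
model = MixtureSequenceRecommender(num_items=1000)
logits = model(torch.randint(1, 1000, (4, 20)))
print(logits.shape)  # torch.Size([4, 1000])

The gate is what lets the model adapt per request: when the gate weight on the long-range expert dominates, predictions lean on the full history; when the short-range weight dominates, they lean on the most recent items.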
37 Citations
  • Time-weighted Attentional Session-Aware Recommender System
  • Déjà vu: A Contextualized Temporal Attention Mechanism for Sequential Recommendation
  • Towards recommendation with user action sequences
  • Multitask Mixture of Sequential Experts for User Activity Streams
  • Temporal Heterogeneous Interaction Graph Embedding for Next-Item Recommendation
  • SDM: Sequential Deep Matching Model for Online Large-scale Recommender System
  • Modeling the Past and Future Contexts for Session-based Recommendation
  • Quantifying Long Range Dependence in Language and User Behavior to improve RNNs
