Multi-Behavioral Sequential Prediction with Recurrent Log-Bilinear Model

@article{Liu2017MultiBehavioralSP,
  title={Multi-Behavioral Sequential Prediction with Recurrent Log-Bilinear Model},
  author={Q. Liu and Shu Wu and Liang Wang},
  journal={IEEE Transactions on Knowledge and Data Engineering},
  year={2017},
  volume={29},
  pages={1254-1267}
}
  • Published 25 August 2016
  • Computer Science
With the rapid growth of Internet applications, sequential prediction in collaborative filtering has become an emerging and crucial task. Given the behavioral history of a specific user, predicting his or her next choice plays a key role in improving various online services. Meanwhile, more and more scenarios involve multiple types of behaviors, while existing works mainly study sequences with a single type of behavior. As a widely used approach, Markov chain based models are based on a…
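As I understand the recurrent log-bilinear (RLBL) family referred to above, a user representation is advanced over a sliding window of recent behaviors via learned transition matrices. The following is a minimal sketch only; the window size, the behavior-specific matrices M, and all shapes are illustrative assumptions, not the paper's exact formulation.

import numpy as np

rng = np.random.default_rng(0)
d, n_items, n_behaviors, window = 32, 1000, 3, 4

item_emb = rng.normal(scale=0.1, size=(n_items, d))      # item representations
R = np.eye(d)                                             # recurrent transition matrix
C = rng.normal(scale=0.1, size=(window, d, d))            # position-specific matrices
M = rng.normal(scale=0.1, size=(n_behaviors, d, d))       # behavior-specific matrices (assumed)

def advance_user_state(r_prev, items, behaviors):
    """One recurrent log-bilinear step over a window of (item, behavior) pairs."""
    r = R @ r_prev
    for j, (i, b) in enumerate(zip(items, behaviors)):
        r = r + C[j] @ (M[b] @ item_emb[i])               # position- and behavior-aware transition
    return r

def score_next_items(r_user):
    """Log-bilinear score of every candidate item given the current user state."""
    return item_emb @ r_user

r_u = np.zeros(d)
r_u = advance_user_state(r_u, items=[10, 42, 7, 99], behaviors=[0, 2, 1, 0])
top5 = np.argsort(-score_next_items(r_u))[:5]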
Citations

Attention with Long-Term Interval-Based Gated Recurrent Units for Modeling Sequential User Behaviors
TLDR: A network featuring Attention with Long-term Interval-based Gated Recurrent Units (ALI-GRU) to model temporal sequences of user actions, together with a specially designed matrix-form attention function that learns weights of both long-term preferences and short-term user intents automatically.
Recurrent Convolutional Neural Network for Sequential Recommendation
TLDR: The experimental results show that the RCNN model significantly outperforms state-of-the-art approaches on sequential recommendation; it leverages the convolutional operation of a Convolutional Neural Network to extract short-term sequential patterns among recurrent hidden states.
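A rough illustration of that idea, convolving over a sequence of recurrent hidden states to pick up short-term patterns; the layer sizes, kernel width, and pooling below are my assumptions, not the authors' configuration.

import torch
import torch.nn as nn

class TinyRCNN(nn.Module):
    def __init__(self, n_items, d=64, kernel=3):
        super().__init__()
        self.emb = nn.Embedding(n_items, d)
        self.gru = nn.GRU(d, d, batch_first=True)          # long-term sequential signal
        self.conv = nn.Conv1d(d, d, kernel_size=kernel)    # short-term patterns over hidden states
        self.out = nn.Linear(2 * d, n_items)

    def forward(self, item_ids):                            # item_ids: (batch, seq_len)
        h, _ = self.gru(self.emb(item_ids))                 # (batch, seq_len, d)
        short = self.conv(h.transpose(1, 2)).amax(dim=-1)   # max-pool conv features over time
        feats = torch.cat([h[:, -1], short], dim=-1)        # last hidden state + short-term features
        return self.out(feats)                              # scores over next items

model = TinyRCNN(n_items=1000)
scores = model(torch.randint(0, 1000, (8, 20)))             # (8, 1000)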
TiSSA: A Time Slice Self-Attention Approach for Modeling Sequential User Behaviors
TLDR: A novel Time Slice Self-Attention mechanism is introduced into RNNs for better modeling sequential user behaviors; it utilizes time-interval-based gated recurrent units to exploit the temporal dimension when encoding user actions and has a specially designed time-slice hierarchical self-attention function.
A Hierarchical Contextual Attention-based GRU Network for Sequential Recommendation
TLDR: A Hierarchical Contextual Attention-based GRU (HCA-GRU) network that fuses the current hidden state with a contextual hidden state built by the attention mechanism, leading to a more suitable representation of the user's overall interest.
Multi-Context Integrated Deep Neural Network Model for Next Location Prediction
TLDR: A multi-context integrated deep neural network model (MCI-DNN) is proposed to improve the accuracy of next location prediction; it adopts embedding representation techniques to automatically learn dense feature representations of input contexts.
Domain Switch-Aware Holistic Recurrent Neural Network for Modeling Multi-Domain User Behavior
TLDR: A practical but overlooked phenomenon in sequential behaviors across multiple domains, i.e., domain switch, where two successive behaviors belong to different domains, is introduced, and a Domain Switch-Aware Holistic Recurrent Neural Network (DS-HRNN) is proposed that effectively shares the knowledge extracted from multiple domains by systematically handling domain switches in the multi-domain scenario.
Predicting Human Mobility with Semantic Motivation via Multi-task Attentional Recurrent Networks
TLDR: In DeepMove, an attentional recurrent network for mobility prediction from lengthy and sparse trajectories, a multi-modal embedding recurrent neural network is designed to capture complicated sequential transitions by jointly embedding the multiple factors that govern human mobility.
MV-RNN: A Multi-View Recurrent Neural Network for Sequential Recommendation
TLDR: Experiments on two real-world datasets show that MV-RNN can effectively generate the personalized ranking list, tackle the missing modalities problem, and significantly alleviate the item cold start problem.
HTDA: Hierarchical time-based directional attention network for sequential user behavior modeling
  • Zhenzhen Sheng, Tao Zhang, Yuejie Zhang
  • Computer Science
  • Neurocomputing
  • 2021
TLDR: A Hierarchical Time-based Directional Attention (HTDA) network is proposed to enhance sequential recommendation by applying fine-grained user intention representations and dynamic user preference representations with rich global sequential interaction features.
CSAN: Contextual Self-Attention Network for User Sequential Recommendation
TLDR: A unified Contextual Self-Attention Network (CSAN) is proposed to address three properties of heterogeneous user behaviors, which are projected into a common latent semantic space and fed into a feature-wise self-attention network to capture the polysemy of user behaviors.
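A rough sketch of that general recipe, projecting different behavior types into one latent space and applying self-attention over the resulting sequence; the per-behavior-type linear projections, the single multi-head attention layer, and all sizes are assumptions, not the authors' architecture.

import torch
import torch.nn as nn

class HeterogeneousBehaviorEncoder(nn.Module):
    def __init__(self, n_items, n_behaviors, d=64, heads=4):
        super().__init__()
        self.item_emb = nn.Embedding(n_items, d)
        self.project = nn.ModuleList([nn.Linear(d, d) for _ in range(n_behaviors)])
        self.attn = nn.MultiheadAttention(d, heads, batch_first=True)
        self.out = nn.Linear(d, n_items)

    def forward(self, item_ids, behavior_ids):             # both: (batch, seq_len)
        x = self.item_emb(item_ids)                        # (batch, seq_len, d)
        z = torch.zeros_like(x)
        for b, proj in enumerate(self.project):            # route each position through its behavior type's projection
            mask = (behavior_ids == b).unsqueeze(-1)       # (batch, seq_len, 1)
            z = torch.where(mask, proj(x), z)              # shared latent space for all behavior types
        h, _ = self.attn(z, z, z)                          # self-attention over the unified sequence
        return self.out(h[:, -1])                          # scores for the next item

model = HeterogeneousBehaviorEncoder(n_items=1000, n_behaviors=3)
scores = model(torch.randint(0, 1000, (8, 20)), torch.randint(0, 3, (8, 20)))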

References

SHOWING 1-10 OF 52 REFERENCES
Context-Aware Sequential Recommendation
TLDR: Experimental results show that the proposed CA-RNN model yields significant improvements over state-of-the-art sequential recommendation methods and context-aware recommendation methods on two public datasets, i.e., the Taobao dataset and the Movielens-1M dataset.
Predicting the Next Location: A Recurrent Model with Spatial and Temporal Contexts
TLDR: RNN is extended and a novel method called Spatial Temporal Recurrent Neural Networks (ST-RNN) is proposed, which can model local temporal and spatial contexts in each layer, with time-specific transition matrices for different time intervals and distance-specific transition matrices for different geographical distances.
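A rough sketch of that core idea; the bucketing of time intervals and distances and all shapes are assumptions (the original interpolates between transition matrices rather than hard-bucketing).

import numpy as np

rng = np.random.default_rng(1)
d, n_locations = 16, 500
n_time_bins, n_dist_bins = 4, 4

loc_emb = rng.normal(scale=0.1, size=(n_locations, d))
T = rng.normal(scale=0.1, size=(n_time_bins, d, d))   # time-interval-specific transition matrices
S = rng.normal(scale=0.1, size=(n_dist_bins, d, d))   # distance-specific transition matrices
time_edges = np.array([1.0, 6.0, 24.0])               # hours (assumed bucketing)
dist_edges = np.array([1.0, 5.0, 20.0])               # km (assumed bucketing)

def step(h_prev, loc, dt_hours, dist_km):
    """One recurrent step whose transition depends on elapsed time and travelled distance."""
    t_bin = np.searchsorted(time_edges, dt_hours)
    s_bin = np.searchsorted(dist_edges, dist_km)
    return np.tanh(T[t_bin] @ h_prev + S[s_bin] @ loc_emb[loc])

h = np.zeros(d)
for loc, dt, dist in [(3, 0.5, 0.8), (17, 7.0, 12.0), (99, 30.0, 2.5)]:
    h = step(h, loc, dt, dist)
scores = loc_emb @ h                                   # scores for the next location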
A Dynamic Recurrent Model for Next Basket Recommendation
TLDR: This work proposes a novel model, Dynamic REcurrent bAsket Model (DREAM), based on Recurrent Neural Network (RNN), which not only learns a dynamic representation of a user but also captures global sequential features among baskets.
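A minimal sketch of that scheme: pool item embeddings within each basket into a basket vector, then run a recurrent network over the basket vectors. The pooling choice, layer sizes, and plain RNN cell are assumptions for illustration.

import torch
import torch.nn as nn

class TinyDream(nn.Module):
    def __init__(self, n_items, d=64):
        super().__init__()
        self.emb = nn.Embedding(n_items, d, padding_idx=0)
        self.rnn = nn.RNN(d, d, batch_first=True)
        self.out = nn.Linear(d, n_items)

    def forward(self, baskets):                             # baskets: (batch, n_baskets, basket_size), 0 = padding
        basket_repr = self.emb(baskets).max(dim=2).values   # pool item embeddings inside each basket
        h, _ = self.rnn(basket_repr)                        # dynamic user representation over baskets
        return self.out(h[:, -1])                           # scores for the next basket's items

model = TinyDream(n_items=1000)
scores = model(torch.randint(0, 1000, (4, 5, 6)))           # 4 users, 5 baskets of up to 6 items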
Sequential Click Prediction for Sponsored Search with Recurrent Neural Networks
TLDR: A novel framework based on Recurrent Neural Networks (RNN) is introduced that directly models the dependency of click prediction on the user's sequential behaviors through the recurrent structure of the RNN.
Learning Hierarchical Representation Model for Next Basket Recommendation
TLDR: This paper introduces a novel recommendation approach, namely the hierarchical representation model (HRM), which can well capture both sequential behavior and users' general taste by involving transaction and user representations in prediction.
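A minimal sketch of that two-level aggregation, assuming max pooling at both levels (the paper also studies other aggregation operators); the factor shapes below are placeholders.

import numpy as np

rng = np.random.default_rng(2)
d, n_items, n_users = 16, 1000, 100
item_emb = rng.normal(scale=0.1, size=(n_items, d))
user_emb = rng.normal(scale=0.1, size=(n_users, d))

def hybrid_representation(user, last_transaction, agg=np.max):
    """Two-level aggregation: items of the last transaction first, then with the user vector."""
    transaction_vec = agg(item_emb[last_transaction], axis=0)        # level 1: sequential behavior
    return agg(np.stack([user_emb[user], transaction_vec]), axis=0)  # level 2: add general taste

hybrid = hybrid_representation(user=7, last_transaction=[12, 77, 301])
scores = item_emb @ hybrid          # ranking scores for the next transaction's items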
Hierarchical Recurrent Neural Networks for Long-Term Dependencies
TLDR: This paper proposes to use a more general type of a priori knowledge, namely that the temporal dependencies are structured hierarchically, which implies that long-term dependencies are represented by variables with a long time scale.
A Clockwork RNN
TLDR: This paper introduces a simple, yet powerful modification to the simple RNN architecture, the Clockwork RNN (CW-RNN), in which the hidden layer is partitioned into separate modules, each processing inputs at its own temporal granularity, making computations only at its prescribed clock rate.
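A minimal sketch of that clocking rule: a module updates only at steps divisible by its period and otherwise keeps its state. The periods and sizes are placeholders, and the original's block-triangular connectivity between slow and fast modules is omitted for brevity.

import numpy as np

rng = np.random.default_rng(3)
n_modules, module_size, d_in = 4, 8, 16
d = n_modules * module_size
periods = [1, 2, 4, 8]                                 # each module's clock period (assumed)

W_h = rng.normal(scale=0.1, size=(d, d))
W_in = rng.normal(scale=0.1, size=(d, d_in))

def clockwork_step(h, x, t):
    """Only modules whose clock period divides the current step t update; the rest keep their state."""
    pre = np.tanh(W_h @ h + W_in @ x)
    h_new = h.copy()
    for g, period in enumerate(periods):
        if t % period == 0:
            sl = slice(g * module_size, (g + 1) * module_size)
            h_new[sl] = pre[sl]
    return h_new

h = np.zeros(d)
for t, x in enumerate(rng.normal(size=(10, d_in))):
    h = clockwork_step(h, x, t)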
Factorizing personalized Markov chains for next-basket recommendation
TLDR: This paper introduces an adaptation of the Bayesian Personalized Ranking (BPR) framework for sequential basket data and shows that the FPMC model outperforms both the common matrix factorization and the unpersonalized MC model, both learned with and without factorization.
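The factorized scoring rule can be sketched as below: a user–item term plus an item–item transition term averaged over the previous basket. The factor dimensionality and initialization are placeholders, and the BPR learning loop is omitted.

import numpy as np

rng = np.random.default_rng(4)
k, n_users, n_items = 16, 100, 1000
V_ui = rng.normal(scale=0.1, size=(n_users, k))   # user factors (user-item interaction)
V_iu = rng.normal(scale=0.1, size=(n_items, k))   # item factors (user-item interaction)
V_il = rng.normal(scale=0.1, size=(n_items, k))   # item factors (item-to-last-item interaction)
V_li = rng.normal(scale=0.1, size=(n_items, k))   # last-item factors

def fpmc_score(user, item, last_basket):
    """User preference plus the averaged factorized transition from the items of the last basket."""
    mc = np.mean([V_il[item] @ V_li[l] for l in last_basket])
    return V_ui[user] @ V_iu[item] + mc

scores = np.array([fpmc_score(user=5, item=i, last_basket=[12, 88, 430]) for i in range(n_items)])
top10 = np.argsort(-scores)[:10]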
Just in Time Recommendations: Modeling the Dynamics of Boredom in Activity Streams
TLDR: This paper analyzes user activity streams and shows that users' temporal consumption of familiar items is driven by boredom, and models this behavior using a Hidden Semi-Markov Model for the gaps between user consumption activities.
Context-aware music recommendation based on latent topic sequential patterns
TLDR: This paper presents a context-aware music recommender system which infers contextual information based on the most recent sequence of songs liked by the user, and uses topic modeling to determine a set of latent topics for each song, representing different contexts.