Two-stage Model for Automatic Playlist Continuation at Scale

@inproceedings{Volkovs2018TwostageMF,
  title={Two-stage Model for Automatic Playlist Continuation at Scale},
  author={Maksims Volkovs and Himanshu Rai and Zhaoyue Cheng and Ga Wu and Yichao Lu and Scott Sanner},
  booktitle={Proceedings of the ACM Recommender Systems Challenge 2018},
  year={2018}
}
Automatic playlist continuation is a prominent problem in music recommendation. A significant portion of music consumption now happens online through playlists and playlist-like online radio stations. Manually compiling playlists for consumers is a highly time-consuming task that is difficult to do at scale, given the diversity of tastes and the large amount of musical content available. Consequently, automated playlist continuation has received increasing attention recently [1, 7, 11]. The 2018…
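The retrieve-then-rank pattern that the paper's title refers to can be sketched in a few lines. Everything below (the scoring stubs, the function names) is hypothetical illustration, not the authors' actual models:

```python
from typing import List

def cheap_score(playlist: List[int], track: int) -> float:
    # Stage-1 stub: favour tracks numerically close to the playlist's tracks
    # (a stand-in for a fast latent-factor or co-occurrence score).
    return -min(abs(track - t) for t in playlist)

def expensive_score(playlist: List[int], track: int) -> float:
    # Stage-2 stub: a costlier scorer applied only to the short-list
    # (a stand-in for a learned ranking model).
    return -sum(abs(track - t) for t in playlist)

def continue_playlist(playlist, catalogue, shortlist=100, top=10):
    # Stage 1: retrieve a small candidate pool from the full catalogue.
    candidates = sorted(catalogue,
                        key=lambda t: cheap_score(playlist, t),
                        reverse=True)[:shortlist]
    # Stage 2: re-rank only those candidates with the expensive model.
    return sorted(candidates,
                  key=lambda t: expensive_score(playlist, t),
                  reverse=True)[:top]

recs = continue_playlist([10, 12, 14], list(range(1000)),
                         shortlist=20, top=5)
```

In a real system the first stage would typically score millions of tracks cheaply (e.g. with collaborative-filtering embeddings), leaving only a few thousand candidates for the heavier second-stage model.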
An Analysis of Approaches Taken in the ACM RecSys Challenge 2018 for Automatic Music Playlist Continuation
TLDR
The ACM Recommender Systems Challenge 2018 focused on the task of automatic music playlist continuation, a form of the more general task of sequential recommendation: given a playlist, recommend up to 500 tracks that fit its target characteristics.
Representation, Exploration and Recommendation of Music Playlists
TLDR
This work formulates the problem of learning a fixed-length playlist representation in an unsupervised manner, using sequence-to-sequence (Seq2seq) models that interpret playlists as sentences and songs as words, and compares the model with two other encoding architectures as baselines.
Using Latent Semantics of Playlist Titles and Descriptions to Enhance Music Recommendations
Music playlists, either user-generated or curated by music streaming services, often come with titles and descriptions. While crucial to music recommendations, leveraging titles and descriptions is…
Music Artist Classification with Convolutional Recurrent Neural Networks
  • Zain Nasrullah, Yue Zhao
  • Computer Science, Engineering
  • 2019 International Joint Conference on Neural Networks (IJCNN)
  • 2019
TLDR
The results improve upon baseline works, verify the influence of the producer effect on classification performance and demonstrate the trade-offs between audio length and training set size.
Evaluating Recommender System Algorithms for Generating Local Music Playlists
TLDR
This paper alters the standard evaluation procedure such that the algorithms only rank tracks by local artists for each of the eight different cities, and finds that the neighborhood-based approach (IIN) performs best for long-tail local music recommendation.
Modeling Popularity and Temporal Drift of Music Genre Preferences
TLDR
A novel user modeling approach, BLLu, is introduced that takes into account the popularity of music genres as well as temporal drifts of user listening behavior, adopting a psychological model that describes how humans access information in their memory.
Consistency-Aware Recommendation for User-Generated Item List Continuation
TLDR
A generalizable approach is proposed that models the consistency of item lists based on human curation patterns, can therefore be deployed across a wide range of item types, and can dynamically adapt as user preferences evolve.
Why Are Deep Learning Models Not Consistently Winning Recommender Systems Competitions Yet?: A Position Paper
TLDR
This position paper investigates possible reasons why DL-based models do not consistently win recommendation-related machine learning competitions, considering factors such as the characteristics and complexity of the problem settings, datasets, and DL methods.
Skip prediction using boosting trees based on acoustic features of tracks in sessions
TLDR
The approach to this problem and the final system submitted to the challenge are described; the system combines the predictions of multiple boosting-tree models trained on features extracted from the sessions and the tracks.
Noise Contrastive Estimation for One-Class Collaborative Filtering
TLDR
NCE item embeddings combined with a personalized user model from PLRec produce superior recommendations that adequately account for popularity bias; an analysis of the popularity distribution of recommended items shows that NCE-PLRec distributes recommendations uniformly over the popularity spectrum, while other methods exhibit distinct biases towards specific popularity subranges.

References

SHOWING 1-10 OF 23 REFERENCES
Recsys challenge 2018: automatic music playlist continuation
TLDR
The ACM Recommender Systems Challenge 2018 focused on automatic music playlist continuation, a form of the more general task of sequential recommendation: recommending up to 500 tracks that fit the target characteristics of the original playlist.
Automatic playlist generation based on tracking user’s listening habits
TLDR
Two simple algorithms are developed that track listening habits and form a listener model (a profile of listening habits); the listener model is then used for automatic playlist generation.
Large-scale user modeling with recurrent neural networks for music discovery on multiple time scales
TLDR
This paper presents a new approach to model users through recurrent neural networks by sequentially processing consumed items, represented by any type of embeddings and other context features, and obtains semantically rich user representations, which capture a user's musical taste over time.
Steerable Playlist Generation by Learning Song Similarity from Radio Station Playlists
TLDR
This paper demonstrates a method for learning song transition probabilities from audio features extracted from songs played in professional radio station playlists; it generates steerable playlists by choosing the next song not simply based on that prior, but on a tag cloud that the user can manipulate to express the high-level characteristics of the music they wish to listen to.
Deep Neural Networks for YouTube Recommendations
TLDR
This paper details a deep candidate generation model and then describes a separate deep ranking model and provides practical lessons and insights derived from designing, iterating and maintaining a massive recommendation system with enormous user-facing impact.
Language Modeling with Gated Convolutional Networks
TLDR
A finite-context approach through stacked convolutions is developed, which can be more efficient since it allows parallelization over sequential tokens; this is the first time a non-recurrent approach is competitive with strong recurrent models on these large-scale language tasks.
Factorization meets the neighborhood: a multifaceted collaborative filtering model
TLDR
The factor and neighborhood models can now be smoothly merged, thereby building a more accurate combined model; a new evaluation metric is also suggested, which highlights the differences among methods based on their performance at a top-K recommendation task.
An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
TLDR
A systematic evaluation of generic convolutional and recurrent architectures for sequence modeling concludes that the common association between sequence modeling and recurrent networks should be reconsidered, and that convolutional networks should be regarded as a natural starting point for sequence modeling tasks.
Collaborative Filtering for Implicit Feedback Datasets
TLDR
This work identifies unique properties of implicit feedback datasets and proposes treating the data as an indication of positive and negative preference associated with vastly varying confidence levels, which leads to a factor model especially tailored for implicit feedback recommenders.
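The confidence-weighting idea summarized above (following Hu, Koren & Volinsky) can be written down directly. The counts, dimensions, and `alpha` value below are illustrative only, not from any of the cited papers:

```python
import numpy as np

# Implicit-feedback reformulation: raw play counts R become binary
# preferences P, with per-observation confidences C = 1 + alpha * R.
alpha = 40.0
R = np.array([[3.0, 0.0, 1.0],
              [0.0, 5.0, 0.0]])        # play counts (implicit feedback)
P = (R > 0).astype(float)              # preference: did the user consume it?
C = 1.0 + alpha * R                    # confidence grows with observed count

def loss(X, Y, lam=0.1):
    # Confidence-weighted squared loss over ALL user-item pairs, plus
    # L2 regularization on user factors X and item factors Y.
    E = P - X @ Y.T
    return float(np.sum(C * E**2) + lam * (np.sum(X**2) + np.sum(Y**2)))
```

Because every (user, item) pair contributes to the weighted loss, the model is usually fit with alternating least squares rather than sampling, which is what makes it suitable for large implicit datasets.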
Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
TLDR
Applied to a state-of-the-art image classification model, Batch Normalization achieves the same accuracy with 14 times fewer training steps, and beats the original model by a significant margin.
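For reference, the batch-normalization transform described above is small enough to sketch in full; this is a training-mode forward pass only, and the variable names are mine, not the paper's notation:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the mini-batch, then apply the
    # learned scale (gamma) and shift (beta).
    mean = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                      # per-feature batch variance
    x_hat = (x - mean) / np.sqrt(var + eps)  # zero mean, unit variance
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
y = batch_norm(x, gamma=np.ones(2), beta=np.zeros(2))
```

At inference time the batch statistics are replaced by running averages accumulated during training, so the output no longer depends on the composition of the mini-batch.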