Discrete Factorization Machines for Fast Feature-based Recommendation

@inproceedings{Liu2018DiscreteFM,
  title={Discrete Factorization Machines for Fast Feature-based Recommendation},
  author={Han Liu and Xiangnan He and Fuli Feng and Liqiang Nie and R. Liu and Hanwang Zhang},
  booktitle={International Joint Conference on Artificial Intelligence},
  year={2018}
}
User and item features of side information are crucial for accurate recommendation. However, the large number of feature dimensions, e.g., usually larger than 10^7, results in expensive storage and computational cost. This prohibits fast recommendation especially on mobile applications where the computational resource is very limited. In this paper, we develop a generic feature-based recommendation model, called Discrete Factorization Machine (DFM), for fast and accurate recommendation. DFM…
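
The abstract points at the standard factorization machine (FM) scoring function, with the embeddings constrained to binary codes so that interactions can be computed with bit operations. Below is a minimal sketch of that idea (not the authors' code; function names and the bit-packing layout are illustrative assumptions):

```python
# Sketch of FM scoring where feature embeddings may be binary codes in {-1, +1}.
# With binary codes, <b_i, b_j> = k - 2 * Hamming(b_i, b_j), so pairwise
# interactions reduce to XOR + popcount instead of floating-point dot products.
import numpy as np

def fm_score(x_idx, w0, w, V):
    """Standard FM prediction over the active (one-hot) feature indices x_idx:
    y = w0 + sum_i w[i] + sum_{i<j} <V[i], V[j]>."""
    linear = w0 + w[x_idx].sum()
    vs = V[x_idx]                           # (n_active, k) embeddings
    sum_sq = vs.sum(axis=0) ** 2            # (sum_i v_i)^2, elementwise
    sq_sum = (vs ** 2).sum(axis=0)          # sum_i v_i^2, elementwise
    pairwise = 0.5 * (sum_sq - sq_sum).sum()
    return linear + pairwise

def binary_inner_product(bits_i, bits_j, k):
    """Inner product of two k-bit {-1,+1} codes packed into integer words."""
    hamming = sum(bin(a ^ b).count("1") for a, b in zip(bits_i, bits_j))
    return k - 2 * hamming
```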


Discrete Social Recommendation

This work proposes a novel discrete social recommendation (DSR) method that learns binary codes for users and items in a unified framework while considering social information, and imposes balanced and uncorrelated constraints on the objective to ensure the learned binary codes are informative yet compact.

Multi-Feature Discrete Collaborative Filtering for Fast Cold-start Recommendation

This paper proposes a fast cold-start recommendation method, called Multi-Feature Discrete Collaborative Filtering (MFDCF), designed to adaptively project the multiple content features into binary yet informative hash codes by fully exploiting their complementarity.

Memory-Efficient Factorization Machines via Binarizing both Data and Model Coefficients

A new method called Binarized FM is proposed, which constrains the model parameters to binary values (i.e., 1 or −1) and can significantly reduce the memory cost of the SEFM model.
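
Based only on the summary above, the core trick is keeping real-valued parameters during training while using their signs at prediction time; a common way to train such models is a straight-through estimator. A hedged sketch (not the paper's implementation):

```python
# Illustrative sketch of sign binarization with a straight-through estimator.
import numpy as np

def binarize(theta):
    """Forward pass uses sign(theta) in {-1, +1}; zeros are mapped to +1."""
    b = np.sign(theta)
    b[b == 0] = 1.0
    return b

def ste_update(theta, grad_wrt_binary, lr=0.01):
    """Straight-through estimator: the gradient computed w.r.t. the binary
    weights is applied directly to the underlying real-valued parameters."""
    return theta - lr * grad_wrt_binary
```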

Multi-Modal Discrete Collaborative Filtering for Efficient Cold-Start Recommendation

A Multi-modal Discrete Collaborative Filtering (MDCF) method for efficient cold-start recommendation is proposed, which maps the multi-modal features of users and items to a consensus Hamming space based on the matrix factorization framework to support large-scale recommendation.

Compositional Coding for Collaborative Filtering

This work proposes the Compositional Coding for Collaborative Filtering (CCCF) framework, which not only gains better recommendation efficiency than the state-of-the-art binarized CF approaches but also achieves even higher accuracy than the real-valued CF method.

xLightFM: Extremely Memory-Efficient Factorization Machine

The results demonstrate that xLightFM can outperform the state-of-the-art lightweight factorization-based methods in terms of both prediction quality and memory footprint, and achieve more than 18x and 27x memory compression compared to the vanilla FM on these two datasets, respectively.
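
The summary suggests a codebook/quantization-style compression of the FM embeddings. As a rough illustration only (the actual xLightFM design is not reproduced here, and all names and shapes below are assumptions), each feature can store a few codebook indices instead of a full embedding vector:

```python
# Hypothetical codebook-based embedding composition: per feature, store small
# integer indices and reconstruct its embedding from shared codebooks.
import numpy as np

def compose_embedding(codebooks, code_indices):
    """codebooks: (n_books, n_codes, sub_dim) learned sub-vectors.
    code_indices: (n_books,) integer indices stored for one feature.
    The embedding is the concatenation of the selected sub-vectors, so only
    a few bytes of indices are stored per feature."""
    return np.concatenate([codebooks[b, code_indices[b]]
                           for b in range(codebooks.shape[0])])
```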

Candidate Generation with Binary Codes for Large-Scale Top-N Recommendation

A candidate generation and re-ranking based framework (CIGAR) is proposed, which first learns a preference-preserving binary embedding for building a hash table to retrieve candidates, and then learns to re-rank the candidates using real-valued ranking models with a candidate-oriented objective.
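
A loose sketch of such a retrieve-then-rerank pipeline, under the assumption that binary codes index a hash table and a separate real-valued scorer handles the small candidate set (names below are illustrative, not CIGAR's API):

```python
# Two-stage recommendation: hash-table candidate lookup, then real-valued re-ranking.
from collections import defaultdict

def build_hash_table(item_codes):
    table = defaultdict(list)                 # binary code bucket -> item ids
    for item_id, code in enumerate(item_codes):
        table[code].append(item_id)
    return table

def recommend(user_code, table, rerank_score, top_n=10):
    candidates = table.get(user_code, [])     # exact-bucket lookup; multi-probe omitted
    ranked = sorted(candidates, key=rerank_score, reverse=True)
    return ranked[:top_n]
```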

Learning Elastic Embeddings for Customizing On-Device Recommenders

This paper presents a novel lightweight recommendation paradigm that allows a well-trained recommender to be customized for arbitrary device-specific memory constraints without retraining, and proposes an innovative approach, namely recommendation with universally learned elastic embeddings (RULE).

References


Discrete Content-aware Matrix Factorization

A Discrete Content-aware Matrix Factorization (DCMF) model is proposed to derive compact yet informative binary codes in the presence of user/item content information, and an efficient discrete optimization algorithm for parameter learning is developed.

Fast Matrix Factorization for Online Recommendation with Implicit Feedback

A new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique is designed for efficiently optimizing a Matrix Factorization (MF) model with variably-weighted missing data; this efficiency is then exploited to devise an incremental update strategy that instantly refreshes an MF model given new feedback.

Preference preserving hashing for efficient recommendation

Experiments show that the recommendation speed of the proposed PPH algorithm can be hundreds of times faster than the original MF with real-valued features, and the recommendation accuracy is significantly better than previous work on hashing for recommendation.
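
The speedup comes from replacing floating-point dot products with Hamming-distance ranking over binary codes. A minimal sketch, assuming codes of at most 64 bits packed into unsigned integers (a simplification made here for brevity):

```python
# Rank items for a user by Hamming distance between packed binary codes.
import numpy as np

def hamming_rank(user_code, item_codes, top_n=10):
    """user_code: np.uint64 scalar; item_codes: (n_items,) np.uint64 array."""
    xor = np.bitwise_xor(item_codes, user_code)
    bits = np.unpackbits(xor.view(np.uint8).reshape(len(item_codes), -1), axis=1)
    dist = bits.sum(axis=1)                    # popcount per item
    return np.argsort(dist)[:top_n]            # smallest distance = best match
```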

Collaborative Filtering on a Budget

This paper proposes a new model for representing and compressing matrix factors via hashing, which allows an essentially unbounded number of users and items to be represented within a pre-defined memory footprint, at a graceful storage/performance trade-off.

A Generic Coordinate Descent Framework for Learning from Implicit Feedback

It is shown that k-separability is a sufficient property to allow efficient optimization of implicit recommender problems with CD, and a new framework for deriving efficient CD algorithms for complex recommender models is provided.

Neural Collaborative Filtering

This work strives to develop techniques based on neural networks to tackle the key problem in recommendation --- collaborative filtering --- on the basis of implicit feedback, and presents a general framework named NCF, short for Neural network-based Collaborative Filtering.

Fast Scalable Supervised Hashing

A novel supervised hashing method, called Fast Scalable Supervised Hashing (FSSH), is proposed, which circumvents the use of the large similarity matrix by introducing a pre-computed intermediate term whose size is independent of the size of the training data.

Neural Attentive Session-based Recommendation

A novel neural network framework, Neural Attentive Recommendation Machine (NARM), is proposed to tackle session-based recommendation; it outperforms state-of-the-art baselines on both datasets and achieves a significant improvement on long sessions.

Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks

A novel model named Attentional Factorization Machine (AFM), which learns the importance of each feature interaction from data via a neural attention network, which consistently outperforms the state-of-the-art deep learning methods Wide&Deep and DeepCross with a much simpler structure and fewer model parameters.
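
The mechanism described is a softmax attention over pairwise interaction vectors. A hedged sketch of that computation (shapes and parameter names are illustrative, not the paper's exact notation):

```python
# Attention-weighted pooling of pairwise feature interactions, AFM-style.
import numpy as np
from itertools import combinations

def afm_interaction(vs, W_att, h_att, p):
    """vs: (n_active, k) embeddings of active features; W_att: (k, t) and
    h_att: (t,) parameterize the attention network; p: (k,) output projection."""
    pairs = np.stack([vs[i] * vs[j]
                      for i, j in combinations(range(len(vs)), 2)])   # (n_pairs, k)
    logits = np.maximum(pairs @ W_att, 0.0) @ h_att                   # ReLU MLP scores
    alpha = np.exp(logits - logits.max())
    alpha = alpha / alpha.sum()                                        # softmax over pairs
    return (alpha[:, None] * pairs).sum(axis=0) @ p                    # scalar interaction term
```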

Google news personalization: scalable online collaborative filtering

This paper describes the approach to collaborative filtering for generating personalized recommendations for users of Google News using MinHash clustering, Probabilistic Latent Semantic Indexing, and covisitation counts, and combines recommendations from different algorithms using a linear model.