Neural Factorization Machines for Sparse Predictive Analytics

@inproceedings{He2017NeuralFM,
  title={Neural Factorization Machines for Sparse Predictive Analytics},
  author={Xiangnan He and Tat-Seng Chua},
  booktitle={Proceedings of the 40th International ACM SIGIR Conference on Research and Development in Information Retrieval},
  year={2017}
}
  • Published 7 August 2017
  • Computer Science
Many predictive tasks of web applications need to model categorical variables, such as user IDs and demographics like genders and occupations. To apply standard machine learning techniques, these categorical predictors are always converted to a set of binary features via one-hot encoding, making the resultant feature vector highly sparse. To learn from such sparse data effectively, it is crucial to account for the interactions between features. Factorization Machines (FMs) are a popular… 
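
For orientation, the Factorization Machine model the abstract refers to is the standard second-order FM predictor, as defined in Rendle's Factorization Machines paper listed in the references below (the equation is reproduced here for context, not from the truncated abstract):

    \hat{y}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j

Here \mathbf{x} is the sparse one-hot encoded feature vector, w_0 and w_i are the global bias and per-feature linear weights, and each feature i has a latent vector \mathbf{v}_i whose inner products capture pairwise feature interactions. The paper's NFM model builds on this formulation by modeling the interaction term with a neural network on top of a bilinear interaction (Bi-Interaction) pooling of the feature embeddings.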

Citations

DS-FACTO: Doubly Separable Factorization Machines

TLDR
A hybrid-parallel stochastic optimization algorithm, DS-FACTO, is proposed; it simultaneously partitions both the data and the parameters of the factorization machine, is fully decentralized, and does not require any parameter servers.

IO-aware Factorization Machine for User Response Prediction

TLDR
A novel model named IO-aware Factorization Machine (IOFM) is proposed, which enhances the feature representation ability of the attention mechanism in estimating weights via two awareness auxiliary matrices, further reduces the model parameters by applying canonical decomposition to the two auxiliary matrices, and designs a shared matrix to correlate the decomposed matrices.

A Hybrid Neural Network Model with Non-linear Factorization Machines for Collaborative Recommendation

TLDR
A novel model, Non-Linear Factorization Machine (NLFM), is proposed for modelling the user-item interaction function, along with a hybrid deep model named AE-NLFM for collaborative recommendation that significantly outperforms state-of-the-art methods.

Quaternion Factorization Machines: A Lightweight Solution to Intricate Feature Interaction Modelling

TLDR
Quaternion algebra brings a brand-new take on FM-based models: the proposed models not only enable expressive inter-component feature interactions but also significantly reduce the parameter size, owing to the lower degrees of freedom of the hypercomplex Hamilton product compared with real-valued matrix multiplication.

DexDeepFM: Ensemble Diversity Enhanced Extreme Deep Factorization Machine Model

TLDR
An ensemble-diversity-enhanced extreme deep factorization machine model (DexDeepFM) is proposed, which introduces an ensemble diversity measure in each hidden layer and considers both ensemble diversity and prediction accuracy in the objective function.

You Say Factorization Machine, I Say Neural Network - It’s All in the Activation

TLDR
This work extends the previously established theoretical connection between polynomial neural networks and factorization machines (FMs) to recently introduced FM techniques, proposing a single neural-network-based framework that can switch between the deep learning and FM paradigms through a simple change of activation function.

Field-Aware Neural Factorization Machine for Click-Through Rate Prediction

TLDR
The proposed FNFM model has stronger expressive ability than previous deep-learning feature-combination models such as DeepFM, DCN, and NFM, and is used for higher-order feature combination learning.

Boosting Factorization Machines via Saliency-Guided Mixup

TLDR
Through theoretical analysis, it is proved that the proposed methods minimize an upper bound of the generalization error, which has a beneficial effect on enhancing FMs; this work also gives the first generalization bound for FMs.

DCAP: Deep Cross Attentional Product Network for User Response Prediction

TLDR
This work proposes a novel architecture, the Deep Cross Attentional Product Network (DCAP), which retains the cross network's benefit of modeling high-order feature interactions explicitly at the vector-wise level and, inspired by the multi-head attention mechanism and the Product Neural Network, can differentiate the importance of different cross features in each network layer.
...

References

SHOWING 1-10 OF 45 REFERENCES

Deep Learning over Multi-field Categorical Data - A Case Study on User Response Prediction

TLDR
This paper proposes two novel models that use deep neural networks (DNNs) to automatically learn effective patterns from categorical feature interactions and predict users' ad clicks, and demonstrates that these methods work better than major state-of-the-art models.

Wide & Deep Learning for Recommender Systems

TLDR
Wide & Deep learning, which jointly trains wide linear models and deep neural networks, is presented to combine the benefits of memorization and generalization for recommender systems, and is open-sourced in TensorFlow.

Exponential Machines

TLDR
This paper introduces Exponential Machines (ExM), a predictor that models all interactions of every order in a factorized format called Tensor Train (TT), and shows that the model achieves state-of-the-art performance on synthetic data with high-order interactions and performs on par with existing methods on the MovieLens 100K recommender-system dataset.

Factorization Machines with libFM

Factorization approaches provide high accuracy in several important prediction problems, for example, recommender systems. However, applying factorization approaches to a new prediction problem is a…

Neural Collaborative Filtering

TLDR
This work strives to develop neural-network-based techniques to tackle the key problem in recommendation, collaborative filtering, on the basis of implicit feedback, and presents a general framework named NCF, short for Neural network-based Collaborative Filtering.

Field-aware Factorization Machines for CTR Prediction

TLDR
This paper establishes FFMs as an effective method for classifying large sparse data, including data from CTR prediction, proposes efficient implementations for training FFMs, and comprehensively analyzes FFMs.

Predicting response in mobile advertising with hierarchical importance-aware factorization machine

TLDR
A Hierarchical Importance-aware Factorization Machine (HIFM) is developed, which provides an effective generic latent factor framework that incorporates importance weights and hierarchical learning, and outperforms contemporary temporal latent factor models.

Fast Matrix Factorization for Online Recommendation with Implicit Feedback

TLDR
A new learning algorithm based on the element-wise Alternating Least Squares (eALS) technique is designed for efficiently optimizing a Matrix Factorization (MF) model with variably weighted missing data; this efficiency is then exploited to seamlessly devise an incremental update strategy that instantly refreshes an MF model given new feedback.

Factorization Machines

  • Steffen Rendle
  • Computer Science
  • 2010 IEEE International Conference on Data Mining
  • 2010
TLDR
Factorization Machines (FMs) are introduced as a new model class that combines the advantages of Support Vector Machines (SVMs) with factorization models and can mimic these models just by specifying the input data (i.e., the feature vectors).
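
As a quick illustration of the model this reference introduces, here is a minimal NumPy sketch (not the paper's code; the toy feature layout and variable names are ours) that scores a one-hot encoded feature vector with a second-order FM using the linear-time identity given in the paper:

    import numpy as np

    # Minimal illustrative sketch of second-order FM scoring (not the paper's code).
    # The pairwise term uses the linear-time identity:
    #   sum_{i<j} <v_i, v_j> x_i x_j
    #     = 0.5 * sum_f [ (sum_i v_{i,f} x_i)^2 - sum_i v_{i,f}^2 x_i^2 ]
    def fm_score(x, w0, w, V):
        # x: (n,) feature vector, w0: global bias, w: (n,) linear weights,
        # V: (n, k) matrix whose rows are the latent factor vectors v_i.
        linear = w0 + w @ x
        s = V.T @ x                      # per factor f: sum_i v_{i,f} * x_i
        s_sq = (V ** 2).T @ (x ** 2)     # per factor f: sum_i v_{i,f}^2 * x_i^2
        pairwise = 0.5 * np.sum(s ** 2 - s_sq)
        return linear + pairwise

    # Toy usage: hypothetical one-hot encoding of a user ID (3 values) and a gender (2 values).
    rng = np.random.default_rng(0)
    x = np.array([0.0, 1.0, 0.0, 1.0, 0.0])
    print(fm_score(x, w0=0.1, w=rng.normal(size=5), V=rng.normal(size=(5, 3))))

This O(kn) evaluation is what makes FMs practical for the highly sparse one-hot inputs described in the abstract above.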

Exploiting ranking factorization machines for microblog retrieval

TLDR
A Ranking Factorization Machine (Ranking FM) model is proposed, which applies the factorization machine model to microblog ranking on the basis of pairwise classification and demonstrates its superiority over several baseline systems on a real Twitter dataset in terms of P@30 and MAP.