• Corpus ID: 159042202

RaFM: Rank-Aware Factorization Machines

@inproceedings{Chen2019RaFMRF,
  title={RaFM: Rank-Aware Factorization Machines},
  author={Xiaoshuang Chen and Yin Zheng and Jiaxing Wang and Wenye Ma and Junzhou Huang},
  booktitle={ICML},
  year={2019}
}
Factorization machines (FMs) are a popular model class for learning pairwise feature interactions through a low-rank approximation. Unlike existing FM-based approaches, which use a fixed rank for all features, this paper proposes a Rank-Aware FM (RaFM) model that adopts pairwise interactions from embeddings with different ranks. The proposed model achieves better performance on real-world datasets in which different features have significantly varying frequencies of occurrence. Moreover, we prove that… 
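To make the setting concrete, below is a minimal NumPy sketch of an FM-style second-order score in which each feature's embedding rank depends on its frequency of occurrence. The specific rule for combining two embeddings of different ranks (a dot product over the smaller rank) and all names below are illustrative assumptions, not the exact construction in the paper.

import numpy as np

# Illustrative sketch only: per-feature embedding ranks chosen by frequency.
# Combining embeddings of different ranks by truncating to the smaller rank
# is an assumption for illustration, not necessarily the paper's method.

rng = np.random.default_rng(0)

# Hypothetical setup: 5 features; frequent features get a larger rank.
feature_freq = np.array([100_000, 50_000, 500, 200, 50])
ranks = np.where(feature_freq > 1_000, 16, 4)         # rank per feature
embeddings = [rng.normal(scale=0.01, size=r) for r in ranks]

w0 = 0.0                                              # global bias
w = rng.normal(scale=0.01, size=len(ranks))           # linear weights

def rank_aware_fm_score(x):
    """Second-order FM score where pairwise terms use embeddings of
    possibly different ranks (here: dot product over the shared prefix)."""
    score = w0 + w @ x
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            d = min(ranks[i], ranks[j])               # shared rank for the pair
            score += embeddings[i][:d] @ embeddings[j][:d] * x[i] * x[j]
    return score

print(rank_aware_fm_score(np.array([1.0, 0.0, 1.0, 1.0, 0.0])))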

Citations

AdnFM: An Attentive DenseNet based Factorization Machine for Click-Through-Rate Prediction

TLDR
A novel deep learning-based model named AdnFM is proposed for the Click-Through-Rate (CTR) prediction problem; it combines residual learning with an attention mechanism to enable high-order feature interactions and to weight their importance dynamically.

Deep Interest-Shifting Network with Meta-Embeddings for Fresh Item Recommendation

TLDR
A deep interest-shifting network (DisNet) is proposed, which transfers knowledge from a large amount of auxiliary data and then shifts user interests with contextual information, together with a relational meta-Id-embedding generator (RM-IdEG) that integrates relational information for better embedding performance.

Projective Quadratic Regression for Online Learning

TLDR
A projective quadratic regression (PQR) model is proposed that can capture the important second-order feature information; since it is a convex model, the requirements of online convex optimization (OCO) are fulfilled and the globally optimal solution can be reached.

Factorization Machines with Regularization for Sparse Feature Interactions

TLDR
A new regularization scheme for feature interaction selection in FMs is presented, and an upper bound of the $\ell_1$ regularizer for the feature interaction matrix is computed from the parameter matrix of FMs.

Field-wise Learning for Multi-field Categorical Data

TLDR
This work presents a model that utilizes linear models with variance and low-rank constraints to help it generalize better and reduce the number of parameters; the model is also interpretable in a field-wise manner.

References

Showing 1-10 of 17 references

A Boosting Framework of Factorization Machine

TLDR
This work proposes an Adaptive Boosting framework of Factorization Machines (AdaFM), which can adaptively search for proper ranks on different datasets without re-training; using a boosting strategy, it gradually increases its rank according to its performance until the performance stops improving.

Field-aware Factorization Machines for CTR Prediction

TLDR
This paper establishes FFMs as an effective method for classifying large sparse data, including data from CTR prediction, proposes efficient implementations for training FFMs, and comprehensively analyzes FFMs.

Factorization Machines with libFM

Factorization approaches provide high accuracy in several important prediction problems, for example, recommender systems. However, applying factorization approaches to a new prediction problem is a… 

DeepFM: A Factorization-Machine based Neural Network for CTR Prediction

TLDR
This paper shows that it is possible to derive an end-to-end learning model that emphasizes both low- and high-order feature interactions, and combines the power of factorization machines for recommendation and deep learning for feature learning in a new neural network architecture.

Mixture-Rank Matrix Approximation for Collaborative Filtering

TLDR
A mixture-rank matrix approximation (MRMA) method is proposed, in which user-item ratings are characterized by a mixture of low-rank matrix approximation (LRMA) models with different ranks, and a learning algorithm based on iterated conditional modes is proposed to tackle the non-convex optimization problem pertaining to MRMA.

Factorization Machines

  • Steffen Rendle, 2010 IEEE International Conference on Data Mining, 2010
TLDR
Factorization Machines (FMs) are introduced as a new model class that combines the advantages of Support Vector Machines (SVMs) with factorization models and can mimic these factorization models simply by specifying the input data (i.e., the feature vectors).
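
For reference, the degree-2 FM model introduced here scores an input $\mathbf{x} \in \mathbb{R}^n$ as a global bias, linear terms, and factorized pairwise terms,

$$\hat{y}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + \sum_{i=1}^{n} \sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle \, x_i x_j,$$

where every feature $i$ is given an embedding $\mathbf{v}_i \in \mathbb{R}^k$ of the same rank $k$; RaFM relaxes exactly this fixed-rank assumption.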

Neural Factorization Machines for Sparse Predictive Analytics

TLDR
NFM seamlessly combines the linearity of FM in modelling second-order feature interactions with the non-linearity of neural networks in modelling higher-order feature interactions, and it is more expressive than FM since FM can be seen as a special case of NFM without hidden layers.

Collaborative Filtering with User-Item Co-Autoregressive Models

TLDR
This work proposes CF-UIcA, a neural co-autoregressive model for CF tasks, which exploits the structural correlation in the domains of both users and items, and develops an efficient stochastic learning algorithm to handle large-scale datasets.

Deep & Cross Network for Ad Click Predictions

TLDR
This paper proposes the Deep & Cross Network (DCN), which keeps the benefits of a DNN model, and beyond that, it introduces a novel cross network that is more efficient in learning certain bounded-degree feature interactions.

DiFacto: Distributed Factorization Machines

TLDR
DiFacto is described, which uses a refined Factorization Machine model with sparse memory adaptive constraints and frequency adaptive regularization, and it is shown how to distribute DiFacto over multiple machines using the Parameter Server framework by computing distributed subgradients on minibatches asynchronously.