Neural Network Matrix Factorization
@article{Dziugaite2015NeuralNM, title={Neural Network Matrix Factorization}, author={Gintare Karolina Dziugaite and Daniel M. Roy}, journal={ArXiv}, year={2015}, volume={abs/1511.06443} }
Data often comes in the form of an array or matrix. Matrix factorization techniques attempt to recover missing or corrupted entries by assuming that the matrix can be written as the product of two low-rank matrices. In other words, matrix factorization approximates the entries of the matrix by a simple, fixed function, namely the inner product, acting on the latent feature vectors for the corresponding row and column. Here we consider replacing the inner product by an arbitrary function that is learned from the data at the same time as the latent feature vectors.
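As a concrete illustration of that replacement, here is a minimal sketch in PyTorch (an assumed framework choice; layer sizes and the training step are illustrative, and the paper's full model also feeds extra per-row and per-column latent matrices into the network): an MLP applied to the concatenated row and column feature vectors is trained jointly with those vectors by minimizing squared reconstruction error on the observed entries.

```python
import torch
import torch.nn as nn

class NNMF(nn.Module):
    def __init__(self, n_rows, n_cols, d=32, hidden=50):
        super().__init__()
        self.U = nn.Embedding(n_rows, d)  # latent row feature vectors
        self.V = nn.Embedding(n_cols, d)  # latent column feature vectors
        # Learned function replacing the fixed inner product.
        self.f = nn.Sequential(
            nn.Linear(2 * d, hidden), nn.Sigmoid(),
            nn.Linear(hidden, hidden), nn.Sigmoid(),
            nn.Linear(hidden, 1),
        )

    def forward(self, i, j):
        # Predict X[i, j] from the concatenated latent vectors.
        return self.f(torch.cat([self.U(i), self.V(j)], dim=-1)).squeeze(-1)

model = NNMF(n_rows=1000, n_cols=1700)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def step(i, j, x):
    # One gradient step on a batch of observed (i, j, x) entries; the
    # network weights and the latent features are trained jointly.
    loss = ((model(i, j) - x) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()
```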
118 Citations
Co-manifold Matrix Factorization
- Computer Science · ICCPR
- 2020
A new matrix factorization model, Co-Manifold Matrix Factorization (CoMMF), is proposed, which incorporates the geometric properties of the rating matrix into matrix factorization and outperforms six state-of-the-art collaborative filtering methods.
Learning Discrete Matrix Factorization Models
- Computer Science · IEEE Signal Processing Letters
- 2018
A novel method is proposed that allows gradient-based and deep-learning-based methods to jointly learn both the matrix factorization model and a discretization operator, yielding a discrete matrix completion algorithm with high reconstruction accuracy.
Spectral Geometric Matrix Completion
- Computer Science · MSML
- 2021
This work interprets the Deep Matrix Factorization model through the lens of spectral geometry, which allows it to incorporate explicit regularization without breaking the DMF structure, thus enjoying the best of both worlds.
Matrix Factorization with Neural Networks and Stochastic Variational Inference
- Computer Science
- 2016
This model is extended by using variational Bayesian inference to approximate the posterior distributions of the latent variables, giving recommender systems the flexibility to make decisions based on the uncertainty of predicted ratings.
Neural Collaborative Filtering vs. Matrix Factorization Revisited
- Computer Science · RecSys
- 2020
It is shown that, with proper hyperparameter selection, a simple dot product substantially outperforms the proposed learned similarities; MLPs should be used with care as embedding combiners, and dot products might be a better default choice.
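The contrast the paper draws can be stated in a few lines; the following sketch (assumed PyTorch, illustrative shapes) shows the two embedding combiners side by side: the fixed dot product of classical matrix factorization versus a learned MLP acting on the concatenated embeddings.

```python
import torch
import torch.nn as nn

d = 32
u, v = torch.randn(64, d), torch.randn(64, d)  # a batch of user/item embeddings

# Matrix factorization: the fixed dot-product combiner.
score_mf = (u * v).sum(dim=-1)

# Neural collaborative filtering style: a learned MLP combiner.
mlp = nn.Sequential(nn.Linear(2 * d, 64), nn.ReLU(), nn.Linear(64, 1))
score_mlp = mlp(torch.cat([u, v], dim=-1)).squeeze(-1)
```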
CoSTCo: A Neural Tensor Completion Model for Sparse Tensors
- Computer Science · KDD
- 2019
This work proposes a novel convolutional neural network (CNN) based model, named CoSTCo (Convolutional Sparse Tensor Completion), which leverages the expressive power of CNN to model the complex interactions inside tensors and its parameter sharing scheme to preserve the desired low-rank structure.
Extendable Neural Matrix Completion
- Computer Science · 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
- 2018
A deep two-branch neural network model is proposed that not only inherits the predictive power of neural networks but is also capable of extending to partially observed samples outside the training set, without the need for retraining or fine-tuning.
Neural Tensor Factorization
- Computer Science · WSDM 2019
- 2018
A Neural Tensor Factorization (NTF) model is proposed for predictive tasks on dynamic relational data that incorporates the multi-layer perceptron structure for learning the non-linearities between different latent factors.
Inductive Matrix Completion Based on Graph Neural Networks
- Computer Science · ICLR
- 2020
It is shown that inductive matrix completion models can be trained without side information while achieving similar or better performance than state-of-the-art transductive methods; that local graph patterns around a (user, item) pair are effective predictors of the rating this user gives to the item; and that long-range dependencies might not be necessary for modeling recommender systems.
A Biased Deep Tensor Factorization Network For Tensor Completion
- Computer Science · ArXiv
- 2021
A completion method based on a Biased Deep Tensor Factorization Network (BDTFN) is proposed to address the missing-data imputation problem for large-scale spatiotemporal traffic data and is evaluated on a large-scale 5-minute traffic speed data set.
References
Probabilistic Matrix Factorization with Non-random Missing Data
- Computer Science · ICML
- 2014
A probabilistic matrix factorization model for collaborative filtering that learns from data that is missing not at random (MNAR), obtaining improved performance over state-of-the-art methods both when predicting ratings and when modeling the data observation process.
Probabilistic Matrix Factorization
- Computer Science · NIPS
- 2007
The Probabilistic Matrix Factorization (PMF) model is presented, which scales linearly with the number of observations, performs well on the large, sparse, and very imbalanced Netflix dataset, and is extended to include an adaptive prior on the model parameters.
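For reference, PMF's MAP objective reduces to regularized squared error on the observed entries; a minimal sketch with assumed notation (R_obs as (i, j, rating) triples, latent feature matrices U and V):

```python
import numpy as np

def pmf_loss(R_obs, U, V, lam=0.1):
    # R_obs: iterable of (i, j, rating) observed entries; U: (n, d); V: (m, d).
    # A single regularization weight is used for both factors for brevity.
    err = sum((r - U[i] @ V[j]) ** 2 for i, j, r in R_obs)
    return err + lam * ((U ** 2).sum() + (V ** 2).sum())
```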
Local Low-Rank Matrix Approximation
- Computer Science · ICML
- 2013
A new matrix approximation model is proposed in which the matrix is assumed to be locally low-rank, leading to a representation of the observed matrix as a weighted sum of low-rank matrices and to improvements in prediction accuracy over classical approaches for recommendation tasks.
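That representation can be sketched in a few lines (assumed notation: each local model has an anchor point and factors U_t, V_t, and a smoothing kernel supplies the weights; an illustration, not the paper's code):

```python
import numpy as np

def local_lowrank_predict(i, j, local_models, kernel):
    # local_models: list of (anchor, U_t, V_t) low-rank factorizations;
    # kernel(anchor, i, j) -> smoothing weight around the anchor point.
    w = np.array([kernel(a, i, j) for a, _, _ in local_models])
    preds = np.array([U[i] @ V[j] for _, U, V in local_models])
    return (w / w.sum()) @ preds
```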
The Deep Tensor Neural Network With Applications to Large Vocabulary Speech Recognition
- Computer Science · IEEE Transactions on Audio, Speech, and Language Processing
- 2013
Evaluation on Switchboard tasks indicates that DTNNs can outperform the already high-performing DNNs, with 4-5% and 3% relative word error reductions using 30-hr and 309-hr training sets, respectively.
A Review of Relational Machine Learning for Knowledge Graphs
- Computer Science · Proceedings of the IEEE
- 2016
This paper provides a review of how statistical models can be “trained” on large knowledge graphs, and then used to predict new facts about the world (which is equivalent to predicting new edges in the graph) and how such statistical models of graphs can be combined with text-based information extraction methods for automatically constructing knowledge graphs from the Web.
Random function priors for exchangeable arrays with applications to graphs and relational data
- Computer Science · NIPS
- 2012
A flexible yet simple Bayesian nonparametric model is obtained by placing a Gaussian process prior on the parameter function, which constitutes the natural model parameter in a Bayesian model.
Restricted Boltzmann machines for collaborative filtering
- Computer Science · ICML '07
- 2007
This paper shows how a class of two-layer undirected graphical models, called Restricted Boltzmann Machines (RBMs), can be used to model tabular data, such as users' ratings of movies, and demonstrates that RBMs can be successfully applied to the Netflix data set.
Reasoning With Neural Tensor Networks for Knowledge Base Completion
- Computer Science · NIPS
- 2013
An expressive neural tensor network suitable for reasoning over relationships between two entities given a subset of the knowledge base is introduced, and performance is shown to improve when entities are represented as an average of their constituent word vectors.
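The scoring layer this describes combines a bilinear tensor term with a standard linear term on the entity pair; a sketch with assumed shapes (k tensor slices, entity embeddings of dimension d):

```python
import torch

def ntn_score(e1, e2, W, V, b, u):
    # e1, e2: (d,) entity embeddings; W: (k, d, d) tensor slices;
    # V: (k, 2d); b, u: (k,).
    bilinear = torch.einsum('i,kij,j->k', e1, W, e2)  # e1^T W[s] e2 per slice
    linear = V @ torch.cat([e1, e2]) + b
    return u @ torch.tanh(bilinear + linear)
```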
Knowledge-Powered Deep Learning for Word Embedding
- Computer Science · ECML/PKDD
- 2014
This study explores the capacity of leveraging morphological, syntactic, and semantic knowledge to achieve high-quality word embeddings, and explores these types of knowledge to define new basis for word representation, provide additional input information, and serve as auxiliary supervision in deep learning.
Matrix Factorization Techniques for Recommender Systems
- Computer Science · Computer
- 2009
As the Netflix Prize competition has demonstrated, matrix factorization models are superior to classic nearest neighbor techniques for producing product recommendations, allowing the incorporation of…