Variational Quantum Circuit Model for Knowledge Graph Embedding

@article{Ma2019VariationalQC,
  title={Variational Quantum Circuit Model for Knowledge Graph Embedding},
  author={Yunpu Ma and Volker Tresp and Liming Zhao and Yuyi Wang},
  journal={Advanced Quantum Technologies},
  year={2019},
  volume={2}
}
In this work, the first quantum Ansätze for statistical relational learning on knowledge graphs using parametric quantum circuits are proposed. Two types of variational quantum circuits for knowledge graph embedding are introduced. Inspired by classical representation learning, latent features for entities are first considered as coefficients of quantum states, while predicates are characterized by parametric gates acting on the quantum states. For the first model, the quantum…
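The setup in the abstract (entity vectors as quantum-state amplitudes, predicates as parametric gates, triples scored by state overlap) can be illustrated with a small classical simulation. The Givens-rotation parametrization and the fidelity-based score below are illustrative assumptions, not the paper's exact circuits:

```python
import numpy as np

def amplitude_encode(v):
    """Treat a real entity-embedding vector as the amplitude
    vector of a quantum state by L2-normalizing it."""
    return v / np.linalg.norm(v)

def predicate_unitary(thetas, dim):
    """Toy parametric 'gate' for a predicate: a product of Givens
    rotations on adjacent amplitude pairs (a hypothetical
    parametrization, not the paper's circuit)."""
    U = np.eye(dim)
    for i, theta in enumerate(thetas[: dim - 1]):
        G = np.eye(dim)
        c, s = np.cos(theta), np.sin(theta)
        G[i, i], G[i, i + 1], G[i + 1, i], G[i + 1, i + 1] = c, -s, s, c
        U = G @ U
    return U

def triple_score(head, thetas, tail):
    """Score (head, predicate, tail) as the overlap |<t|U(thetas)|h>|^2."""
    h, t = amplitude_encode(head), amplitude_encode(tail)
    return np.abs(t @ predicate_unitary(thetas, len(h)) @ h) ** 2
```

In training, the gate parameters and entity vectors would be optimized jointly so that true triples score higher than corrupted ones.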
Hybrid Quantum-Classical Graph Convolutional Network
TLDR
This research provides a hybrid quantum-classical graph convolutional network (QGCNN) for learning high-energy physics (HEP) data that demonstrates an advantage over classical multilayer perceptrons and convolutional neural networks in terms of the number of parameters.
An Empirical Study of Optimizers for Quantum Machine Learning
TLDR
This paper considers nine widely employed optimizers, both gradient-based and gradient-free, presents an empirical comparison of their performance on a typical scenario, i.e., supervised learning, and finds that gradient-based optimizers provide relatively better solutions in each case.
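As a minimal illustration of the gradient-based vs. gradient-free comparison, one can run SciPy's BFGS and Nelder-Mead optimizers on the same toy cost landscape; the cost function here is a stand-in, not one of the paper's benchmarks:

```python
import numpy as np
from scipy.optimize import minimize

def cost(theta):
    # Toy stand-in for a variational-circuit cost landscape.
    return np.sin(theta[0]) ** 2 + (theta[1] - 0.5) ** 2

x0 = np.array([1.0, 0.0])
grad_based = minimize(cost, x0, method="BFGS")         # gradient-based
grad_free = minimize(cost, x0, method="Nelder-Mead")   # gradient-free
print(grad_based.fun, grad_free.fun)
```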
Foundations for Near-Term Quantum Natural Language Processing
TLDR
The encoding of linguistic structure within quantum circuits also embodies a novel approach to establishing word meanings that goes beyond current standards in mainstream AI, by placing linguistic structure at the heart of Wittgenstein's meaning-is-context view.
Quantum Machine Learning and Bioinspired Quantum Technologies
There is no doubt that artificial intelligence and machine learning have been significantly modifying the scientific and technological landscape, as well as society at large, over the last decade. …
Variational Quanvolutional Neural Networks with enhanced image encoding
TLDR
There is no single best image encoding; rather, the choice of encoding depends on the specific constraints of the application. The experiments show that some image encodings are better suited for variational circuits than others.
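Two standard image encodings such studies compare are amplitude encoding and angle encoding; the sketch below contrasts their qubit costs on a toy 2x2 patch (illustrative, not the paper's exact encodings):

```python
import numpy as np

patch = np.array([0.1, 0.5, 0.9, 0.3])   # flattened 2x2 image patch

# Amplitude encoding: pixels become normalized state amplitudes,
# so 4 pixels fit into 2 qubits.
amp_state = patch / np.linalg.norm(patch)

# Angle encoding: each pixel sets one single-qubit rotation angle,
# so 4 pixels need 4 qubits. State of one qubit after RY(theta)|0>:
def ry_state(theta):
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

angle_states = [ry_state(np.pi * p) for p in patch]
```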
How to make qubits speak
TLDR
This is a story about making quantum computers speak, and doing so in a quantum-native, compositional, and meaning-aware manner; it provides the reader with some indications of that broader pictorial landscape, including the authors' account of the notion of compositionality.

References

SHOWING 1-10 OF 58 REFERENCES
Quantum Algorithms for Linear Algebra and Machine Learning.
TLDR
This dissertation makes progress on all three aspects of the quantum machine learning problem, obtaining quantum algorithms for low-rank approximation and regularized least squares, and quadratic speedups for a large class of linear algebra algorithms that rely on importance sampling from the leverage score distribution.
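Classically, the leverage score distribution mentioned here is the row-norm distribution of the left singular vectors; a quick NumPy sketch of importance sampling from it (the quantum algorithms speed up pipelines built on exactly this kind of sampling):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))

U, _, _ = np.linalg.svd(A, full_matrices=False)
leverage = np.sum(U ** 2, axis=1)        # row leverage scores, sum = rank(A)
probs = leverage / leverage.sum()

rows = rng.choice(A.shape[0], size=20, p=probs)  # importance-sampled rows
```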
The quest for a Quantum Neural Network
TLDR
This article presents a systematic approach to QNN research, concentrating on Hopfield-type networks and the task of associative memory, and outlines the challenge of combining the nonlinear, dissipative dynamics of neural computing and the linear, unitary dynamics of quantum computing.
A Quantum Approximate Optimization Algorithm
TLDR
A quantum algorithm that produces approximate solutions for combinatorial optimization problems is presented; it depends on a positive integer p, the quality of the approximation improves as p is increased, and the algorithm is studied as applied to MaxCut on regular graphs.
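A minimal state-vector simulation of the algorithm at p = 1 for MaxCut on a triangle makes the structure concrete; the layers follow the standard QAOA ansatz, while the graph and the grid search are illustrative choices:

```python
import numpy as np
from itertools import product

edges = [(0, 1), (1, 2), (0, 2)]   # triangle: maximum cut value is 2
n = 3

# Diagonal MaxCut cost C(z) = number of cut edges per bitstring z.
bits = list(product([0, 1], repeat=n))
C = np.array([sum(z[u] != z[v] for u, v in edges) for z in bits])

def qaoa_p1_expectation(gamma, beta):
    state = np.full(2 ** n, 2 ** (-n / 2), dtype=complex)   # |+>^n
    state *= np.exp(-1j * gamma * C)                        # cost layer
    c, s = np.cos(beta), -1j * np.sin(beta)                 # RX(2*beta)
    for q in range(n):                                      # mixer layer
        psi = np.moveaxis(state.reshape([2] * n), q, 0)
        a, b = psi[0].copy(), psi[1].copy()
        psi[0], psi[1] = c * a + s * b, s * a + c * b
        state = np.moveaxis(psi, 0, q).reshape(-1)
    return float(np.sum(np.abs(state) ** 2 * C))            # expected cut

grid = np.linspace(0, np.pi, 25)
print(max(qaoa_p1_expectation(g, b) for g in grid for b in grid))
```

Increasing p adds further alternating cost and mixer layers, which is what improves the approximation quality.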
Universal quantum perceptron as efficient unitary approximators
TLDR
It is demonstrated that it is possible to implement a quantum perceptron with a sigmoid activation function as an efficient, reversible many-body unitary operation, and it is proved that such a quantum neural network is a universal approximator of continuous functions, with the same power as classical neural networks.
Holographic Embeddings of Knowledge Graphs
TLDR
Holographic embeddings are proposed to learn compositional vector space representations of entire knowledge graphs; they outperform state-of-the-art methods for link prediction on knowledge graphs and relational learning benchmark datasets.
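The compositional operator behind holographic embeddings is circular correlation, computable in O(d log d) with FFTs. A short sketch of the HolE-style triple score, assuming the standard formulation from the paper:

```python
import numpy as np

def circular_correlation(h, t):
    """[h * t]_k = sum_i h_i * t_{(k+i) mod d}, computed via FFTs."""
    return np.real(np.fft.ifft(np.conj(np.fft.fft(h)) * np.fft.fft(t)))

def hole_score(h, r, t):
    """HolE-style probability that (h, r, t) holds: sigmoid of <r, h * t>."""
    return 1.0 / (1.0 + np.exp(-r @ circular_correlation(h, t)))
```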
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
TLDR
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
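A sketch of the bilinear scoring this refers to, using the diagonal (DistMult-style) variant; relation composition then reduces to multiplying the relation matrices (toy dimensions and random parameters for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4
h, t = rng.normal(size=d), rng.normal(size=d)
W_r1 = np.diag(rng.normal(size=d))    # diagonal bilinear relation
W_r2 = np.diag(rng.normal(size=d))

score = h @ W_r1 @ t                  # bilinear triple score h^T W_r t
W_composed = W_r1 @ W_r2              # composition of r1 followed by r2
```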
Convolutional 2D Knowledge Graph Embeddings
TLDR
ConvE, a multi-layer convolutional network model for link prediction, is introduced and it is found that ConvE achieves state-of-the-art Mean Reciprocal Rank across most datasets.
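A stripped-down PyTorch sketch of the ConvE scoring pipeline: reshape the concatenated head and relation embeddings into a 2D grid, convolve, project, and take a dot product with the tail embedding. Dropout, batch normalization, and the published model's exact shapes are omitted:

```python
import torch
import torch.nn as nn

d, channels = 200, 32
h, r, t = (torch.randn(1, d) for _ in range(3))

x = torch.cat([h, r], dim=1).view(1, 1, 20, 20)   # 2d = 400 = 20 * 20
conv = nn.Conv2d(1, channels, kernel_size=3)      # -> (1, 32, 18, 18)
fc = nn.Linear(channels * 18 * 18, d)

features = torch.relu(conv(x)).view(1, -1)
score = torch.sigmoid((fc(features) * t).sum(dim=1))   # dot with tail
```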
Holistic Representations for Memorization and Inference
TLDR
This paper introduces a novel holographic memory model for the distributed storage of complex association patterns and shows that pairwise quasi-orthogonality can be improved by drawing vectors from heavy-tailed distributions, e.g., a Cauchy distribution, and that the memory capacity of holistic representations can be significantly improved.
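The quasi-orthogonality claim can be checked empirically by comparing the mean pairwise |cosine similarity| of random Gaussian draws against heavy-tailed Cauchy draws (a quick experiment, not the paper's analysis):

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 1000

def mean_abs_cosine(X):
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    G = np.abs(Xn @ Xn.T)
    return G[np.triu_indices(len(X), k=1)].mean()

print(mean_abs_cosine(rng.normal(size=(n, d))))           # Gaussian
print(mean_abs_cosine(rng.standard_cauchy(size=(n, d))))  # heavy-tailed
```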
Complex Embeddings for Simple Link Prediction
TLDR
This work makes use of complex-valued embeddings to solve the link prediction problem through latent factorization, using the Hermitian dot product, the complex counterpart of the standard dot product between real vectors.
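The ComplEx score is the real part of a trilinear (Hermitian) product of complex embeddings; a one-function sketch:

```python
import numpy as np

def complex_score(h, r, t):
    """Re(<r, h, conj(t)>): the Hermitian-product triple score
    used with complex-valued embeddings."""
    return float(np.real(np.sum(r * h * np.conj(t))))
```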
Strengths and Weaknesses of Quantum Computing
TLDR
It is proved that, relative to an oracle chosen uniformly at random, with probability 1 the class NP cannot be solved on a quantum Turing machine (QTM) in time o(2^{n/2}).