R-VGAE: Relational-variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning

@article{Li2020RVGAERG,
  title={R-VGAE: Relational-variational Graph Autoencoder for Unsupervised Prerequisite Chain Learning},
  author={Irene Z Li and Alexander R. Fabbri and Swapnil Hingmire and Dragomir Radev},
  journal={ArXiv},
  year={2020},
  volume={abs/2004.10610}
}
The task of concept prerequisite chain learning is to automatically determine the existence of prerequisite relationships among concept pairs. In this paper, we frame learning prerequisite relationships among concepts as an unsupervised task with no access to labeled concept pairs during training. We propose a model called the Relational-Variational Graph AutoEncoder (R-VGAE) to predict concept relations within a graph consisting of concept and resource nodes. Results show that our unsupervised…
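The abstract describes a variational graph autoencoder that scores relations between nodes. As a rough illustration only (not the authors' code; all names here are mine), a VGAE-style model encodes each node into a mean and log-variance, samples a latent embedding with the reparameterization trick, and scores candidate links with an inner-product decoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in encoder outputs: in a real VGAE these come from graph convolutions
# over the adjacency matrix and node features.
n_nodes, latent_dim = 5, 8
mu = rng.normal(size=(n_nodes, latent_dim))                      # per-node means
log_sigma2 = rng.normal(scale=0.1, size=(n_nodes, latent_dim))   # per-node log-variances

# Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)
eps = rng.normal(size=mu.shape)
z = mu + np.exp(0.5 * log_sigma2) * eps

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Inner-product decoder: adj_prob[i, j] is the predicted probability
# that an edge (i, j) exists.
adj_prob = sigmoid(z @ z.T)
print(adj_prob.shape)  # (5, 5)
```

Because the decoder is a symmetric inner product, a relational variant (as in R-VGAE) would instead score each pair under a relation-specific transformation.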
Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders
This paper proposes unsupervised cross-domain concept prerequisite chain learning using an optimized variational graph autoencoder and expands an existing dataset by introducing two new domains: CV and Bioinformatics (BIO).
Conditional Link Prediction of Category-Implicit Keypoint Detection
An end-to-end category-implicit Keypoint and Link Prediction Network (KLPNet), the first approach to perform semantic keypoint detection (for multi-class instances) and conditional link (CL) rejuvenation simultaneously; experimental results on CL prediction show the effectiveness of KLPNet with respect to occlusion problems.
Heterogeneous Graph Neural Networks for Multi-label Text Classification
A heterogeneous graph convolutional network model is proposed to solve the MLTC problem by modeling tokens and labels as nodes in a heterogeneous graph, taking into account multiple relationships, including token-level relationships.
AAN: Developing Educational Tools for Work Force Training
Millions of computing jobs remain unfilled, largely due to an insufficiently trained workforce that cannot keep up with the latest technological advances. We aim to address this problem by…

References

Showing 1–10 of 32 references.
Modeling Relational Data with Graph Convolutional Networks
It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
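The R-GCN idea this summary refers to is relation-specific message passing. A minimal sketch (my own toy naming and shapes, not the paper's implementation) of one R-GCN layer, where each relation type has its own weight matrix and incoming messages are averaged per relation before a shared self-loop term is added:

```python
import numpy as np

rng = np.random.default_rng(1)

n_nodes, d_in, d_out = 4, 3, 2
h = rng.normal(size=(n_nodes, d_in))                    # input node features
# edges[r] lists (src, dst) pairs for relation type r (toy graph)
edges = {0: [(0, 1), (2, 1)], 1: [(3, 0), (1, 2)]}
W = {r: rng.normal(size=(d_in, d_out)) for r in edges}  # per-relation weights
W_self = rng.normal(size=(d_in, d_out))                 # shared self-loop weights

out = h @ W_self
for r, pairs in edges.items():
    # accumulate messages per destination node, then normalize by in-degree
    msg = np.zeros((n_nodes, d_out))
    count = np.zeros(n_nodes)
    for src, dst in pairs:
        msg[dst] += h[src] @ W[r]
        count[dst] += 1
    nz = count > 0
    msg[nz] /= count[nz, None]
    out += msg
out = np.maximum(out, 0)  # ReLU nonlinearity
print(out.shape)  # (4, 2)
```

The per-relation weight matrices are what distinguish this from a plain GCN layer; the original paper also proposes basis decomposition to keep their number of parameters manageable.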
Variational Graph Auto-Encoders
The variational graph auto-encoder (VGAE) is introduced, a framework for unsupervised learning on graph-structured data based on the variational auto-encoder (VAE) that can naturally incorporate node features, which significantly improves predictive performance on a number of benchmark datasets.
Embedding Entities and Relations for Learning and Inference in Knowledge Bases
It is found that embeddings learned from the bilinear objective are particularly good at capturing relational semantics and that the composition of relations is characterized by matrix multiplication.
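The bilinear objective mentioned here, in its diagonal form (DistMult), reduces to an elementwise triple product between head, relation, and tail embeddings. A small illustrative scorer (my naming, toy vectors):

```python
import numpy as np

def distmult_score(head, rel, tail):
    """DistMult: bilinear score with a diagonal relation matrix,
    i.e. sum_k head[k] * rel[k] * tail[k]."""
    return float(np.sum(head * rel * tail))

# Toy 4-dimensional embeddings for one (head, relation, tail) triple
h = np.array([1.0, 0.5, -1.0, 2.0])
r = np.array([0.5, 1.0, 1.0, 0.25])
t = np.array([2.0, -1.0, 0.0, 1.0])

score = distmult_score(h, r, t)
print(score)  # 1.0
```

Higher scores indicate triples the model considers more plausible; because the relation matrix is diagonal, DistMult is symmetric in head and tail, a known limitation for asymmetric relations such as prerequisites.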
What Should I Learn First: Introducing LectureBank for NLP Education and Prerequisite Chain Learning
LectureBank is introduced, a publicly available dataset containing 1,352 English lecture files collected from university courses, each classified according to an existing taxonomy, along with 208 manually labeled prerequisite relation topics.
node2vec: Scalable Feature Learning for Networks
node2vec is an algorithmic framework for learning continuous feature representations for nodes in networks; it defines a flexible notion of a node's network neighborhood and designs a biased random walk procedure that efficiently explores diverse neighborhoods.
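The biased walk at the heart of node2vec is second-order: the probability of stepping from the current node to a neighbor x depends on the previous node t, with unnormalized weight 1/p if x returns to t, 1 if x is also a neighbor of t, and 1/q otherwise. A hedged sketch (toy graph and function names are mine):

```python
import random

def biased_walk(graph, start, length, p=1.0, q=1.0, seed=0):
    """One node2vec-style second-order random walk over an adjacency dict."""
    rng = random.Random(seed)
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        nbrs = graph[cur]
        if not nbrs:
            break
        if len(walk) == 1:
            # First step has no previous node: uniform choice
            walk.append(rng.choice(nbrs))
            continue
        prev = walk[-2]
        weights = []
        for x in nbrs:
            if x == prev:
                weights.append(1.0 / p)       # return to previous node
            elif x in graph[prev]:
                weights.append(1.0)           # stays close (BFS-like)
            else:
                weights.append(1.0 / q)       # moves outward (DFS-like)
        walk.append(rng.choices(nbrs, weights=weights)[0])
    return walk

# Toy undirected graph as an adjacency dict
graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
walk = biased_walk(graph, 0, 6)
print(walk)
```

The sampled walks are then fed to a skip-gram objective (as in word2vec) to learn the node embeddings; small q biases walks toward exploration, small p toward backtracking.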
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
A new language representation model, BERT, designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers, which can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks.
Learning Concept Graphs from Online Educational Data
This paper addresses an open challenge in educational data mining: automatically mapping online courses from different providers onto a universal space of concepts and predicting latent prerequisite dependencies among both concepts and courses, with a novel approach for inference within and across course-level and concept-level directed graphs.
Learning Transferable Features For Open-Domain Question Answering
It is found that domain adaptation greatly improves sentence-level QA performance and that span-level QA benefits from sentence information; a simple clustering algorithm may be employed when the topic domains are unknown, with negligible loss in accuracy.
Semi-Supervised Techniques for Mining Learning Outcomes and Prerequisites
A novel approach is proposed that leverages textbooks as a source of distant supervision but learns a model that can generalize to arbitrary documents (such as those on the web); it can take advantage of any existing textbook without requiring expert annotation.
Investigating Active Learning for Concept Prerequisite Learning
Experimental results for domains including data mining, geometry, physics, and precalculus show that active learning can be used to reduce the amount of training data required.