Corpus ID: 135464409

Sparse Neural Attentive Knowledge-based Models for Grade Prediction

Sara Morsy and George Karypis
Grade prediction for future courses not yet taken by students is important, as it can help them and their advisers during course selection, as well as in designing personalized degree plans and modifying them based on performance. One successful approach to accurately predicting a student's grades in future courses is Cumulative Knowledge-based Regression Models (CKRM). CKRM learns shallow linear models that predict a student's grades as the similarity between his…
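As a rough illustration of the CKRM idea described above, the following sketch predicts a grade as the similarity between a student's accumulated knowledge state (a grade-weighted sum of the vectors of previously taken courses) and the target course's vector. All names, dimensions, and the random vectors are hypothetical; this is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_courses, k = 6, 4                  # toy setup: 6 courses, 4 knowledge components
P = rng.normal(size=(n_courses, k))  # knowledge each course provides
Q = rng.normal(size=(n_courses, k))  # knowledge each course draws on

def predict_grade(prior_courses, prior_grades, target):
    """Grade estimate: similarity between accumulated knowledge and target course."""
    # Accumulate knowledge as a grade-weighted sum over prior courses
    knowledge = sum(g * P[c] for c, g in zip(prior_courses, prior_grades))
    # Predicted grade is the dot-product similarity with the target course
    return float(knowledge @ Q[target])

score = predict_grade(prior_courses=[0, 2, 3], prior_grades=[4.0, 3.0, 3.7], target=5)
```

In the actual model the course vectors are learned from historical transcript data rather than sampled at random; the sketch only shows the shape of the prediction rule.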
Predicting Students’ Academic Performance: A Review for the Attribute Used
This article reviews the attributes, models, and tools used to address the problem of predicting students' academic performance. Based on references throughout 2009, the attributes used include…
Prescribing Deep Attentive Score Prediction Attracts Improved Student Engagement
It is demonstrated, with empirical evidence, that the accuracy of a score prediction model deployed in a real-world setting significantly impacts user engagement; the model is applied to Santa, a multi-platform English ITS that focuses exclusively on the TOEIC standardized examinations.
Cumulative Knowledge-based Regression Models for Next-term Grade Prediction
A cumulative knowledge-based regression model with different course-knowledge spaces for the task of next-term grade prediction; it utilizes historical student-course grades, as well as the information available about the courses, to capture the relationships between courses in terms of the knowledge components they provide.
ALE: Additive Latent Effect Models for Grade Prediction
The experimental results demonstrate that the proposed additive latent effect models significantly outperform the baselines on the grade prediction problem. The paper also performs a thorough analysis of the importance of different factors and how they can practically assist students in course selection and, ultimately, improve their academic performance.
Grade Prediction with Course and Student Specific Models
These methods identify the predictive subsets of prior courses on a course-by-course basis and better address problems associated with the not-missing-at-random nature of the student-course historical grade data.
Grade Prediction with Temporal Course-wise Influence
A factorization-based approach called Matrix Factorization with Temporal Course-wise Influence that incorporates course-wise influence effects and temporal effects for grade prediction; it outperforms several baseline approaches and infers meaningful patterns between pairs of courses within academic programs.
Course-Specific Markovian Models for Grade Prediction
This paper develops course-specific Hidden Markov Models and Hidden Semi-Markov Models for the problem of next-term grade prediction, and a case study shows how these methods can be applied for early identification of at-risk students.
Next-Term Student Performance Prediction: A Recommender Systems Approach
A system to predict students’ grades in the courses they will enroll in during the next enrollment term by learning patterns from historical transcript data coupled with additional information about students, courses and the instructors teaching them is developed.
Domain-Aware Grade Prediction and Top-n Course Recommendation
This work investigates how student and course academic features influence enrollment patterns and uses these features to define student and course groups at various levels of granularity; it shows how these groups can be used to design grade prediction and top-n course ranking models for neighborhood-based user collaborative filtering, matrix factorization, and popularity-based ranking approaches.
NAIS: Neural Attentive Item Similarity Model for Recommendation
This work proposes a neural network model named Neural Attentive Item Similarity model (NAIS), the first attempt to design neural network models for item-based CF, opening up new research possibilities for future developments of neural recommender systems.
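The attentive item-similarity idea behind NAIS can be sketched as follows: rather than weighting all of a user's historical items equally, an attention network scores each (history item, target item) pair, and the prediction is the attention-weighted sum of item-item similarities. The one-layer attention network, the smoothing exponent, and all parameter values here are illustrative simplifications, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(1)
n_items, k = 8, 4
P = rng.normal(size=(n_items, k))   # item embeddings (as history items)
Q = rng.normal(size=(n_items, k))   # item embeddings (as prediction targets)
W = rng.normal(size=(k, k))         # attention-network weight matrix
h = rng.normal(size=k)              # attention projection vector
beta = 0.5                          # smoothing exponent for long histories

def predict(history, target):
    # Attention score for each (history item, target) pair via a one-layer network
    scores = np.array([h @ np.tanh(W @ (P[j] * Q[target])) for j in history])
    e = np.exp(scores - scores.max())
    attn = e / (e.sum() ** beta)    # smoothed softmax over the user's history
    # Prediction: attention-weighted sum of item-item similarities
    return float(sum(a * (P[j] @ Q[target]) for a, j in zip(attn, history)))

y = predict(history=[0, 2, 5], target=7)
```

The smoothed softmax (raising the normalizer to a power below 1) keeps attention weights from being washed out for users with long histories, which is one of the motivations the NAIS line of work gives for departing from a standard softmax.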
Attentional Factorization Machines: Learning the Weight of Feature Interactions via Attention Networks
A novel model named Attentional Factorization Machine (AFM) learns the importance of each feature interaction from data via a neural attention network; it consistently outperforms the state-of-the-art deep learning methods Wide&Deep and DeepCross with a much simpler structure and fewer model parameters.
Attention is All you Need
A new, simple network architecture, the Transformer, based solely on attention mechanisms and dispensing with recurrence and convolutions entirely, is proposed; it generalizes well to other tasks, as shown by applying it successfully to English constituency parsing with both large and limited training data.
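The attention mechanism at the core of the Transformer is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal NumPy version, with illustrative shapes:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # row-wise softmax
    return weights @ V                                    # weighted sum of values

rng = np.random.default_rng(2)
Q, K, V = (rng.normal(size=(3, 8)) for _ in range(3))     # 3 positions, d_k = 8
out = attention(Q, K, V)                                  # shape (3, 8)
```

Scaling by √d_k keeps the dot products from growing with dimension, which would otherwise push the softmax into regions with vanishing gradients.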