• Corpus ID: 17468742

“Turn on, Tune in, Drop out”: Anticipating student dropouts in Massive Open Online Courses

@inproceedings{Yang2013TO,
  title={“Turn on, Tune in, Drop out”: Anticipating student dropouts in Massive Open Online Courses},
  author={Diyi Yang and Tanmay Sinha and David Adamson and Carolyn Penstein Ros{\'e}},
  year={2013}
}
In this paper, we explore student dropout behavior in Massive Open Online Courses (MOOCs). As a case study, we use a recent Coursera class, from which we develop a survival model that allows us to measure the influence of factors extracted from the data on the student dropout rate. Specifically, we explore factors related to student behavior and social positioning within discussion forums using standard social network analytic techniques. The analysis reveals several significant predictors of dropout.
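The survival-modeling approach described in the abstract can be illustrated with a minimal Kaplan–Meier estimator. This is only a sketch: the data below are hypothetical, and the paper's actual model, which measures the influence of covariates, is closer to a proportional-hazards regression than to this unadjusted estimate.

```python
# Minimal Kaplan-Meier survival estimate over course weeks.
# Data are hypothetical: duration = last week of activity,
# event = 1 if the student dropped out (0 = completed / censored).

def kaplan_meier(durations, events):
    """Return {week: estimated probability of surviving past that week}."""
    n_at_risk = len(durations)
    survival, estimate = {}, 1.0
    for week in sorted(set(durations)):
        dropouts = sum(1 for d, e in zip(durations, events) if d == week and e == 1)
        if dropouts and n_at_risk:
            estimate *= 1.0 - dropouts / n_at_risk
        survival[week] = estimate
        n_at_risk -= sum(1 for d in durations if d == week)
    return survival

durations = [1, 2, 2, 3, 4]   # last active week per student
events    = [1, 1, 0, 1, 0]   # 1 = observed dropout, 0 = censored
curve = kaplan_meier(durations, events)
print(curve)  # survival probability after each week
```

A covariate such as forum posting rate would enter a Cox-style model as a multiplicative effect on the weekly hazard; the estimator above only summarizes the overall dropout curve.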
Exploring the Effect of Student Confusion in Massive Open Online Courses
TLDR
The results demonstrate that the more confusion students express and the more they are exposed to other students' confusion, the sooner they drop out of the course.
Beyond Prediction: Towards Automatic Intervention in MOOC Student Stop-out
TLDR
The results suggest that surveying students based on an automatic stopout classifier achieves higher response rates compared to traditional post-course surveys, and may boost students’ propensity to “come back” into the course.
Who negatively influences me? Formalizing diffusion dynamics of negative exposure leading to student attrition in MOOCs
TLDR
Different ways in which students can be negatively exposed to their peers on MOOC forums are outlined and a simple formulation of learning network diffusion is discussed, which formalizes the essence of how such an influence spreads and can potentially lead to student attrition over time.
Exploring the Effect of Confusion in Discussion Forums of Massive Open Online Courses
TLDR
The results demonstrate that the more confusion students express or are exposed to, the lower the probability of their retention in MOOCs; implications for the design of interventions to improve student retention in MOOCs are discussed.
Shared Task on Prediction of Dropout Over Time in Massively Open Online Courses
TLDR
This paper describes the task, which involved analysis of data from 6 MOOCs offered through Coursera to predict whether a student will cease to participate actively after a given week of activity in a MOOC.
Peer Influence on Attrition in Massively Open Online Courses
TLDR
This work quantifies the manner in which students who demonstrate similar behavior patterns influence each other's commitment to the course through explicit or implicit interaction.
Identifying At-Risk Students in Massive Open Online Courses
TLDR
This paper explores the accurate early identification of students who are at risk of not completing courses, and proposes two transfer learning algorithms to trade-off smoothness and accuracy by adding a regularization term to minimize the difference of failure probabilities between consecutive weeks.
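The consecutive-week smoothness regularizer described above can be written schematically. This is a plausible reconstruction under stated assumptions, not the authors' exact objective:

```latex
\min_{\theta}\;\sum_{t=1}^{T}\mathcal{L}\big(y_t,\,p_t(\theta)\big)
\;+\;\lambda\sum_{t=2}^{T}\big\|p_t(\theta)-p_{t-1}(\theta)\big\|^{2}
```

where \(p_t(\theta)\) is the predicted failure (dropout) probability in week \(t\), \(\mathcal{L}\) is the per-week prediction loss, and \(\lambda\) trades per-week accuracy against smoothness of the predicted probabilities across consecutive weeks.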
Temporal predication of dropouts in MOOCs: Reaching the low hanging fruit through stacking generalization
TLDR
This study designs a temporal modeling approach, one which prioritizes the at-risk students in order of their likelihood to drop out of a course, and illustrates the effectiveness of an ensemble stacking generalization approach to build more robust and accurate prediction models than the direct application of base learners.
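Stacking generalization of the kind described above can be sketched with scikit-learn. The features, data, and base-learner choices below are synthetic stand-ins, not the study's actual feature set or ensemble:

```python
# Hedged sketch of stacking generalization for dropout prediction:
# base learners' out-of-fold predictions feed a meta-learner.
import numpy as np
from sklearn.ensemble import StackingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))          # e.g. weekly clickstream/forum features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

stack = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=3)),
                ("nb", GaussianNB())],
    final_estimator=LogisticRegression(),  # meta-learner over base predictions
    cv=5,                                  # out-of-fold predictions avoid leakage
)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
stack.fit(X_tr, y_tr)
print(round(stack.score(X_te, y_te), 2))
```

The `cv` parameter is the key to stacking: the meta-learner is trained on cross-validated base-learner predictions rather than in-sample ones, which is what makes the ensemble more robust than the base learners applied directly.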
Beyond Prediction: First Steps Toward Automatic Intervention in MOOC Student Stopout
TLDR
The results suggest that surveying students based on an automatic stopout classifier achieves higher response rates compared to traditional post-course surveys, and may boost students' propensity to "come back" into the course.
Dropout prediction in MOOCs: using sentiment analysis of users' comments to predict engagement.
Massive open online courses attract a huge and diverse audience; however, the dropout rate in MOOCs is very high, which concerns educationalists and course developers. This study employs sentiment

References

Showing 1–10 of 36 references
Exploring Possible Reasons behind Low Student Retention Rates of Massive Online Open Courses: A Comparative Case Study from a Social Cognitive Perspective
  • Y. Wang
  • Sociology, Computer Science
    AIED Workshops
  • 2013
TLDR
Three areas, namely, the lack of self-efficacy, self-regulation, and self-motivators are identified to help present an exploratory framework in interpreting findings of this study.
Deconstructing disengagement: analyzing learner subpopulations in massive open online courses
TLDR
A simple, scalable, and informative classification method is presented that identifies a small number of longitudinal engagement trajectories in MOOCs and compares learners in each trajectory and course across demographics, forum participation, video access, and reports of overall experience.
Limits of Theory and Practice in Student Attrition
The field of student attrition has grown tremendously over the past two decades. The demographic characteristics of the population have induced us to consider how our institutions can more
MOOCs and the funnel of participation
  • D. Clow
  • Computer Science, Sociology
    LAK '13
  • 2013
TLDR
The metaphor of a 'funnel of participation' is introduced to reconceptualise the steep drop-off in activity, and the pattern of steeply unequal participation, which appear to be characteristic of MOOCs and similar learning environments.
Predicting Student Retention in Massive Open Online Courses using Hidden Markov Models
Predicting Students Drop Out: A Case Study
TLDR
The experimental results show that rather simple and intuitive classifiers (decision trees) give useful results, with accuracies between 75% and 80%; the usefulness of cost-sensitive learning and thorough analysis of misclassifications is also demonstrated.
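Cost-sensitive learning of the kind mentioned above can be sketched with a scikit-learn decision tree. The 5:1 cost for missing a dropout and the synthetic data are assumed values, not the study's configuration:

```python
# Sketch of cost-sensitive learning: class_weight makes misclassifying
# a "dropout" example (class 1) five times as costly as a "retained" one.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = (X[:, 0] > 1.0).astype(int)        # imbalanced: few "dropout" positives

clf = DecisionTreeClassifier(max_depth=3,
                             class_weight={0: 1, 1: 5},  # assumed 5:1 cost ratio
                             random_state=0)
clf.fit(X, y)
print(round(clf.score(X, y), 2))
```

Raising the weight on the minority class shifts the tree's splits toward catching more at-risk students at the price of more false alarms, which is usually the right trade-off for triggering interventions.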
To stay or leave?: the relationship of emotional and informational support to commitment in online health support groups
TLDR
The results demonstrated that the more emotional support members were exposed to, the lower the risk of dropout, and informational support did not have the same strong effects on commitment.
Preventing Student Dropout in Distance Learning Using Machine Learning Techniques
TLDR
A number of experiments were conducted with data provided by the ‘informatics’ course of the Hellenic Open University; an interesting conclusion is that the Naive Bayes algorithm can be used successfully.
User interactions in social networks and their implications
TLDR
This paper proposes the use of interaction graphs to impart meaning to online social links by quantifying user interactions, and uses both types of graphs to validate two well-known social-based applications (RE and SybilGuard).
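An interaction graph of the kind proposed above can be sketched in a few lines: instead of treating declared friendships as edges, edges are weighted by how often two users actually interact. The event log below is hypothetical, not the paper's dataset:

```python
# Minimal interaction graph: weighted adjacency built from an event log
# of pairwise interactions (e.g. comments, replies, messages).
from collections import defaultdict

def build_interaction_graph(events):
    """events: iterable of (user_a, user_b) pairs -> weighted adjacency map."""
    graph = defaultdict(lambda: defaultdict(int))
    for a, b in events:
        graph[a][b] += 1
        graph[b][a] += 1   # treat interactions as undirected
    return graph

log = [("ann", "bob"), ("ann", "bob"), ("bob", "cat")]
g = build_interaction_graph(log)
print(g["ann"]["bob"])  # 2: ann and bob interacted twice
```

Edge weights like these let downstream analyses (e.g. the social-positioning features in the survival model above) distinguish active ties from dormant ones.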
Syntactic and Functional Variability of a Million Code Submissions in a Machine Learning MOOC
TLDR
The syntax and functional similarity of the submissions are mapped out in order to explore the variation in solutions in the first offering of Stanford's Machine Learning Massive Open-Access Online Course.