Corpus ID: 218900595

VarFA: A Variational Factor Analysis Framework For Efficient Bayesian Learning Analytics

@article{Wang2020VarFAAV,
  title={VarFA: A Variational Factor Analysis Framework For Efficient Bayesian Learning Analytics},
  author={Zichao Wang and Yi Gu and Andrew S. Lan and Richard Baraniuk},
  journal={ArXiv},
  year={2020},
  volume={abs/2005.13107}
}
We propose VarFA, a variational inference factor analysis framework that extends existing factor analysis models for educational data mining to efficiently output uncertainty estimates for the model's estimated factors. Such uncertainty information is useful, for example, in an adaptive testing scenario, where additional tests can be administered if the model is not sufficiently certain about a student's skill level estimate. Traditional Bayesian inference methods that produce such uncertainty… 
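The adaptive-testing use case described in the abstract can be illustrated with a small sketch. This is not the authors' VarFA implementation; it uses the closed-form posterior of a linear-Gaussian factor analysis model (all names, the noise level, and the 0.5-standard-deviation threshold are illustrative assumptions) to show how posterior uncertainty over a student's latent skills shrinks as more items are observed:

```python
import numpy as np

def skill_posterior(x_obs, W_obs, noise_var=0.25):
    """Posterior over latent skills z ~ N(0, I) given observed item
    responses x_obs = W_obs @ z + noise (linear-Gaussian factor analysis).
    Returns the exact posterior mean and covariance."""
    k = W_obs.shape[1]
    precision = np.eye(k) + W_obs.T @ W_obs / noise_var
    cov = np.linalg.inv(precision)
    mean = cov @ W_obs.T @ x_obs / noise_var
    return mean, cov

rng = np.random.default_rng(0)
n_items, n_skills = 20, 2
W = rng.normal(size=(n_items, n_skills))   # item loadings (assumed known)
z_true = rng.normal(size=n_skills)         # a student's true skills
x = W @ z_true + rng.normal(scale=0.5, size=n_items)

# Uncertainty after answering 2 items vs. after answering 10 items.
_, cov_few = skill_posterior(x[:2], W[:2])
_, cov_many = skill_posterior(x[:10], W[:10])

# Hypothetical adaptive-testing rule: administer more items while any
# skill's posterior standard deviation exceeds a chosen threshold.
needs_more = np.sqrt(np.diag(cov_few)).max() > 0.5
```

Each observed item adds a positive semi-definite term to the posterior precision, so the covariance can only shrink as items accumulate. VarFA's contribution, per the abstract, is obtaining such uncertainty estimates efficiently via variational inference when the model is nonlinear and the exact posterior is intractable.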


Using Knowledge Concept Aggregation towards Accurate Cognitive Diagnosis

TLDR
CDGK, an artificial neural network-based model for cognitive diagnosis, not only captures non-linear interactions between exercise features, student scores, and their mastery of each knowledge concept, but also aggregates the knowledge concepts by converting them into a graph structure and considering only the leaf nodes in the knowledge concept tree.

Knowledge structure enhanced graph representation learning model for attentive knowledge tracing

TLDR
This paper proposes KSGKT, a novel knowledge-structure-enhanced graph representation learning model for knowledge tracing with an attention mechanism, and explores eight methods that automatically infer the domain knowledge structure from learner response data and integrate it into the KT procedure.

References

SHOWING 1-10 OF 81 REFERENCES

Variational Inference for Bayesian Mixtures of Factor Analysers

TLDR
An algorithm is presented that infers the model structure of a mixture of factor analysers using an efficient and deterministic variational approximation to full Bayesian integration over model parameters. It is also shown how to obtain unbiased estimates of the true evidence, the exact predictive density, and the KL divergence between the variational posterior and the true posterior.

A note on variational Bayesian factor analysis

Auto-Encoding Variational Bayes

TLDR
A stochastic variational inference and learning algorithm that scales to large datasets and, under some mild differentiability conditions, even works in the intractable case is introduced.
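The key device behind Auto-Encoding Variational Bayes is the reparameterization trick: writing z ~ N(mu, sigma^2) as z = mu + sigma * eps with eps ~ N(0, 1), so that gradients of a Monte Carlo objective flow through mu and sigma. A minimal numerical sketch (not the paper's code; the objective E[z^2] is chosen only because its gradient with respect to mu, namely 2*mu, is known in closed form for checking):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 1.5, 0.5

# Reparameterize: z = mu + sigma * eps moves the randomness into eps,
# so dz/dmu = 1 and the gradient estimator becomes a plain sample mean.
eps = rng.normal(size=100_000)
z = mu + sigma * eps

# Monte Carlo gradient of E[z^2] w.r.t. mu: E[2 z * dz/dmu] = E[2 z].
grad_estimate = np.mean(2.0 * z)
# Analytic value: d/dmu (mu^2 + sigma^2) = 2 * mu = 3.0.
```

The same mechanic, applied to neural-network encoders and decoders, is what lets AEVB train with ordinary stochastic gradient descent on large datasets.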

A Comparison of Imputation Methods for Bayesian Factor Analysis Models

Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is… 

Icebreaker: Element-wise Efficient Information Acquisition with a Bayesian Deep Latent Gaussian Model

TLDR
This paper proposes Icebreaker, a principled framework for element-wise training data acquisition, built on a full Bayesian Deep Latent Gaussian Model (BELGAM) with a novel inference method that combines recent advances in amortized inference and stochastic gradient MCMC to enable fast and accurate posterior inference.

Bayesian estimation of a multilevel IRT model using Gibbs sampling

TLDR
A two-level regression model is imposed on the ability parameters in an item response theory (IRT) model and it will be shown that the parameters of the two-parameter normal ogive model and the multilevel model can be estimated in a Bayesian framework using Gibbs sampling.
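Gibbs sampling, as used in the multilevel IRT paper above, draws each block of parameters in turn from its full conditional distribution. The sketch below is not the IRT sampler itself; it runs the same mechanic on a standard bivariate normal with correlation rho, where both full conditionals are known in closed form:

```python
import numpy as np

rho = 0.8                           # target correlation
rng = np.random.default_rng(2)
cond_sd = np.sqrt(1.0 - rho ** 2)   # sd of x given y (and y given x)

x, y = 0.0, 0.0
samples = []
for t in range(25_000):
    # Full conditionals of a standard bivariate normal:
    # x | y ~ N(rho * y, 1 - rho^2), and symmetrically for y | x.
    x = rng.normal(rho * y, cond_sd)
    y = rng.normal(rho * x, cond_sd)
    if t >= 5_000:                  # discard burn-in draws
        samples.append((x, y))

xs, ys = np.array(samples).T
est_rho = np.corrcoef(xs, ys)[0, 1]   # should approach rho
```

In the IRT setting, the blocks are the ability parameters, item parameters, and regression coefficients of the two-level model rather than two scalar coordinates, but the alternating conditional draws work the same way.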

Stochastic Backpropagation and Approximate Inference in Deep Generative Models

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and… 

Variational Autoencoders for Collaborative Filtering

TLDR
A generative model with a multinomial likelihood that uses Bayesian inference for parameter estimation is introduced; the pros and cons of employing a principled Bayesian inference approach are identified, and settings where it provides the most significant improvements are characterized.

Back to the basics: Bayesian extensions of IRT outperform neural networks for proficiency estimation

TLDR
It is found that IRT-based methods consistently matched or outperformed DKT across all data sets at the finest level of content granularity that was tractable for them to be trained on.

Variational Item Response Theory: Fast, Accurate, and Expressive

TLDR
A variational Bayesian inference algorithm for IRT is introduced and shown to be fast and scalable without sacrificing accuracy; classic IRT is also extended with expressive Bayesian models of responses.
...