Obj2Sub: Unsupervised Conversion of Objective to Subjective Questions

@inproceedings{Chhabra2022Obj2SubUC,
  title={Obj2Sub: Unsupervised Conversion of Objective to Subjective Questions},
  author={Aarish Chhabra and Nandini Bansal and V. Venktesh and Mukesh K. Mohania and Deep Dwivedi},
  booktitle={AIED},
  year={2022}
}
Exams are conducted to test a learner's understanding of a subject. To prevent learners from guessing or exchanging solutions, the tests administered must contain sufficient subjective questions, which gauge whether a learner has understood a concept by mandating a detailed answer. Hence, in this paper, we propose a hybrid unsupervised approach leveraging rule-based methods and pre-trained dense retrievers for the novel task of automatically converting objective questions to subjective questions.
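
To make the hybrid idea concrete, the sketch below shows one way such a pipeline could look: hand-written rules strip the multiple-choice framing from a question stem and propose open-ended rewrites, and a pre-trained dense retriever ranks the candidates by semantic similarity to the original stem. This is an illustrative assumption, not the authors' implementation; the rewrite rules, the model name (all-MiniLM-L6-v2), and the helper names are all hypothetical.

import re
from sentence_transformers import SentenceTransformer, util

retriever = SentenceTransformer("all-MiniLM-L6-v2")  # assumed dense retriever

def rule_based_candidates(stem: str) -> list[str]:
    # Toy rules: strip MCQ-specific phrasing, then propose subjective rewrites.
    core = re.sub(r"^\s*which of the following (best )?(describes|is|are)\s*",
                  "", stem, flags=re.IGNORECASE).rstrip("?. ").strip()
    return [
        f"Describe {core}.",
        f"Explain {core} in your own words.",
        f"Discuss {core}, justifying your answer.",
    ]

def objective_to_subjective(stem: str) -> str:
    # Rank rule-generated candidates by cosine similarity to the original
    # stem and keep the most semantically faithful rewrite.
    candidates = rule_based_candidates(stem)
    stem_emb = retriever.encode(stem, convert_to_tensor=True)
    cand_embs = retriever.encode(candidates, convert_to_tensor=True)
    scores = util.cos_sim(stem_emb, cand_embs)[0]
    return candidates[int(scores.argmax())]

# Prints the candidate rewrite that best preserves the stem's meaning.
print(objective_to_subjective(
    "Which of the following best describes the role of chlorophyll in photosynthesis?"
))

The split of labor mirrors the abstract: the rules supply cheap, grammatical candidates without any training data, while the dense retriever provides the semantic judgment needed to pick among them, keeping the whole pipeline unsupervised.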

