Corpus ID: 238253049

Iterative Teacher-Aware Learning

@article{Yuan2021IterativeTL,
  title={Iterative Teacher-Aware Learning},
  author={Luyao Yuan and Dongruo Zhou and Junhong Shen and Jingdong Gao and Jeffrey L. Chen and Quanquan Gu and Ying Nian Wu and Song-Chun Zhu},
  journal={ArXiv},
  year={2021},
  volume={abs/2110.00137}
}
In human pedagogy, teachers and students can interact adaptively to maximize communication efficiency. The teacher adjusts her teaching method for different students, and the student, after becoming familiar with the teacher's instruction mechanism, can infer the teacher's intention and learn faster. Recently, multiple works have demonstrated the benefits of integrating this cooperative pedagogy into machine concept learning in discrete spaces. However, how cooperative pedagogy can facilitate…

References

Showing 1–10 of 77 references
Learning to Teach with Dynamic Loss Functions
TLDR: An efficient learning method for the teacher model is developed that makes gradient-based optimization possible, avoiding ineffective approaches such as policy optimization; extensive experiments demonstrate that the method significantly improves the quality of various student models.
Understanding the Role of Adaptivity in Machine Teaching: The Case of Version Space Learners
TLDR: Inspired by human teaching, a new model is proposed in which the learner picks hypotheses according to a local preference defined by its current hypothesis; the model is shown to exhibit several desirable properties, e.g., adaptivity plays a key role, and the learner's transitions over hypotheses are smooth and interpretable.
Interactive Optimal Teaching with Unknown Learners
TLDR: This paper introduces a new approach for machine teaching that partly addresses the (unavoidable) mismatch between what the teacher assumes about the learning process of the student and the actual process, and introduces interactivity as a means to mitigate the impact of imperfect knowledge.
Models of Cooperative Teaching and Learning
TLDR: Two models of learning from a cooperative teacher that selects "helpful" training examples are introduced, together with a new notion of "coding trick"/"collusion"; the resulting teaching complexity can be arbitrarily lower than the classic teaching dimension and known variants thereof, without using coding tricks.
Machine Teaching of Active Sequential Learners
TLDR: The approach provides tools for taking into account the strategic (planning) behaviour of users of interactive intelligent systems, such as recommendation engines, by considering them as boundedly optimal teachers.
Iterative Machine Teaching
TLDR: This paper studies a new paradigm where the learner uses an iterative algorithm and a teacher can feed examples sequentially and intelligently based on the current performance of the learner.
Teaching Inverse Reinforcement Learners via Features and Demonstrations
TLDR: A teaching scheme is suggested in which the expert can decrease the teaching risk by updating the learner's worldview, and thus ultimately enable her to find a near-optimal policy.
Towards Black-box Iterative Machine Teaching
TLDR: This paper proposes an active teacher model that can actively query the learner to estimate the learner's status, and provides sample complexity bounds for both teaching and querying, respectively.
Interpretable and Pedagogical Examples
TLDR: This work shows that the teacher network learns to select or generate interpretable, pedagogical examples to teach rule-based, probabilistic, boolean, and hierarchical concepts.
Near-Optimal Machine Teaching via Explanatory Teaching Sets
TLDR: This paper proposes NOTES, a principled framework for constructing interpretable teaching sets that utilizes explanations to accelerate the teaching process, and proves that NOTES is competitive with the optimal explanation-based teaching strategy.