Distilling Model Knowledge

Abstract

Top-performing machine learning systems, such as deep neural networks, large ensembles and complex probabilistic graphical models, can be expensive to store, slow to evaluate and hard to integrate into larger systems. Ideally, we would like to replace such cumbersome models with simpler models that perform equally well. In this thesis, we study knowledge…
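
The abstract is truncated above, but the thesis's central technique, named in its title, is knowledge distillation: training a simple "student" model to reproduce the softened predictive distribution of a cumbersome "teacher". The sketch below illustrates the standard temperature-scaled soft-target objective (in the style of Hinton et al., 2015). It is an illustrative reconstruction, not code from the thesis; the function names, temperature value, and example logits are placeholders.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; a higher temperature gives softer targets."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """Cross-entropy between the teacher's and student's softened outputs.

    Minimising this pushes the student to match the teacher's predictive
    distribution, transferring the teacher's "knowledge" to the student.
    """
    p_teacher = softmax(teacher_logits, temperature)
    log_p_student = np.log(softmax(student_logits, temperature) + 1e-12)
    return -(p_teacher * log_p_student).sum(axis=-1).mean()

# Hypothetical example: a 3-class problem with a batch of 2 inputs.
teacher = np.array([[4.0, 1.0, 0.5], [0.2, 3.0, 0.1]])
student = np.array([[2.0, 1.5, 0.5], [0.5, 2.0, 0.3]])
print(distillation_loss(student, teacher))  # scalar loss to minimise
```

In practice this soft-target term is often combined with the usual cross-entropy on the true labels; the temperature controls how much of the teacher's uncertainty over incorrect classes is exposed to the student.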

Statistics

[Chart: Citations per Year, 2015–2017]

Citation velocity: 6, averaging 6 citations per year over the last 3 years.
