Piggyback: Adding Multiple Tasks to a Single, Fixed Network by Learning to Mask

Arun Mallya and Svetlana Lazebnik
This work presents a method for adding multiple tasks to a single, fixed deep neural network without affecting performance on already-learned tasks. Building on concepts from network quantization and sparsification, we learn binary masks that "piggyback" on an existing network, i.e., are applied elementwise to its weights, to provide good performance on a new task. These masks are learned in an end-to-end differentiable fashion and incur a low overhead of 1 bit per network parameter, per task. Even though the…
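The core mechanism described above can be sketched as follows: keep a real-valued score per weight, threshold it into a binary mask at forward time, and update the scores with a straight-through estimator (treating the non-differentiable thresholding as identity in the backward pass). This is a minimal illustrative sketch, not the authors' implementation; all names (`binarize`, `masked_forward`, `straight_through_grad`) and the threshold value are assumptions for illustration.

```python
import numpy as np

def binarize(scores, threshold=0.0):
    """Hard-threshold real-valued scores into a {0, 1} mask."""
    return (scores > threshold).astype(scores.dtype)

def masked_forward(W, scores, x, threshold=0.0):
    """Linear layer whose frozen weights W are gated by the binary mask."""
    m = binarize(scores, threshold)
    return (W * m) @ x

def straight_through_grad(W, x, grad_out):
    """Gradient of the loss w.r.t. the real-valued mask scores.

    Thresholding is non-differentiable, so the straight-through
    estimator treats it as identity:
    d(loss)/d(scores) ~= d(loss)/d(W*m) * W  (elementwise).
    """
    grad_masked_W = np.outer(grad_out, x)  # d(loss)/d(masked weights)
    return grad_masked_W * W               # pass straight through the threshold

# Toy usage: a frozen 2x3 backbone weight matrix, learnable mask scores.
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))   # frozen backbone weights (never updated)
scores = np.full((2, 3), 1e-2)    # per-weight mask scores (the only learned state)
x = rng.standard_normal(3)

y = masked_forward(W, scores, x)                      # mask starts as all ones
g = straight_through_grad(W, x, grad_out=np.ones(2))  # fake upstream gradient
scores -= 0.1 * g                                     # one SGD step on scores only
```

Note the storage claim from the abstract: only the final binarized mask needs to be kept per task, which is 1 bit per backbone parameter.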


