Densely Connected Convolutional Networks
TLDR: The Dense Convolutional Network (DenseNet) connects each layer to every other layer in a feed-forward fashion and has several compelling advantages: it alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially reduces the number of parameters.
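
A minimal sketch of the dense connectivity pattern in PyTorch: each layer receives the concatenation of all preceding feature maps. The layer sizes and growth rate below are illustrative, not the paper's exact configuration.

import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Each layer takes the concatenation of all earlier feature maps as input."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(in_channels + i * growth_rate),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_channels + i * growth_rate, growth_rate,
                          kernel_size=3, padding=1, bias=False),
            ))

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))  # reuse all earlier features
            features.append(out)
        return torch.cat(features, dim=1)

block = DenseBlock(in_channels=16, growth_rate=12, num_layers=4)
print(block(torch.randn(1, 16, 32, 32)).shape)  # -> [1, 64, 32, 32]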
Learning Efficient Convolutional Networks through Network Slimming
TLDR: The approach, called network slimming, takes wide and large networks as input models; during training, insignificant channels are automatically identified and then pruned, yielding thin and compact models with comparable accuracy.
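
A rough sketch of the idea in PyTorch, assuming the paper's use of BatchNorm scale factors (gamma) as channel-importance indicators: an L1 penalty on gamma encourages sparsity during training, and low-gamma channels are pruned afterwards. The lambda value is illustrative, and for brevity the threshold here is computed per layer rather than globally as in the paper.

import torch
import torch.nn as nn

def bn_l1_penalty(model, lam=1e-4):
    """Sparsity-inducing L1 term on BatchNorm scale factors; add to the task loss."""
    return lam * sum(m.weight.abs().sum()
                     for m in model.modules()
                     if isinstance(m, nn.BatchNorm2d))

def channels_to_keep(bn, prune_ratio=0.5):
    """Indices of channels whose |gamma| exceeds the pruning threshold."""
    gammas = bn.weight.detach().abs()
    threshold = gammas.sort().values[int(prune_ratio * gammas.numel())]
    return (gammas > threshold).nonzero(as_tuple=True)[0]

During training one would use loss = task_loss + bn_l1_penalty(model), then rebuild a thin network from the surviving channels.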
Deep Networks with Stochastic Depth
TLDR: Stochastic depth is a training procedure that enables the seemingly contradictory setup of training short networks while using deep networks at test time; it substantially reduces training time and significantly improves test error on almost all datasets used for evaluation.
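
A minimal sketch of the mechanism: each residual block is randomly skipped during training and scaled by its survival probability at test time. The fixed survival probability is a simplification; the paper also uses a linear decay of survival probability with depth.

import torch
import torch.nn as nn

class StochasticDepthBlock(nn.Module):
    """Wraps a shape-preserving residual block that is randomly dropped in training."""
    def __init__(self, block, survival_prob=0.8):
        super().__init__()
        self.block = block
        self.survival_prob = survival_prob

    def forward(self, x):
        if self.training:
            if torch.rand(1).item() < self.survival_prob:
                return x + self.block(x)  # block is active this pass
            return x                      # block skipped: identity only
        # test time: scale the residual by its survival probability (expected value)
        return x + self.survival_prob * self.block(x)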
Rethinking the Value of Network Pruning
TLDR: It is found that, with an optimal learning rate, the "winning ticket" initialization used in Frankle & Carbin (2019) brings no improvement over random initialization, suggesting the need for more careful baseline evaluations in future research on structured pruning methods.
Snapshot Ensembles: Train 1, get M for free
TLDR: This paper proposes a method to achieve the seemingly contradictory goal of ensembling multiple neural networks at no additional training cost: train a single neural network, letting it converge to several local minima along its optimization path and saving the model parameters at each.
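
A sketch of the training loop, assuming the paper's cyclic cosine annealing schedule (applied per epoch here for brevity; the paper anneals per iteration). A snapshot of the weights is saved at the end of each cycle, and predictions are averaged over snapshots at test time. The hyperparameter values are illustrative.

import copy
import math

def snapshot_train(model, optimizer, loader, loss_fn,
                   cycles=5, epochs_per_cycle=40, lr0=0.1):
    """Cyclic cosine annealing: repeatedly anneal the LR toward zero so the
    model converges to several local minima, saving a snapshot at each."""
    snapshots = []
    for _ in range(cycles):
        for epoch in range(epochs_per_cycle):
            lr = lr0 / 2 * (math.cos(math.pi * epoch / epochs_per_cycle) + 1)
            for g in optimizer.param_groups:
                g["lr"] = lr
            for x, y in loader:
                optimizer.zero_grad()
                loss_fn(model(x), y).backward()
                optimizer.step()
        snapshots.append(copy.deepcopy(model.state_dict()))  # one "free" ensemble member
    return snapshots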
Few-Shot Object Detection via Feature Reweighting
TLDR: This work develops a few-shot object detector that can learn to detect novel objects from only a few annotated examples, using a meta feature learner and a reweighting module within a one-stage detection architecture.
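
A sketch of the reweighting step, assuming the reweighting module produces one channel-wise weight vector per novel class from its support examples; those vectors then modulate the query image's meta-features. Shapes and names are illustrative.

import torch

def reweight_features(meta_features, class_weights):
    """Channel-wise reweighting of query meta-features by per-class vectors.
    meta_features: (B, C, H, W) from the meta feature learner.
    class_weights: (N, C) from the reweighting module, one row per class.
    Returns (B, N, C, H, W): one reweighted feature map per class."""
    return meta_features.unsqueeze(1) * class_weights.view(
        1, -1, class_weights.size(1), 1, 1)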
DSOD: Learning Deeply Supervised Object Detectors from Scratch
TLDR: Deeply Supervised Object Detector (DSOD) is a framework that can learn object detectors from scratch following the single-shot detection (SSD) framework; one of its key findings is that deep supervision, enabled by dense layer-wise connections, plays a critical role in learning a good detector.
A New Meta-Baseline for Few-Shot Learning
TLDR: This work presents Meta-Baseline, which pre-trains a classifier on all base classes and then meta-learns a nearest-centroid-based few-shot classification algorithm, outperforming recent state-of-the-art methods by a large margin.
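
A sketch of the nearest-centroid step: average the support embeddings per class, then score queries by cosine similarity to each centroid. The temperature value is illustrative.

import torch
import torch.nn.functional as F

def nearest_centroid_logits(support_feats, support_labels, query_feats,
                            num_classes, tau=10.0):
    """support_feats: (S, D) embeddings; support_labels: (S,) in [0, num_classes);
    query_feats: (Q, D). Returns (Q, num_classes) temperature-scaled logits."""
    centroids = torch.stack([
        support_feats[support_labels == c].mean(dim=0) for c in range(num_classes)
    ])  # (num_classes, D)
    return tau * F.cosine_similarity(
        query_feats.unsqueeze(1), centroids.unsqueeze(0), dim=-1)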
Test-Time Training with Self-Supervision for Generalization under Distribution Shifts
TLDR: This work turns a single unlabeled test sample into a self-supervised learning problem on which the model parameters are updated before making a prediction, leading to improvements on diverse image classification benchmarks aimed at evaluating robustness to distribution shifts.
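
A minimal sketch of the test-time update, assuming a shared backbone with separate main and auxiliary heads, and a helper make_ssl_batch that builds (inputs, targets) for the self-supervised task from the test sample alone (the paper uses rotation prediction). All names here are illustrative; only the backbone copy is updated in this sketch.

import copy
import torch
import torch.nn.functional as F

def test_time_adapt(backbone, main_head, ssl_head, make_ssl_batch, x,
                    steps=1, lr=1e-3):
    """Take a few gradient steps on a self-supervised loss built from the
    single test sample x, then predict on x with the updated parameters."""
    adapted = copy.deepcopy(backbone)          # leave the deployed weights intact
    optimizer = torch.optim.SGD(adapted.parameters(), lr=lr)
    adapted.train()
    for _ in range(steps):
        inputs, targets = make_ssl_batch(x)    # e.g. rotated copies + rotation labels
        loss = F.cross_entropy(ssl_head(adapted(inputs)), targets)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    adapted.eval()
    with torch.no_grad():
        return main_head(adapted(x))           # prediction with updated backbone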
Convolutional Networks with Dense Connectivity
TLDR: The Dense Convolutional Network (DenseNet) connects each layer to every other layer in a feed-forward fashion and has several compelling advantages: it alleviates the vanishing-gradient problem, strengthens feature propagation, encourages feature reuse, and substantially improves parameter efficiency.