Densely Connected Search Space for More Flexible Neural Architecture Search
TLDR
In this paper, we propose to search over block counts and block widths by designing a densely connected search space, termed DenseNAS.
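A minimal sketch of the densely connected search-space idea, assuming a PyTorch setting: each routing block can take input from several preceding blocks, and learnable path weights decide which connections survive, so block counts and widths fall out of the search. The names here (RoutingBlock, the 1x1 projections, the stand-in op) are illustrative assumptions, not the released DenseNAS code.

```python
# Sketch (not the authors' code) of a densely connected search space:
# learnable path logits weight the dense connections between blocks.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RoutingBlock(nn.Module):
    """One candidate block; in_channels_list gives its possible inputs."""
    def __init__(self, in_channels_list, out_channels):
        super().__init__()
        # One 1x1 projection per possible predecessor (hypothetical choice).
        self.projs = nn.ModuleList(
            nn.Conv2d(c, out_channels, 1, bias=False) for c in in_channels_list
        )
        # Architecture parameters: one logit per incoming path.
        self.path_logits = nn.Parameter(torch.zeros(len(in_channels_list)))
        self.op = nn.Sequential(  # stand-in for the searched candidate ops
            nn.Conv2d(out_channels, out_channels, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_channels), nn.ReLU(inplace=True),
        )

    def forward(self, inputs):  # inputs: list of tensors, one per predecessor
        w = F.softmax(self.path_logits, dim=0)
        x = sum(wi * p(t) for wi, p, t in zip(w, self.projs, inputs))
        return self.op(x)

# Usage: a block whose two possible predecessors have 16 and 24 channels.
block = RoutingBlock([16, 24], out_channels=32)
y = block([torch.randn(1, 16, 32, 32), torch.randn(1, 24, 32, 32)])
```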
Deep Multi-instance Learning with Dynamic Pooling
TLDR
We propose a novel dynamic pooling function for multi-instance learning (MIL) that can interpret the instance-to-bag relationship.
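The paper's exact dynamic pooling function is not reproduced here; the sketch below uses a learned, attention-style weighting as a stand-in to illustrate the general idea: pooling weights are computed per instance rather than fixed (as in max or mean pooling), so each instance's contribution to the bag can be inspected after training. All names (DynamicPool, the scoring MLP) are assumptions.

```python
# Sketch of dynamic (learned) pooling over a bag of instance features.
import torch
import torch.nn as nn

class DynamicPool(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.score = nn.Sequential(  # hypothetical per-instance scoring net
            nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, instances):                    # (n_instances, dim)
        w = torch.softmax(self.score(instances), dim=0)  # (n, 1) weights
        bag = (w * instances).sum(dim=0)             # (dim,) bag embedding
        return bag, w  # returned weights expose instance-to-bag roles
```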
Fast Neural Network Adaptation via Parameter Remapping and Architecture Search
TLDR
We propose a Fast Neural Network Adaptation (FNA) method that can adapt both the architecture and parameters of a seed network (e.g., a high-performing manually designed backbone) to a network with different depths, widths, or kernel sizes via a Parameter Remapping technique, making it possible to apply NAS to detection and segmentation tasks far more efficiently.
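A minimal sketch of the parameter-remapping idea for convolution weights, under the assumption of square kernels: the target layer is initialized from the seed layer by aligning kernel centers (cropping or zero-padding) and copying the overlapping channels. The function name and the zero-fill policy for extra channels are illustrative, not the paper's exact scheme.

```python
# Sketch: initialize a target conv weight from a seed conv weight with a
# different kernel size and channel counts (square kernels assumed).
import torch

def remap_conv_weight(seed_w, out_c, in_c, k):
    """seed_w: (O, I, K, K) tensor; returns a (out_c, in_c, k, k) tensor."""
    O, I, K, _ = seed_w.shape
    new_w = torch.zeros(out_c, in_c, k, k)
    # Width remapping: copy the overlapping leading channels
    # (extra channels stay zero here -- an illustrative policy).
    o, i = min(O, out_c), min(I, in_c)
    src = seed_w[:o, :i]
    # Kernel remapping: align the kernel centers.
    if k <= K:                           # crop the seed kernel's center
        s = (K - k) // 2
        new_w[:o, :i] = src[:, :, s:s + k, s:s + k]
    else:                                # embed the seed kernel in the center
        s = (k - K) // 2
        new_w[:o, :i, s:s + K, s:s + K] = src
    return new_w

# Usage: remap a 3x3, 64->64 seed kernel to a 5x5, 96->64 target kernel.
target = remap_conv_weight(torch.randn(64, 64, 3, 3), out_c=96, in_c=64, k=5)
```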
EAT-NAS: Elastic Architecture Transfer for Accelerating Large-scale Neural Architecture Search
TLDR
We propose an elastic architecture transfer mechanism for accelerating large-scale neural architecture search (EAT-NAS).
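A rough sketch of the transfer idea, assuming an evolutionary search over architectures encoded as lists of operation ids (a hypothetical encoding): the best architecture found at small scale seeds the population of the large-scale search, which then explores around it instead of starting from scratch.

```python
# Sketch: seed a large-scale evolutionary search with a small-scale result.
import random

def perturb(arch):
    """Mutate one gene of an architecture encoded as a list of op ids."""
    child = list(arch)
    idx = random.randrange(len(child))
    child[idx] = random.choice([0, 1, 2, 3])   # candidate op ids (assumed)
    return child

def init_population_from_seed(seed_arch, size):
    # Keep the seed itself plus gentle mutations around it, so the
    # large-scale search starts near a known-good region.
    return [list(seed_arch)] + [perturb(seed_arch) for _ in range(size - 1)]

population = init_population_from_seed(seed_arch=[1, 2, 0, 3], size=8)
```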
FNA++: Fast Network Adaptation via Parameter Remapping and Architecture Search
TLDR
We propose a Fast Network Adaptation (FNA++) method that can adapt both the architecture and parameters of a seed network (e.g., an ImageNet pre-trained network) to a network with different depths, widths, or kernel sizes via a parameter remapping technique, making it possible to apply NAS to segmentation and detection tasks far more efficiently.
EfficientPose: Efficient Human Pose Estimation with Neural Architecture Search
TLDR
In this paper, we propose an efficient framework for human pose estimation consisting of two parts: an efficient backbone and an efficient head.
Supplementary Material: Densely Connected Search Space for More Flexible Neural Architecture Search
For the search process, we randomly choose 100 classes from the original 1K-class ImageNet training set and sample 20% of the data of each class from this subset as the validation set.
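A minimal sketch of the split just described, assuming samples are available as (path, class_id) pairs; function and variable names are illustrative.

```python
# Sketch: pick 100 of the 1000 ImageNet classes, then hold out 20% of
# each chosen class as the search-time validation set.
import random
from collections import defaultdict

def make_search_split(samples, n_classes=100, val_frac=0.2, seed=0):
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for path, cls in samples:
        by_class[cls].append(path)
    chosen = rng.sample(sorted(by_class), n_classes)  # 100 of the 1K classes
    train, val = [], []
    for cls in chosen:
        paths = by_class[cls]
        rng.shuffle(paths)
        n_val = int(len(paths) * val_frac)            # 20% per class -> val
        val += [(p, cls) for p in paths[:n_val]]
        train += [(p, cls) for p in paths[n_val:]]
    return train, val
```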
ResizeMix: Mixing Data with Preserved Object Information and True Labels
TLDR
Data augmentation is a powerful technique for increasing the diversity of training data, which can effectively improve the generalization ability of neural networks in image recognition tasks.
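The TLDR above is only the paper's opening sentence; the method itself mixes a pair of training images by resizing the source image into a patch, pasting it at a random location in the target image, and mixing the labels by the pasted area ratio. A minimal sketch, assuming CHW image tensors and one-hot labels (the scale range is an assumption, not the paper's tuned value):

```python
# Sketch of ResizeMix-style mixing: resize the source image to a patch,
# paste it into the target, and mix labels by pasted-area ratio.
import torch
import torch.nn.functional as F

def resizemix(x_tgt, y_tgt, x_src, y_src, scale=(0.1, 0.8)):
    _, H, W = x_tgt.shape
    tau = torch.empty(1).uniform_(*scale).item()      # random resize ratio
    h, w = max(1, int(H * tau)), max(1, int(W * tau))
    patch = F.interpolate(x_src[None], size=(h, w), mode="bilinear",
                          align_corners=False)[0]     # whole source, resized
    top = torch.randint(0, H - h + 1, (1,)).item()
    left = torch.randint(0, W - w + 1, (1,)).item()
    x_mix = x_tgt.clone()
    x_mix[:, top:top + h, left:left + w] = patch      # paste resized source
    lam = (h * w) / (H * W)                           # label mixing weight
    y_mix = (1 - lam) * y_tgt + lam * y_src           # y_* are one-hot
    return x_mix, y_mix
```

Because the whole source image is kept (just resized), the pasted region always contains the labeled object, which is the sense in which object information and true labels are preserved.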