Publications
Progressive Differentiable Architecture Search: Bridging the Depth Gap Between Search and Evaluation
TLDR
This paper presents an efficient algorithm that allows the depth of searched architectures to grow gradually during training, and addresses the two issues this raises, heavier computational overheads and weaker search stability, using search space approximation and regularization, respectively.
PC-DARTS: Partial Channel Connections for Memory-Efficient Architecture Search
TLDR
This paper presents a novel approach, namely Partially-Connected DARTS, which samples a small part of the super-net to reduce redundancy in exploring the network space, thereby performing a more efficient search without compromising performance.
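The partial channel connection described above can be sketched in a few lines. This is a hedged illustration, not the paper's implementation: `partial_channel_mix`, the channels-first layout, and the random split are assumptions; PC-DARTS applies the candidate operation mix to roughly 1/K of the channels and bypasses the rest.

```python
import numpy as np

def partial_channel_mix(x, op, k=4, rng=None):
    """Apply candidate operation `op` to 1/k of the channels of a
    channels-first feature map and pass the rest through unchanged
    (illustrative sketch of a partial channel connection)."""
    rng = np.random.default_rng(rng)
    c = x.shape[0]
    idx = rng.permutation(c)                  # random channel split
    sel, rest = idx[: c // k], idx[c // k :]
    out = np.empty_like(x)
    out[sel] = op(x[sel])                     # operation on the sampled subset
    out[rest] = x[rest]                       # identity on the remainder
    return out

x = np.ones((8, 2, 2))                        # toy feature map: 8 channels
out = partial_channel_mix(x, lambda t: 2 * t, k=4, rng=0)
```

Because only a fraction of channels pass through the (memory-heavy) candidate operations, the super-net's activation memory shrinks roughly by the factor k.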
Actional-Structural Graph Convolutional Networks for Skeleton-Based Action Recognition
TLDR
The proposed AS-GCN achieves consistently large improvements over state-of-the-art methods and shows promising results for future pose prediction.
Picking Deep Filter Responses for Fine-Grained Image Recognition
TLDR
This paper proposes an automatic fine-grained recognition approach that is free of any object/part annotation at both training and testing stages, and conditionally picks deep filter responses to encode into the final representation, taking into account the importance of the filter responses themselves.
PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search
TLDR
This paper presents a novel approach, namely, Partially-Connected DARTS, which samples a small part of the super-network to reduce redundancy in exploring the network space, thereby performing a more efficient search without compromising performance.
DisturbLabel: Regularizing CNN on the Loss Layer
TLDR
An extremely simple algorithm that randomly replaces a part of the labels with incorrect values in each iteration, preventing the network training from over-fitting by implicitly averaging over exponentially many networks trained with different label sets.
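The label-disturbing step above is simple enough to sketch directly. A minimal NumPy sketch, assuming a flat array of integer class labels; the name `disturb_labels` and the `alpha` disturbance rate are illustrative, not the paper's exact API:

```python
import numpy as np

def disturb_labels(labels, num_classes, alpha=0.1, rng=None):
    """Replace a random fraction (~alpha) of labels with uniformly
    sampled class indices; the disturbed batch is then used for the
    current training iteration (illustrative DisturbLabel-style step)."""
    rng = np.random.default_rng(rng)
    labels = np.asarray(labels).copy()
    mask = rng.random(labels.shape) < alpha            # which labels to disturb
    labels[mask] = rng.integers(0, num_classes, int(mask.sum()))
    return labels

clean = disturb_labels(list(range(10)), num_classes=10, alpha=0.0)      # no-op
noisy = disturb_labels(list(range(10)), num_classes=10, alpha=1.0, rng=0)
```

Called once per iteration, each epoch effectively trains on a different noisy label set, which is what produces the implicit averaging effect described above.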
Variational Convolutional Neural Network Pruning
TLDR
A variational technique is introduced to estimate the distribution of a newly proposed parameter, called channel saliency, based on which redundant channels can be removed from the model via a simple criterion, resulting in significant size reduction and computation savings.
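The variational estimation itself is not reproduced here; as an illustration of the downstream step, once per-channel saliencies are available the "simple criterion" amounts to thresholding and slicing the weight tensor. Function name and threshold are assumptions for this sketch:

```python
import numpy as np

def prune_channels(weights, saliency, thresh=0.5):
    """Drop output channels whose estimated saliency falls below
    `thresh`, returning the smaller weight tensor and the keep mask
    (illustrative saliency-based pruning criterion)."""
    keep = np.asarray(saliency) >= thresh
    return weights[keep], keep

w = np.ones((4, 3, 3, 3))                 # conv weights: 4 output channels
pruned, keep = prune_channels(w, saliency=[0.9, 0.1, 0.6, 0.2])
```

Here two of the four output channels survive, so the layer's parameter count and its contribution to inference cost are halved.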
Unsupervised Person Re-Identification via Softened Similarity Learning
TLDR
The iterative training mechanism is followed but clustering is discarded, since it incurs loss from hard quantization; its only product, image-level similarity, can easily be replaced by pairwise computation and a softened classification task.
Adversarial Training Towards Robust Multimedia Recommender System
TLDR
This paper proposes a novel solution named Adversarial Multimedia Recommendation (AMR), which yields a more robust multimedia recommender model through adversarial learning: the model is trained to defend against an adversary that adds perturbations to the target image with the purpose of decreasing the model's accuracy.
Learning Channel-Wise Interactions for Binary Convolutional Neural Networks
TLDR
The proposed CI-BCNN imposes channel-wise priors on the intermediate feature maps through an interacted bitcount function; extensive experiments show that it outperforms state-of-the-art binary convolutional neural networks at lower computational and storage cost.