• Publications
Collaborative Distillation for Ultra-Resolution Universal Style Transfer
TLDR
A new knowledge distillation method for encoder-decoder based neural style transfer that reduces the number of convolutional filters and, for the first time, achieves ultra-resolution (over 40 megapixels) universal style transfer on a 12GB GPU.
Structured Probabilistic Pruning for Convolutional Neural Network Acceleration
TLDR
A novel progressive parameter pruning method, named Structured Probabilistic Pruning (SPP), which prunes the weights of convolutional layers in a probabilistic manner and can be applied directly to accelerate multi-branch CNNs, such as ResNet, without specific adaptations.
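The SPP summary above describes adjusting per-filter pruning probabilities progressively rather than cutting weights outright. A minimal NumPy sketch of that idea follows; the L1-norm importance criterion, the fixed probability increment `delta`, and the function name `spp_step` are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def spp_step(weights, prune_probs, target_ratio, delta=0.05):
    """One simplified SPP-style update (illustrative, not the paper's algorithm):
    raise the pruning probability of the least-important filters, lower it for
    the rest, then sample a binary mask from those probabilities."""
    importance = np.abs(weights).sum(axis=1)      # L1 norm per filter (assumed criterion)
    order = np.argsort(importance)                # least important filters first
    k = int(len(weights) * target_ratio)          # number of filters targeted for pruning
    prune_probs = prune_probs.copy()
    prune_probs[order[:k]] = np.clip(prune_probs[order[:k]] + delta, 0.0, 1.0)
    prune_probs[order[k:]] = np.clip(prune_probs[order[k:]] - delta, 0.0, 1.0)
    # Probabilistic masking: a filter survives this step with prob (1 - prune_prob)
    mask = (np.random.rand(len(weights)) >= prune_probs).astype(weights.dtype)
    return weights * mask[:, None], prune_probs
```

Because the mask is resampled each step, an early mistake is recoverable: a filter pruned once can reappear while its probability is still below 1, which is the motivation for the probabilistic (rather than hard) pruning described above.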
Structured Pruning for Efficient ConvNets via Incremental Regularization
TLDR
A novel regularization-based pruning method, named IncReg, that incrementally assigns different regularization factors to different weights based on their relative importance, achieving results comparable to or better than the state of the art.
Triplet Distillation For Deep Face Recognition
TLDR
This work proposes an enhanced version of triplet loss, named triplet distillation, which exploits the capability of a teacher model to transfer the similarity information to a student model by adaptively varying the margin between positive and negative pairs.
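The triplet distillation summary above hinges on one idea: the margin between positive and negative pairs is no longer a fixed hyperparameter but adapts to how far apart the teacher places them. A hedged NumPy sketch, where the sigmoid mapping from teacher distance gap to margin and the bounds `m_min`/`m_max` are assumptions for illustration:

```python
import numpy as np

def triplet_distillation_loss(s_a, s_p, s_n, t_a, t_p, t_n, m_min=0.2, m_max=0.7):
    """Illustrative triplet distillation loss: the teacher's anchor-negative vs
    anchor-positive distance gap sets an adaptive margin for the student.
    The sigmoid mapping and margin bounds are assumptions, not the paper's exact form."""
    d = lambda x, y: np.linalg.norm(x - y, axis=-1)       # Euclidean distance per pair
    gap = d(t_a, t_n) - d(t_a, t_p)                       # teacher's notion of triplet hardness
    margin = m_min + (m_max - m_min) / (1.0 + np.exp(-gap))  # squash gap into (m_min, m_max)
    # Standard triplet hinge, but with the per-triplet adaptive margin
    return np.maximum(d(s_a, s_p) - d(s_a, s_n) + margin, 0.0).mean()
```

Easy triplets (large teacher gap) get a wide margin and hard ones a narrow margin, which is how the similarity information is transferred from teacher to student.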
Three-Dimensional Convolutional Neural Network Pruning with Regularization-Based Method
TLDR
A three-dimensional regularization-based neural network pruning method that assigns different regularization parameters to different weight groups according to their importance to the network, estimated from the redundancy and computation cost of each layer.
Structured Pruning for Efficient Convolutional Neural Networks via Incremental Regularization
TLDR
This work proposes a novel regularization-based pruning method, named IncReg, which incrementally assigns different regularization factors to different weights based on their relative importance and can equip existing structured pruning methods for further acceleration with negligible accuracy loss.
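Both IncReg entries above describe the same mechanism: rather than pruning a weight group in one shot, its regularization factor grows a little each iteration until the group has been driven to zero. A minimal NumPy sketch under stated assumptions; the mean-magnitude importance criterion, the linear increment `step`, and the saturation ceiling `reg_ceil` are illustrative, not the paper's exact schedule.

```python
import numpy as np

def increg_update(weights, reg, target_ratio, step=1e-4, reg_ceil=1.0, lr=0.01):
    """One illustrative IncReg-style iteration (a sketch, not the paper's method):
    raise the per-group L2 penalty of the least-important weight groups, apply
    the resulting extra decay, and prune groups whose penalty has saturated."""
    importance = np.abs(weights).mean(axis=1)     # mean magnitude per group (assumed criterion)
    order = np.argsort(importance)                # least important groups first
    k = int(len(weights) * target_ratio)          # number of groups targeted for removal
    reg = np.minimum(reg.copy(), reg_ceil)
    reg[order[:k]] += step                        # incrementally punish unimportant groups
    weights = weights - lr * reg[:, None] * weights   # extra weight decay from the penalty
    weights[reg >= reg_ceil] = 0.0                # prune groups whose penalty saturated
    return weights, reg
```

The incremental schedule gives the network time to reroute capacity through the surviving groups before any group is finally removed, which is what keeps the accuracy loss negligible.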
Concise Convolutional Neural Network for Crowd Counting
TLDR
A concise and effective CNN model with only five convolutional layers that delivers strong transfer learning and real-time performance, built on geometry-adaptive kernels that alleviate perspective distortion.
Supplementary Material: Aligned Structured Sparsity Learning for Efficient Image Super-Resolution
TLDR
The spontaneous adaptation of the retained WN scale parameters protects the network from catastrophic expressivity damage, hence the performance superiority of the ASSL method over its counterparts.
R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
TLDR
Recent research on Neural Radiance Fields shows encouraging potential for representing complex scenes with neural networks, but their prohibitive inference time is a major drawback; this work distills a NeRF into a Neural Light Field (R2L) for efficient novel view synthesis.
Tensor neural networks via circulant convolution
...