Publications
Switching Convolutional Neural Network for Crowd Counting
TLDR
A switching convolutional neural network is proposed that maps a given crowd scene to its density, leveraging the variation of crowd density within an image to improve the accuracy and localization of the predicted crowd count.
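The switching idea in the summary above can be sketched as a classifier routing each image patch to one of several density regressors. The regressors, thresholds, and patch statistic below are illustrative assumptions, not the paper's trained CNN columns or switch classifier:

```python
import numpy as np

rng = np.random.default_rng(1)

# Three stand-in density regressors, each notionally tuned to a density
# regime (purely illustrative -- the paper uses CNN columns with different
# receptive fields, not closed-form functions).
regressors = {
    "low":  lambda p: 0.5 * p.sum(),
    "mid":  lambda p: 1.0 * p.sum(),
    "high": lambda p: 2.0 * p.sum(),
}

def switch(patch):
    """Stand-in for the switch classifier: route each patch to the
    regressor suited to its apparent crowd density."""
    m = patch.mean()
    return "low" if m < 0.3 else ("mid" if m < 0.6 else "high")

def count_image(img, patch=16):
    """Count by summing per-patch estimates from the routed regressors."""
    total = 0.0
    for r in range(0, img.shape[0], patch):
        for c in range(0, img.shape[1], patch):
            p = img[r:r + patch, c:c + patch]
            total += regressors[switch(p)](p)
    return total

estimate = count_image(rng.random((32, 32)))
```

Routing per patch is what lets the model exploit density variation *within* a single image, rather than committing to one regressor for the whole scene.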
SeamSeg: Video Object Segmentation Using Patch Seams
TLDR
The proposed approach for video object segmentation using patch seams across frames outperforms state-of-the-art supervised and unsupervised algorithms on benchmark datasets.
DeepFix: A Fully Convolutional Neural Network for Predicting Human Eye Fixations
TLDR
This paper proposes DeepFix, a fully convolutional neural network that models the bottom-up mechanism of visual attention via saliency prediction, and evaluates the model on multiple challenging saliency datasets, showing that it achieves state-of-the-art results.
CrowdNet: A Deep Convolutional Network for Dense Crowd Counting
TLDR
This work uses a combination of deep and shallow fully convolutional networks to predict the density map for a given crowd image, and shows that this combination effectively captures both the high-level semantic information and the low-level features necessary for crowd counting under large scale variations.
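The deep/shallow combination described above can be sketched with stand-in feature extractors fused into a single density map whose integral is the count. The pooling factors and the weighted fusion here are illustrative assumptions, not CrowdNet's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def downsample(img, factor):
    """Average-pool by `factor` -- a stand-in for a conv/pool column."""
    h, w = img.shape[0] // factor, img.shape[1] // factor
    return img[:h * factor, :w * factor].reshape(h, factor, w, factor).mean(axis=(1, 3))

def predict_density(img, w_deep=0.5, w_shallow=0.5):
    # "Deep" column: aggressive downsampling, large receptive field,
    # capturing high-level semantics.
    deep = downsample(img, 8)
    # "Shallow" column: mild downsampling preserving low-level detail,
    # then pooled to the deep column's resolution for fusion.
    shallow = downsample(downsample(img, 2), 4)
    # In the real model the fusion is a learned 1x1 convolution; here it is
    # a fixed weighted sum for illustration.
    return w_deep * deep + w_shallow * shallow

img = rng.random((64, 64))
density = predict_density(img)
count = density.sum()          # crowd count = integral of the density map
```

The key design choice the TLDR points at is that neither column alone suffices under large scale variation: the deep column handles large heads/low density, the shallow column dense small-scale crowds.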
DeepFuse: A Deep Unsupervised Approach for Exposure Fusion with Extreme Exposure Image Pairs
TLDR
An unsupervised deep learning framework for multi-exposure fusion (MEF) is proposed that uses a no-reference quality metric as the loss function, with a novel CNN architecture trained to learn the fusion operation without a reference ground-truth image.
Fast Feature Fool: A data independent approach to universal adversarial perturbations
TLDR
This paper proposes a novel data-independent approach to generate image-agnostic perturbations for a range of CNNs trained for object recognition, and shows that these perturbations are transferable across multiple network architectures trained on either the same or different data.
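The data-independent idea above can be sketched as gradient ascent on a perturbation that maximizes activations across a network's layers, using only the weights and never any input images. The toy two-layer ReLU network, step size, and summed-activation objective below are illustrative assumptions, not the paper's exact objective or models:

```python
import numpy as np

rng = np.random.default_rng(2)
# Toy two-layer ReLU "network" with random weights (purely illustrative --
# the method targets real trained CNNs, not this stand-in).
W1 = rng.standard_normal((32, 64)) * 0.1
W2 = rng.standard_normal((16, 32)) * 0.1
eps = 0.1          # max-norm budget for the universal perturbation

def total_activation(v):
    a1 = np.maximum(W1 @ v, 0.0)
    a2 = np.maximum(W2 @ a1, 0.0)
    return a1.sum() + a2.sum()

v = rng.uniform(-eps, eps, 64)
f0 = total_activation(v)
lr = 0.05
for _ in range(200):
    z1 = W1 @ v
    a1 = np.maximum(z1, 0.0)
    z2 = W2 @ a1
    # Gradient of the summed activations w.r.t. v (manual ReLU backprop),
    # computed with no input data at all -- only the network weights.
    g_a1 = 1.0 + W2.T @ (z2 > 0).astype(float)
    g_v = W1.T @ ((z1 > 0).astype(float) * g_a1)
    v = np.clip(v + lr * g_v, -eps, eps)   # project onto the max-norm ball
f1 = total_activation(v)
```

A perturbation that over-excites features at every layer disrupts whatever downstream computation a classifier builds on them, which is why such perturbations transfer across architectures.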
Data-free Parameter Pruning for Deep Neural Networks
TLDR
It is shown that similar neurons are redundant, and a systematic way to remove them is proposed that can be applied on top of most networks with a fully connected layer to give a smaller network.
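The neuron-removal step described above can be sketched for a single fully connected layer: find the pair of hidden neurons with the most similar incoming weights and fold one into the other. The saliency expression below is a plausible reading of "similar neurons are redundant", not necessarily the paper's exact criterion:

```python
import numpy as np

def prune_similar_neuron(W_in, W_out):
    """Remove one hidden neuron by folding it into its most similar peer --
    no training data needed (hence "data-free").

    W_in:  (hidden, inputs)  incoming weight rows of the hidden layer
    W_out: (outputs, hidden) outgoing weight columns of the next layer
    """
    h = W_in.shape[0]
    best, pair = np.inf, None
    for i in range(h):
        for j in range(i + 1, h):
            # Merge cost: zero when the two incoming weight vectors
            # coincide, in which case the merge is exact.
            s = np.sum((W_in[i] - W_in[j]) ** 2) * np.sum(W_out[:, j] ** 2)
            if s < best:
                best, pair = s, (i, j)
    i, j = pair
    W_out = W_out.copy()
    W_out[:, i] += W_out[:, j]             # survivor absorbs j's contribution
    keep = [k for k in range(h) if k != j]
    return W_in[keep], W_out[:, keep]
```

When two neurons share identical incoming weights their activations agree on every input, so adding the removed neuron's outgoing weights to the survivor leaves the network's function unchanged — no retraining data required.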
Generalizable Data-Free Objective for Crafting Universal Adversarial Perturbations
TLDR
This paper presents a novel, generalizable and data-free approach for crafting universal adversarial perturbations, and shows that current deep learning models are at increased risk, since the objective generalizes across multiple tasks without requiring training data for crafting the perturbations.
Divide and Grow: Capturing Huge Diversity in Crowd Images with Incrementally Growing CNN
TLDR
A growing CNN is proposed that can progressively increase its capacity to account for the wide variability seen in crowd scenes; it achieves higher count accuracy on major crowd datasets, and the characteristics of the specialties mined automatically by the model are analysed.
Zero-Shot Knowledge Distillation in Deep Networks
TLDR
This paper synthesizes Data Impressions from a complex Teacher model and utilizes them as surrogates for the original training data samples to transfer its learning to a Student via knowledge distillation, and shows that this framework achieves generalization performance competitive with distillation using the actual training data samples on multiple benchmark datasets.