Physically Realizable Adversarial Examples for LiDAR Object Detection

@inproceedings{Tu2020PhysicallyRA,
  title={Physically Realizable Adversarial Examples for LiDAR Object Detection},
  author={J. Tu and Mengye Ren and Siva Manivasagam and Ming Liang and Bin Yang and R. Du and Frank Cheng and R. Urtasun},
  booktitle={2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2020},
  pages={13713--13722}
}
Modern autonomous driving systems rely heavily on deep learning models to process point cloud sensory data; meanwhile, deep models have been shown to be susceptible to adversarial attacks with visually imperceptible perturbations. Although this poses a security concern for the self-driving industry, 3D perception has seen very little exploration of such attacks, as most adversarial work has targeted only 2D flat images. In this paper, we address this issue and present…
