Bridging the Gap: Balancing Human Perception and Detector Attention in Adversarial Attacks

Abstract

Adversarial attacks on deep neural networks have garnered significant attention, with recent studies focusing on generating intricate, colorful patterns designed to mislead models. However, such patterns are often easily discernible to human observers. To address this limitation, we propose BHD, a novel adversarial camouflage framework that simultaneously mitigates human perceptibility and reduces detector attention. Our approach derives a base pattern from the surroundings of the target object, making it less conspicuous to the human eye. We further design a loss function, integrated with a pre-trained vision encoder and a fine-tuned projector, to optimize the adversarial pattern, enabling effective attacks on detectors while maintaining robust camouflage across diverse environments. In addition, we incorporate human-aligned large multimodal models as objective metrics to quantify the impact on human perception. Extensive experimental results demonstrate that our framework achieves a superior balance between human imperceptibility and model deception, outperforming state-of-the-art methods.
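
The sketch below illustrates, in PyTorch, the kind of optimization the abstract describes: a surroundings-derived base pattern is refined under a combined objective that suppresses detector confidence while keeping the pattern close to the scene in a pre-trained encoder's feature space. The detector, vision encoder, projector, rendering function, and loss weights here are hypothetical stand-ins, not the released BHD implementation.

```python
# Minimal sketch of surroundings-aware adversarial pattern optimization.
# All module names, weights, and the paste-based renderer are illustrative assumptions.
import torch
import torch.nn.functional as F


def apply_pattern(scene, pattern, top=64, left=64):
    """Paste the pattern into a fixed region of the scene (placeholder for a real renderer)."""
    out = scene.clone()
    _, _, h, w = pattern.shape
    out[:, :, top:top + h, left:left + w] = pattern
    return out


def optimize_pattern(base_pattern, scene, detector, vision_encoder, projector,
                     steps=200, lr=0.01, w_attack=1.0, w_camouflage=0.5):
    """Optimize an additive perturbation on top of a surroundings-derived base pattern."""
    delta = torch.zeros_like(base_pattern, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        pattern = torch.clamp(base_pattern + delta, 0.0, 1.0)
        rendered = apply_pattern(scene, pattern)

        # Attack term: suppress the detector's confidence on the target object.
        scores = detector(rendered)          # assumed to return objectness/class scores
        loss_attack = scores.max()

        # Camouflage term: keep the pattern's embedding close to the surroundings'
        # embedding under the frozen vision encoder and projector.
        feat_pattern = projector(vision_encoder(pattern))
        feat_scene = projector(vision_encoder(scene)).detach()
        loss_camouflage = 1.0 - F.cosine_similarity(feat_pattern, feat_scene, dim=-1).mean()

        loss = w_attack * loss_attack + w_camouflage * loss_camouflage
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    return torch.clamp(base_pattern + delta, 0.0, 1.0).detach()
```

In this sketch only the perturbation is trained; the detector, encoder, and projector stay frozen, so the camouflage term acts purely as a feature-space anchor to the surrounding scene rather than altering any model weights.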

Publication
2025 IEEE International Conference on Multimedia and Expo
Mingye Xie