
Stealthy Low-frequency Backdoor Attack against Deep Neural Networks

Published 10 May 2023 in cs.CR (arXiv:2305.09677v1)

Abstract: Deep neural networks (DNNs) have gained popularity in various scenarios in recent years. However, their excellent ability to fit complex functions also makes them vulnerable to backdoor attacks. Specifically, a backdoor can remain hidden indefinitely until activated by a sample carrying a specific trigger, making it highly concealed. Nevertheless, existing backdoor attacks operate in the spatial domain, i.e., the poisoned images are generated by adding perturbations to the original images, which are easy to detect. To bring the potential of backdoor attacks into full play, we propose the low-pass attack, a novel attack scheme that uses a low-pass filter to inject a backdoor in the frequency domain. Unlike traditional poisoned-image generation methods, our approach reduces high-frequency components and preserves the original images' semantic information instead of adding perturbations, improving its ability to evade current defenses. Besides, we introduce a "precision mode" that makes our backdoor trigger only at a specified level of filtering, which further improves stealthiness. We evaluate our low-pass attack on four datasets and demonstrate that even at a pollution rate of 0.01, we can perform a stealthy attack without trading off attack performance. Moreover, our backdoor attack successfully bypasses state-of-the-art defense mechanisms. We also compare our attack with existing backdoor attacks and show that our poisoned images are nearly invisible and retain higher image quality.
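The core trigger mechanism described above, suppressing high-frequency components rather than adding a perturbation, can be sketched with a standard FFT-based low-pass filter. This is a minimal illustration of the general technique, not the authors' exact implementation; the function name, the circular-mask design, and the `cutoff_ratio` parameter are assumptions for the sketch:

```python
import numpy as np

def low_pass_poison(image: np.ndarray, cutoff_ratio: float = 0.25) -> np.ndarray:
    """Suppress high-frequency components of a 2-D image in [0, 1].

    cutoff_ratio: fraction of the half-spectrum (measured from the
    center, i.e. the DC component) that is kept. Smaller values
    remove more high-frequency detail. Hypothetical parameterization.
    """
    # Transform to the frequency domain and center the zero frequency.
    spectrum = np.fft.fftshift(np.fft.fft2(image))

    h, w = image.shape
    cy, cx = h // 2, w // 2
    radius = cutoff_ratio * min(h, w) / 2

    # Circular mask: keep only frequencies within `radius` of the center.
    yy, xx = np.ogrid[:h, :w]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2

    # Zero out high frequencies, then invert the transform;
    # residual imaginary parts are numerical noise.
    poisoned = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
    return np.clip(poisoned, 0.0, 1.0)

# Example: poison a random 32x32 grayscale image.
img = np.random.rand(32, 32)
poisoned = low_pass_poison(img, cutoff_ratio=0.25)
```

Because the trigger is the filtering operation itself, the poisoned image stays semantically close to the original (no added pattern), which is the property the abstract credits for evading spatial-domain defenses.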

Citations (1)
