Intentionally Sabotaging Facial Recognition AI

Researchers at the University of Chicago recently published a paper (1 MB PDF) outlining a novel attack against facial recognition systems, which they call Fawkes.

The general idea is that some facial recognition systems, such as Clearview.ai, scour the Internet for uploaded images and use those to generate facial recognition profiles for millions of people. If I upload a picture of myself to Facebook, for example, a facial recognition system could potentially associate that picture with my identity and use it to distinguish my face from others more accurately.

But when a system relies on data gathered from public sources, there is always the possibility of intentionally supplying poisoned data designed to reduce the system's effectiveness. In this case, Fawkes makes changes to a photo before it is uploaded that are generally imperceptible to human beings, but that will interfere with any facial recognition model trained on it.
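To make the idea of a pixel-level "cloak" concrete, here is a minimal sketch in PyTorch. It nudges a photo so that its embedding under a feature extractor drifts toward a different face, while keeping the per-pixel change small. The tiny `feature_extractor` network here is a hypothetical stand-in for a pretrained face-embedding model, and the whole thing is an illustration of the general technique, not the researchers' actual Fawkes implementation.

```python
# Sketch of pixel-level image cloaking (illustrative only, not the Fawkes code).
import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained face-embedding network.
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(32, 128),
)
feature_extractor.eval()

def cloak(image, target_image, budget=0.03, steps=100, lr=0.01):
    """Perturb `image` so its embedding moves toward `target_image`'s,
    while keeping each pixel change within `budget` (roughly imperceptible)."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)
    with torch.no_grad():
        target_feat = feature_extractor(target_image)
    for _ in range(steps):
        optimizer.zero_grad()
        perturbed = (image + delta).clamp(0, 1)
        feat = feature_extractor(perturbed)
        # Pull the photo's embedding toward the target identity's embedding.
        loss = nn.functional.mse_loss(feat, target_feat)
        loss.backward()
        optimizer.step()
        # Keep the perturbation small so it stays hard to notice.
        with torch.no_grad():
            delta.clamp_(-budget, budget)
    return (image + delta).detach().clamp(0, 1)

# Example: cloak a random "photo" toward a random "target" face.
photo = torch.rand(1, 3, 112, 112)
target = torch.rand(1, 3, 112, 112)
cloaked = cloak(photo, target)
```

A model trained on many such cloaked photos learns features that point away from the user's true appearance, which is why later, unmodified photos of the same person tend to be misidentified.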

According to the paper's abstract, the researchers achieved high success rates at foiling existing facial recognition models:

In this paper, we propose Fawkes, a system that helps individuals inoculate their images against unauthorized facial recognition models. Fawkes achieves this by helping users add imperceptible pixel-level changes (we call them “cloaks”) to their own photos before releasing them. When used to train facial recognition models, these “cloaked” images produce functional models that consistently cause normal images of the user to be misidentified. We experimentally demonstrate that Fawkes provides 95+% protection against user recognition regardless of how trackers train their models. Even when clean, uncloaked images are “leaked” to the tracker and used for training, Fawkes can still maintain an 80+% protection success rate. We achieve 100% success in experiments against today’s state-of-the-art facial recognition services. Finally, we show that Fawkes is robust against a variety of countermeasures that try to detect or disrupt image cloaks.
