On Monday, a federal judge ruled that a class action lawsuit against Facebook can move forward, paving the way for what could turn out to be a costly legal battle for the company.
As Reuters reports, the lawsuit alleges that Facebook improperly collected and stored users’ biometric data. It was originally filed in 2015 by Facebook users in Illinois, which passed the Biometric Information Privacy Act (BIPA) in 2008. The law regulates the collection and storage of biometric data, and requires that a company obtain an individual’s written consent before collecting such information.
According to the lawsuit, Facebook ran afoul of BIPA when it began using a tool called Tag Suggestions, which was originally rolled out in 2011. Like many Facebook features, it’s designed to make your user experience better while also providing the company with your data—in this case, very specific facial features.
Here’s Facebook’s explanation of Tag Suggestions, from a 2010 blog post:
When you or a friend upload new photos, we use face recognition software—similar to that found in many photo editing tools—to match your new photos to other photos you're tagged in. We group similar photos together and, whenever possible, suggest the name of the friend in the photos.
There are a few questions at play here: How does Tag Suggestions actually work? What is the company doing with all that data? Who has access to it? Where is it stored? What kind of algorithmic sorcery is it using?
But like many things at Facebook, it’s difficult to find concrete answers. Here’s how the tool works its magic, according to one Help Center post:
Our technology analyzes the pixels in photos and videos, such as your profile picture and photos and videos that you’ve been tagged in, to calculate a unique number, which we call a template. We compare other photos and videos on Facebook to this template and if we find a match we’ll recognize you. If you are untagged from a photo, or video, information from those untagged photos and videos is no longer used in the template. If your face recognition setting is set to off, we delete the template.
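In other words, the system boils each face down to a numeric "template" and compares new photos against it. A minimal sketch of that general approach (not Facebook's actual code; the function names, toy vectors, and threshold here are all illustrative assumptions) might look like this, with each face represented as an embedding vector and matches decided by cosine similarity:

```python
# Illustrative sketch of template-based face matching. In a real system,
# the embedding vectors would come from a face-recognition model; here
# they are hypothetical toy values.
import math

def cosine_similarity(a, b):
    # Measures how closely two embedding vectors point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def build_template(embeddings):
    # Average the embeddings of a user's tagged photos into one template.
    # Untagging a photo would mean recomputing without that embedding.
    n = len(embeddings)
    return [sum(vals) / n for vals in zip(*embeddings)]

def matches(template, embedding, threshold=0.9):
    # Suggest a tag only if the new photo is similar enough to the template.
    return cosine_similarity(template, embedding) >= threshold

# Toy usage: two tagged photos of the same person form the template.
tagged = [[0.9, 0.1, 0.2], [0.85, 0.15, 0.25]]
template = build_template(tagged)
print(matches(template, [0.88, 0.12, 0.22]))  # similar face → True
print(matches(template, [0.1, 0.9, 0.3]))     # different face → False
```

Note how the quoted deletion behavior maps onto this structure: dropping the template is just discarding the stored vector, which is why turning the setting off can erase the match data without touching the photos themselves.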
The other questions are trickier. As Bloomberg points out, Facebook has also developed a facial recognition program called DeepFace, which the company claimed in 2014 was terrifyingly accurate at identifying faces in Labeled Faces in the Wild, a standard public benchmark dataset. It noted that in training the program, researchers used “the largest facial dataset to-date, an identity labeled dataset of four million facial images belonging to more than 4,000 identities.” There’s little out there directly tying a tool like Tag Suggestions to DeepFace, but the overlap is jarring.
Meanwhile, the company remains steadfast in its belief that, when it comes to the class action lawsuit, it has done nothing wrong.
“We are reviewing the ruling,” a Facebook spokesperson wrote in an email to Motherboard. “We continue to believe the case has no merit and will defend ourselves vigorously.” The spokesperson also argued that the company has been open about how the feature works, as well as how it can be turned off. (Regardless of any alleged openness, it’s unclear whether telling users a feature exists and how they can opt out satisfies BIPA’s requirement for explicit consent.)
Facebook has tried several times to quash the lawsuit. Its most recent appeal—in which it argued the plaintiffs failed to prove they had been harmed by the company’s biometric data collection—was denied in February.
As several outlets have noted, the potential financial consequences for Facebook are steep. A class action designation opens the lawsuit up to far more people—more than six million, according to lawyers for the plaintiffs.
The ruling came down on the same day that Facebook published yet another explanation about what it does with your data. Menacingly titled “Hard Questions: What Data Does Facebook Collect When I’m Not Using Facebook, and Why?” the post admitted that the company receives information about people even when they’re not logged in, or even if they don’t have a Facebook account at all.
At least Zuck’s security is a priority.