Pando

The Facebook facial recognition scandal bubbles back up in court

But is it still a scandal at all?

By Dan Raile, written on August 25, 2015

From The Surveillance Valley Desk

The following quotation comes from a field report in Ray Bradbury’s FBI file, dated 1959:

“Informant declared that a number of science fiction writers have created illusions with regard to the impossibility of continuing world affairs in an organized manner now or in the future through the medium of futuristic stories concerned with the potentialities of science.

“Informant stated that the general aim of these science fiction writers is to frighten the people into a state of paralysis or psychological incompetence bordering on hysteria which would make it very possible to conduct a Third World War in [sic] which the American people would seriously believe could not be won since their morale had been seriously destroyed.”

A deranged but beautiful ode to the power of pulp fiction, filed away and forgotten in the banks of the war-hungry, paranoid machine. Ultimately the Bureau determined it had nothing actionable on Ray Bradbury.

Anyway, it seems somehow that everyone in this little vignette has been vindicated: The Informant gives a fairly good prediction of the upcoming war in Southeast Asia; the Agent and his masters and the machine they served never really lost their grip on “continuing world affairs in an organized manner”; and many of the illusions spun by sci-fi writers have come to life. To some degree they’re still frightening, but the edge is coming off.

One science that has escaped from fiction into the world, and is currently in full fruit: machine facial recognition.

In recent years it has become both highly accurate (“superhuman” by some estimations) and widely used and available – face recognition being an especially well-developed application of innovations in deep learning and neural nets. It’s finally as banal as billboard advertising.

Whether society is ready or not, it’s here. Facebook, Google, and other Web 2.0 giants have an edge on the field, since they can feed their learning machines the most nutritious data sets (and since they’ve gobbled up the biggest names in machine learning). But they’re far from alone. These days one doesn’t need a private data trove or world-renowned PhDs to start recognizing people the machine way.

The present situation pretty well bears out a 2011 study from Carnegie Mellon, which showed that publicly available image sets coupled with off-the-shelf facial recognition software were sufficient not only to identify people on the street but also to unearth much of their sensitive information with further statistical techniques. The authors, who were able to infer Social Security numbers from facial images through a phone app they developed, concluded:

However, considering the technological trends in cloud computing, face recognition accuracy, and online self-disclosures, it is hard not to conclude that what today we presented as a proof-of-concept in our study, tomorrow may become as common as everyday's text-based search engine queries.

And here we are.
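
To make “off-the-shelf” concrete, here is a rough sketch of the kind of matching the CMU work describes: pair a handful of publicly posted, labeled photos with an unlabeled snapshot and let the software pick the closest face. It uses the open-source face_recognition Python library; the library choice, filenames, and distance threshold are illustrative assumptions, not details of the CMU study.

```python
# A minimal sketch of "off-the-shelf" face matching, in the spirit of the CMU
# proof-of-concept: match an unlabeled photo against publicly posted, labeled
# photos. The library, file paths, and threshold are assumptions for
# illustration, not the tools the researchers used.
import face_recognition

# Publicly available, labeled photos (hypothetical filenames).
known_photos = {
    "alice": "public_profiles/alice.jpg",
    "bob": "public_profiles/bob.jpg",
}

# Build one face encoding (a 128-dimensional vector) per known person.
known_names, known_encodings = [], []
for name, path in known_photos.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:  # skip photos where no face was detected
        known_names.append(name)
        known_encodings.append(encodings[0])

# An unlabeled photo, e.g. a snapshot taken on the street (hypothetical file).
unknown = face_recognition.load_image_file("street_snapshot.jpg")
for encoding in face_recognition.face_encodings(unknown):
    # Compare against every known face; a smaller distance is a better match.
    distances = face_recognition.face_distance(known_encodings, encoding)
    best = min(range(len(distances)), key=lambda i: distances[i])
    if distances[best] < 0.6:  # the library's commonly used default tolerance
        print(f"Probable match: {known_names[best]} (distance {distances[best]:.2f})")
    else:
        print("No confident match")
```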

It’s still just as easy to imagine situations in which strong, ubiquitous facial recognition software leads to serious ethical concerns. And there is still just as little recourse for those made uncomfortable by the innovation.

It was Facebook that first tore the Band-Aid off the social norms around commercial facial recognition software, when in 2009 it deployed the Photo Tagger feature as a default setting. By and by, the din of discomfort grew: Al Franken subjected the company to a Senate Judiciary subcommittee hearing, European regulators stood firm in their opprobrium, and Facebook eventually pulled the product from its network. Until it put it back in, quietly, in 2013. Where it lives today. To paraphrase Mark Zuckerberg: the norm evolved.

Before any of this happened, back in 2008, the Illinois legislature passed the Biometric Information Privacy Act. The Act unambiguously states that private companies cannot gather biometric information, including faceprints, on residents of Illinois without first informing them of the fact, purpose, and duration of this collection; securing their informed consent; and publishing a written statement of the company’s biometric identification policy. The law’s thorniest provision requires companies to destroy biometric identifiers within three years of a user’s last login. Texas subsequently passed a similar law.

Last week, a class-action suit challenging Facebook’s Photo Tagger under the Illinois law entered U.S. District Court in San Francisco. The Biometric IPA, and the class-action suit stemming from it, seem fairly straightforward. Facebook didn’t get user consent before launching Photo Tagger, and the law states that, without such consent, “No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person’s or a customer’s biometric identifier or biometric information.”

The suit was filed this spring in Illinois, and Facebook got it removed to San Francisco in July. A case management hearing has been set for mid-November.

Facebook has yet to respond directly to the complaint, and didn’t respond to requests for comment. Somehow, though the Illinois law has been in effect since 2008, and was in place during all that Photo Tagger controversy a few years back, this seems to be the first case to test it.

“We’ve never heard of another challenge,” said the man who answered the phone at the ACLU press desk in Chicago yesterday.

Though it comes late, the suit could trip up the sweep of commercial facial recognition software into the mainstream of the web, enough to prompt Facebook et al. to accept the need to obtain explicit consent rather than miss out on the biometric data of the entire populations of Chicago and Houston.

“Unless these cases fail outright, I suspect this law is going to do the same thing for biometric information that the California Online Privacy Act has done for Internet privacy more generally. Every company will have to comply no matter where they are located,” said Aaron Williamson, a partner at Tor Eklund P.C., who advises software companies on their privacy policies. Williamson noted that a similar case has been lodged against Shutterfly in Illinois, and said he expected more to come. “I expect these cases will settle.”

This prospect was on the table in the course of an 18-month-long series of meetings organized by the Department of Commerce to get technology companies and privacy advocates to agree to a voluntary code of facial recognition conduct. That process seems to have come to an end in June when all the privacy advocates threw their hands up and walked away.  

In the end, users everywhere may win the right to click a button to agree. But other concerns about biometric recognition could linger.

For one thing, as the CMU researchers demonstrated, publicly available imagery provides enough fodder for startlingly accurate identification.

For another thing, some large American police departments have recently been outfitted with mobile facial recognition devices, as the New York Times reported. The FBI is determined to implement its Next Generation Identification program across the country, and with the simultaneous push to put body cameras on every officer, how soon will the biometrics (faceprint, pulse, gestures, voiceprint) of every person a cop encounters be entered into the database and sorted? Do people want that?

Both the Illinois and Texas laws exempt government actors (and their contractors), offering no guidance as to how they use biometric information. President Obama proposed legislation this spring, which has gone nowhere.

Even if all the legislatures in the world adopted a law similar to that in Illinois, and forced opt-in policies for commercial use of facial recognition, our privacy wouldn’t necessarily be that much better off.

“Science is a glove, so much depends on what kind of hand is put into it,” Ray Bradbury said, quoted in J. Edgar Hoover’s warren of files.

Ultimately with facial recognition, widespread ambivalence may outpace the lawmakers, as people learn to enjoy putting the glove on themselves.

Last week in San Francisco, Intel showed off its new RealSense Camera F200, which in the months ahead will begin appearing in consumer devices. In a press release, Intel announced to prospective developers:

Featuring full 1080p color and a best-in-class depth sensor, the camera gives PCs and tablets 3D vision for new, immersive experiences. Interact more intuitively with facial analysis, hand and finger tracking, speech recognition, background subtraction, and augmented reality, for agile device control...The visual data and motion-tracking software, [sic] create a touch-free interface that responds to hand, arm, and head motions, and facial expressions.

Sweet!

Maybe facial recognition is just what the public wants in its Christmas stockings. Maybe it’s the API and not the FBI that’s trying to pin something on us, if only our shopper profiles, woven in the corporate neural floss, and passed from hand to hand, ad to ad, as we navigate the mall. And J. Edgar Hoover rolls in his grave beneath the new era of toys he could only dream of.

“There's no reason to burn books if you don't read them,” an 80-year-old Ray Bradbury told the Peoria Journal Star in 2000.