
Facebook hired an "empathy team" to humanize its workers. I have a better solution.

By Nathaniel Mott, written on December 12, 2014

From The News Desk

Facebook is finally starting to realize that the 1.2 billion people who use its service to vent about their work, communicate with their families, or share pictures with their friends are human beings.

Business Insider reports that Facebook has stopped referring to the people who use its service as "users" in internal discussions. The company has also hired an "empathy team" to help workers "understand what it's actually like to be a user, or a business paying for advertising."

The news comes after it was revealed that Facebook manipulated some people's News Feeds to determine the effect those changes might have on their emotions -- without their informed consent, or even a notification telling them they were participating in a psychological experiment -- all in the name of "research."

As I wrote in June, when people outside Facebook started defending the company's experiment:

Facebook just doesn’t give a shit about the people who use its service and is more than willing to experiment with their mental health because of the culture of contempt that it has created within the confines of One Hacker Way. It knows that there are people on the other end of those algorithms — it just doesn’t care. It exists to make people spend more time online, not to worry about how other people feel.

Apparently it takes a dedicated "empathy team" for Facebook's employees to break free of that culture of contempt. Or does it? There might be a better solution: use the algorithms Facebook loves so much to help its employees understand that their actions affect more than a billion people.

Facebook is already working on a tool meant to stop users from sharing embarrassing pictures on the service. Its head of artificial intelligence explained the idea to Wired:

Let’s say you’re out drinking with your buddies, things get out of hand, you pull out your smartphone, you take a selfie in the middle of all this drunken revelry, then you take 30 or 40 more, and, without hesitation, you start uploading them to Facebook.

It’s a common thing to do. But Yann LeCun aims to stop such unbridled behavior—or at least warn people when they’re about to do something they might regret. He wants to build a kind of Facebook digital assistant that will, say, recognize when you’re uploading an embarrassingly candid photo of your late-night antics. In a virtual way, he explains, this assistant would tap you on the shoulder and say: 'Uh, this is being posted publicly. Are you sure you want your boss and your mother to see this?'

A similar assistant could tap Facebook employees on their shoulders and say: "Uh, this thing you're doing has ramifications for people outside this company. Are you sure you want to act like living, breathing human beings are little more than bundles of data manipulated for your own amusement?" Surely that would be more efficient than having an entire "empathy team."
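To make the idea concrete, here is a minimal sketch of how such a pre-upload nudge might work. Everything in it is hypothetical: the looks_regrettable classifier, the threshold, and the upload call are stand-ins for whatever Facebook would actually build, not real Facebook code.

    # Hypothetical sketch of LeCun's "digital assistant" idea: before a
    # photo goes out publicly, run it past a classifier and ask the
    # poster to confirm. None of this is real Facebook code.

    def looks_regrettable(photo_path: str) -> float:
        """Stand-in for a trained image model; returns a 0-to-1
        'you might regret this' score."""
        return 0.87  # placeholder score for illustration

    def post_photo(photo_path: str, audience: str = "public") -> None:
        """Tap the poster on the shoulder before a risky public upload."""
        if audience == "public" and looks_regrettable(photo_path) > 0.8:
            answer = input("Uh, this is being posted publicly. Are you sure "
                           "you want your boss and your mother to see this? [y/N] ")
            if answer.strip().lower() != "y":
                print("Upload cancelled.")
                return
        print(f"Posting {photo_path} to {audience}...")  # stand-in for the upload

    post_photo("late_night_selfie.jpg")

Pointed inward at employees instead of at drunken selfies, the same confirmation gate becomes the tap on the shoulder described above.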

[illustration by Brad Jonas]