
Much has been made of Facebook’s user-manipulation experiments, which involved showing some users more positive stories and others more negative ones to see whether it affected their moods. The experiment “worked”: when Facebook’s News Feed algorithm served up more negative stories, users posted more negative content themselves, and vice versa.

Many were rightly upset that Facebook would treat its users as lab rats, particularly without their knowledge. Others were less upset and more disturbed by the emotional power wielded by a single algorithm. So it’s with a bit of social network schadenfreude that Wired’s Mat Honan turns the tables by running his own manipulation experiment on Facebook.

For 48 hours, Honan “liked” virtually everything he saw on Facebook to see how its News Feed algorithm would react. Unlike Twitter, Facebook doesn’t provide a raw, chronological feed of everyone you follow. When its algorithm isn’t being manipulated for unethical social science experiments, it responds to user behaviors, particularly “likes.” Like a lot of stories shared by your old childhood friend, or by Walmart, and expect your News Feed to be dominated by that friend’s, or Walmart’s, updates. And Facebook, with its ad-revenue cash cow to feed, certainly has a far stronger incentive to boost Walmart’s content than your friend’s.
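To make that feedback loop concrete, here’s a minimal toy sketch of a like-driven ranker. It’s purely illustrative: every name and number in it (Post, ToyFeed, affinity, ad_boost) is invented for this sketch, not anything Facebook has disclosed. The point is the dynamic: liking everything inflates every source’s affinity equally, so the paid boost for brands is all that’s left to decide the order.

```python
# Toy sketch of a like-driven feed ranker. Hypothetical names and numbers;
# an illustration of the feedback loop, not Facebook's actual algorithm.
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class Post:
    source: str      # the friend or brand that shared the post
    is_brand: bool   # brands get an extra paid multiplier


class ToyFeed:
    def __init__(self, ad_boost: float = 1.5):
        # Per-source affinity score; every source starts equal.
        self.affinity = defaultdict(lambda: 1.0)
        self.ad_boost = ad_boost

    def like(self, post: Post) -> None:
        # Each like raises the source's affinity, so that source ranks
        # higher in every future feed -- the feedback loop.
        self.affinity[post.source] *= 1.2

    def score(self, post: Post) -> float:
        base = self.affinity[post.source]
        return base * self.ad_boost if post.is_brand else base

    def rank(self, candidates: list[Post]) -> list[Post]:
        return sorted(candidates, key=self.score, reverse=True)


# "Like" everything: both sources end up with identical affinity,
# so the brand's paid boost alone determines who tops the feed.
feed = ToyFeed()
posts = [Post("childhood friend", False), Post("Walmart", True)]
for p in posts:
    feed.like(p)
print([p.source for p in feed.rank(posts)])  # ['Walmart', 'childhood friend']
```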

And that’s exactly what happened when Honan “liked” everything. Brands took over almost entirely, and it took only about an hour:

“After checking in and liking a bunch of stuff over the course of an hour, there were no human beings in my feed anymore,” he writes. “It became about brands and messaging, rather than humans with messages.”

Of course, Honan and Facebook aren’t exactly on equal ethical footing when it comes to experimenting on one another. While Honan is willing to “like” a photo of a friend’s bruised baby, not to mention countless brands for which he has little love (“I liked Kohl’s for you”), there’s an ethical line he won’t cross: he refuses to “like” a friend’s status update about the death of a relative. No such line existed when Facebook experimented on its users.

Meanwhile, the cofounder of OkCupid, another company that got heat for user manipulation, laughed off the notion of hiring an ethicist, asking, “[What,] to wring his hands all day for a hundred thousand dollars a year?”

Not only did Honan ruin his own feed, he ruined many of his friends’ feeds too, flooding them with “Mat Honan likes this” and “Mat Honan likes that” updates. And what about the “news” Honan received? After all, Zuckerberg wants Facebook to be one big newspaper. Ha. Unsurprisingly, it was all Upworthified share-bait with little to no substance.

By “liking” virtually everything, and thereby taking his own tastes and preferences out of the equation, Honan has revealed the logical conclusion of Facebook’s algorithm, and of Facebook itself: a wasteland of branded content designed not to enlighten but to make you buy and click and share.

When Facebook took heat for manipulating users into feeling negative emotions, it proclaimed, “No! No! We did it so we know how to make users happier, not sadder.” Now I’m starting to believe it: the more users “like,” the more their News Feeds become canvases for advertisers. Maybe feeling negative isn’t so bad after all.

[illustration by Brad Jonas]