Pando

Facebook thinks you're too dumb to realize its scientific papers are really just PR

By David Holmes, written on May 8, 2015

From The News Desk

Yesterday, Technology Review was one of many outlets that published a headline, or some variation thereof, reading, "Facebook Says You Filter News More Than Its Algorithm Does." This revelation was lifted from a paper Facebook's researchers published the same day in the more-or-less reputable scientific journal Science. The study sought to determine whether the social network's News Feed algorithm -- in aiming to serve up stories it predicts users will want to read -- limits people's exposure to political viewpoints they disagree with, creating an ideological bubble and contributing to political polarization. It's an important question, particularly ahead of the 2016 presidential election and in light of Facebook's growing influence over how news reaches audiences. According to recent studies, almost half of all web-using adults -- and 88 percent of Millennials -- use Facebook to find news.

But this research paper is less a piece of objective scientific inquiry and more the work of corporate-commissioned data tricksters -- a rancid pile of pro-Facebook propaganda that derives and frames its conclusions with the sole purpose of making Facebook look good.

This isn't science. It's PR.

And because press releases -- even ones with funky algebra and annotations -- never reflect poorly on a company, it's not hard to predict what this Facebook-commissioned study, carried out by Facebook researchers, concluded when investigating whether Facebook's algorithms contribute to political polarization.

Indeed, the paper found that the News Feed algorithm had but a minuscule effect on limiting users' exposure to viewpoints different from their own. The real perpetrators of political echo chambers on Facebook, the researchers stated, were users themselves, who usually fail to click on stories they disagree with and whose friends are predominantly like-minded politically. So if you're a liberal, and when you log into Facebook all you see are stories that preach to your bleeding-heart choir, it's not Facebook's fault -- it's your own for being too closed-minded and for not befriending enough conservatives.

If true, that conclusion holds great significance. More and more, algorithms are responsible for the media we consume -- whether it's the stories served up by Facebook or the films Netflix recommends -- and for the products we buy, as companies like Amazon and Google strive to know what users want before they know it themselves. From Mark Zuckerberg's high-minded praise of journalistic institutions to the company's grand ambitions to directly host content supplied by newspapers and magazines, Facebook endeavors to become the dominant platform and filter for news. And it's therefore crucial to keep Facebook accountable for providing factual and balanced streams of information, just as if it were the New York Times or NPR.

If we believe the study's conclusions, along with many of the aggregated news stories that quickly summarized it, it would appear that Facebook is a responsible steward of the world's news, at least in one respect. But the fact that Facebook is the one absolving itself of damaging public discourse demands greater scrutiny. And on closer examination, much of the data and many of the assumptions that informed the researchers' work fail to qualify as sound scientific inquiry. Moreover, the study is part of a larger trend of Facebook blaming users whenever it's the target of criticism.

First, let's look at the hard data Facebook unearthed -- which, admittedly, is in pretty short supply. Researchers found that after bringing its News Feed algorithm to bear on news stories posted by friends and pages, conservatives are 5 percent less likely to see stories they disagree with, while liberals are 8 percent less likely. That's not a huge differential, and is perhaps smaller than we might expect. Nevertheless, the data undoubtedly shows that Facebook's algorithms do limit users' exposure to opposing viewpoints, or as Facebook calls them in the paper, "cross-cutting" stories.

But the researchers all but dismiss this inescapable fact merely because users contribute to these echo chambers as well. The friends of users who list their political affiliation as "liberal," for example, only share conservative-leaning stories 24 percent of the time. The friends of users who list their political affiliation as "conservative," meanwhile, only share liberal-leaning stories 35 percent of the time. Making matters worse, liberals only click on cross-cutting stories 7 percent of the time, while conservatives do so 17 percent of the time. So even without the influence of Facebook's algorithmic twitches, the behavior of liberal and conservative users is responsible for a good deal of polarization.

Even if we take the study at face value, the fact that user choices help limit exposure to diverse political views doesn't exonerate Facebook. Its algorithms do the same thing, even if the effect isn't as strong. But this disparity between the impact of users and the impact of algorithms on polarization is extremely suspect. That's because the study fails at one of the most basic requirements of strong science: establishing a test group that is representative of the whole. Communications professor Christian Sandvig, writing at Microsoft Research's Social Media Collective blog, notes that researchers only evaluated feeds belonging to users who share a discernible political affiliation on their Facebook profile -- which the professor estimates makes up only 4 percent of users. And because that's such a specific behavioral trait, the test group is hardly what one would call "representative" of Facebook's larger user base. With that in mind, this data -- whether it's interpreted in Facebook's favor or against it -- doesn't tell us much of anything at all.

The researchers also betray their motives by adopting a tone that is self-serving and even defensive, suggesting that whatever limiting effect Facebook's algorithm has on cross-cutting content shouldn't matter, because social media users are ultimately "exposed to more cross-cutting discourse in social media than they would be under the digital reality envisioned by some" -- before linking to a book that includes stern warnings on how hyper-personalization may threaten democracy, a book written in 2001, before Facebook even existed.

"Perhaps this could be a new Facebook motto used in advertising," Sandvig writes. "'Facebook: Better than one speculative dystopian future!'"

So a corporation smuggled some PR into a scientific paper and some journalists fell for it. Somewhere, an angel got his wings and a publicist got a promotion.

But there's nothing trivial about Facebook's efforts to manipulate the narrative around its value to news consumers. The company has made clear its ambitions to become the predominant source for news content and a gatekeeper of information that rivals Google in its power over what we know. And even if its algorithms play only a negligible role in contributing to echo chambers and polarization, there's an argument to be made that Facebook, which has never been shy in the past about policing what content users see, has a responsibility as a major news distributor to use its algorithm to counteract the effects of users' self-made bubbles, placing more weight on cross-cutting stories in News Feeds belonging to users with stated political affiliations. Supporters of completely free and open content networks may bristle at that suggestion, but Facebook already exerts a great deal of control over what users see, as it constantly tweaks and tinkers with its News Feed algorithm. And if it must continue to mold and alter the shape of our News Feeds, then perhaps it could do so in ways that make its users better news consumers, deemphasizing or even exorcizing false or plagiarized stories while promoting a measure of ideological balance.

But anyone with a knowledge of Facebook's brief history in public relations knows exactly how the company would respond to this suggestion. It would raise the same defense it always does when critics foolishly attempt to hold Facebook to standards of journalistic ethics: by claiming it's "user-first."

As recently as last month, Facebook played the "user-first" card in response to criticism over changes it made to its News Feed algorithm that deemphasized content shared by pages belonging to brands -- including news organizations. It doesn't take a wizard of business to know why Facebook would make this change. As engagement and referrals surrounding this content inevitably fall, news organizations that have become reliant on Facebook for traffic will need that fix and do whatever's necessary to get it. Chiefly, that means running paid promotions on Facebook to ensure users see these posts -- promotions that make Facebook even richer.

But there's a darker side: Changes like this create a "pay-to-play" paradigm wherein only the most well-funded news organizations are afforded the enormous reach Facebook can offer. This state of affairs tends to crowd out smaller outlets -- many of which are smaller because they value journalistic bravery over brand-friendliness -- which is as harmful to public discourse as the political polarization Facebook examined in yesterday's study. And just as it did in its study on polarization, Facebook blamed users for the controversial News Feed change, declaring that its data team had crunched the numbers and found that users liked it better when they saw fewer posts from news sites. Of course, by crunching the numbers, Facebook merely meant that it ran a survey, and the questions it asked of users were magnificently leading. Questions like, “Are you worried about missing important updates from the friends you care about?” seemed carefully designed to goad users into saying they preferred an outcome that just so happened to be perfectly aligned with Facebook's business interests.

So of course, everyone's going to answer yes to that. But does that mean users don't care about other types of posts? It doesn't matter. This is what Facebook does, and this is what it just did with its latest scientific paper. Whenever it receives criticism or wishes to carry out something potentially controversial, it saves face by taking messy or incomplete or subjective data and twisting it so that users shoulder the blame.

And therein lies the insidious brilliance of the News Feed algorithm. Contrary to what its polarization study suggests, Facebook's algorithms and the behavior of its users are not two separate and discrete forces. The work done by the algorithm is heavily informed by user behavior, but not entirely. And to what degree and under what circumstances user behavior holds sway is not always clear. This makes any comparison between the two highly muddled and confused -- which is just how Facebook likes it. That way, the company can attribute virtually any negative consequence of any algorithm-driven efforts to maximize profit or influence to the behavioral whims of its users. The News Feed algorithm is at once a black box and a magic wand.

Facebook wants to be taken seriously by journalistic organizations as part of its play to host news content directly. But whenever critics raise concerns about changes made to its algorithms which, incidentally or purposefully, limit users' exposure to certain types of news content, Facebook casts off its responsibility to any higher ideals of journalistic integrity. Thanks to the knot of human and algorithmic influences that impact the News Feed, which by design are impossible for outsiders to untangle, it's able to argue -- with science! -- that it's beholden only to users and the "user experience." If politically polarized users want politically polarized content, Facebook won't disappoint them.

But this user-first defense is disingenuous, particularly in light of Facebook's broader monetization strategies and its ambitions to control how the news media reaches audiences. It's old hat to say that if you're not paying for a product, the product is you. But when it comes to Facebook, few cliches ring truer. The company's core constituencies are advertisers and other brands that pay for exposure on its platform, and many of Facebook's product changes -- like its most recent News Feed tweak, which encourages organizations to pay or partner with the company to reach users -- are aligned with its endeavors to boost its revenue and influence and not, as Facebook innocently claims, to create a "better user experience."

In what's becoming an enormously troubling trend among corporations, Facebook will only continue to use slippery data as a weapon in its war with the public and its competitors over the company's public image. The Internet has made it possible to fact-check anything, and laypeople have become more adept than ever at identifying spin and other deceitful rhetorical techniques native to public relations. That's why it's so brilliant and insidious to see Facebook farm out what's traditionally the work of PR specialists to data scientists. On top of the intellectual cachet society affords them as Silicon Valley geek-idols, these whiz kids breathe the rarefied air of academics, appearing in Science, which, despite a reputation that's waned a bit -- probably owing to its willingness to give lousy corporate data research like this a pass -- is still exponentially more credible than a rewritten press release at TechCrunch. Most readers won't think twice, nor will the many tech journalists who blindly rephrase studies like these without really examining the quality of the data or the subjective assumptions put forth by the authors.

But don't be fooled by the fancy diagrams and annotations. This isn't science. It's pro-Facebook propaganda.

[illustration by Brad Jonas]