Facebook ads aren't the problem
Facebook just announced a new set of changes, including a pledge to ban new political ads in the seven days before the election. The problem is that Facebook ads aren't the problem, and freezing them is a distraction.
People are beginning to realize how much influence Facebook has over elections, and Zuckerberg is being accused of “setting the rules” for one of the most significant elections in history.
So, in an attempt to slow the spread of fake news and "protect" the election, Zuckerberg has revealed a set of changes to be implemented, including: a "quiet period" in which the platform will ban new political ads in the seven days before the election; strengthened measures against posts that try to dissuade people from voting; and a system for redirecting users to accurate information on results, to quash false claims of victory. The platform will also introduce a forwarding limit on Messenger -- a move that reportedly proved successful when it was introduced on WhatsApp back in April.
Unlike other countries, the United States has no election silence law, which means that some TV viewers will be bombarded with ads right up until the end. The difference is, television ads are regulated by the Federal Communications Commission (FCC). Online, anything goes, and Facebook has been accused again and again of allowing politicians to run ads with lies. And this is a problem when the world’s largest social media platform is the subject of an antitrust investigation that will be significantly affected by the election outcome.
Facebook’s highly personalized, low-cost ad model means that its ads seem to have a disproportionate impact per view compared to other platforms. Facebook’s own research has shown that its ads stick in people’s heads significantly faster than other forms of media. And while Facebook has attempted to become more transparent over the years, the platform was not built to help users understand who is targeting them, or why.
And yet, despite all of this, Facebook adverts aren’t the root of Facebook’s misinformation problem.
The problem with Facebook is the fake organic posts and underground conspiracy theory groups amplified by an algorithm that spews them out into people’s news feeds. Facebook has announced again and again that it is "cracking down" on this -- but whenever one of these groups is shut down, several more seem to pop up in its place.
With millions of people trapped in their houses with little social interaction during the pandemic, these groups, pages, and events have become havens for conspiracy theories, unproven cures, and rumors that Hillary Clinton is using her email to arrange child sex rings in pizza shops.
Banning ads won’t stop this. Much of this content spreads without a single penny spent. And yet, Facebook has stopped short of banning QAnon members, anti-vaxxers, and fake news outright.
This strategy is a delicate balancing act between looking good in public and not losing too much money. And at the end of it all, nothing is achieved.
If Facebook really wants to limit misinformation, it should start with organic content.