Pando

What one far-right website's removal reveals about the future of webhost law

After ten years, WordPress.com decided it was time to kick out The Conservative Treehouse for violating its terms.

By Christopher Hutton, written on December 10, 2020

From The Politics Desk

During the week of November 15th, the right-leaning blog The Conservative Treehouse (CTH) received a notification from its web host, WordPress, that its website was being removed.

The note specifically stated that “given the incompatibility between your site’s content and our terms, you need to find a new hosting provider and must migrate the site by Wednesday, December 2nd.” CTH said it would comply with the order, while dismissing it as just another “big tech control mechanism to shut down speech & assembly.”

 

This removal would be unremarkable if it were not for The Conservative Treehouse's history of misinformation

CTH has a long history of misinformation. It was created in 2011 as a ‘last refuge’ for conservatives adhering to their principles. The Daily Beast previously described it as “Patient Zero for a number of hoaxes that have percolated through right-wing media ecosystem.” The site’s creator, ‘Sundance,’ is an anonymous figure whose identity has only been guessed at by fellow conservative pundits, though some have claimed it is Mark Bradman, an otherwise unremarkable Florida resident.

The website built its notoriety in the far-right community by spreading false claims about Trayvon Martin, a 17-year-old Black teenager who was shot in 2012 by George Zimmerman after Zimmerman reported Martin for supposedly suspicious activity, then pursued and shot him roughly 70 meters from his home. The shooting became a point of national controversy, prompting people to gather and speak out on issues of race and guns in the weeks and months that followed.

During the height of the Martin controversy, CTH promoted claims that Martin was an “undisciplined punk thug, drug dealing, thief and wannabe gangsta”, seeking to incriminate Martin and justify Zimmerman’s actions. Martin had no history of drug dealing or gang involvement.

Since then, CTH has played a part in promoting false claims about all sorts of Republican boogeymen. Earlier this year it accused Martin Gugino, a 75-year-old man who was shoved to the ground by Buffalo police officers, of being an ‘antifa provocateur.’

 

WordPress did not point out which rule had been breached

When asked about the decision to remove CTH, WordPress gestured toward its user guidelines, stating that the website had violated them. However, WordPress did not point out which specific rule had been breached; posting misleading or discriminatory information is not, by itself, prohibited. PandoDaily reached out to Automattic, the owner of WordPress.com and the WordPress blogging software, for comment but did not receive a response in time for publication.

WordPress’s sudden decision to remove CTH’s website is surprising, in part because CTH operated for nearly ten years without any resistance from Automattic. WordPress has previously been in the hot seat for hosting conspiracy-mongering sites. In 2018, the New York Times reported that WordPress was often a home for websites claiming that the Sandy Hook school shooting of December 2012 was a ‘false flag’ operation funded by the government to promote gun control. Several families filed statements describing the damage such content had done to their lives, but the company ignored them. WordPress told the New York Times in a public comment that “Posting conspiracy theories or untrue content is not banned from WordPress.com, and unfortunately, this is one of those situations.”

In the past, the only pages pulled down from WordPress.com were those removed under a court order.

Paul Sieminski, Automattic’s general counsel, argues:

“Internet hosts like Automattic…are in no position to judge disputes over the truth of content that we host. Setting aside the marginal number of cases in which it is obvious that content is not defamatory—say, because it expresses an opinion—hosts are not at all equipped to determine whether the content is (or is not) true.” 

But that has not stopped others from attempting to control such content. Sieminski says Automattic consistently deals with defamation complaints and uses its discretion to weigh the validity of each case before it reaches a court of law. Still, he concludes that “leaving such important decisions to the discretion of Internet hosts is misplaced and tilts the balance in favor of silencing often legitimate voices.”

 

Tackling misinformation is far harder than it initially seems 

Jeannette Eicks, a law professor at Vermont Law School and director of the Center for Legal Innovation, notes that under normal circumstances there would be few legal repercussions. Section 230 of the Communications Decency Act protects companies like Automattic from lawsuits over content their users post. However, Eicks believes the section’s strength as a wide-ranging policy could weaken in the near future.

In May, Donald Trump signed an executive order claiming that “Section 230 was not intended to allow a handful of companies to grow into titans controlling vital avenues for our national discourse under the guise of promoting open forums for debate.” While the president’s fixation on the section concerned social networks like Twitter and Facebook, it has inspired several attempts to curtail the section’s power.

Federal Communications Commission (FCC) general counsel Thomas M. Johnson Jr. argued that the FCC could interpret all amendments to the Communications Act of 1934, including Section 230. However, a representative of the Electronic Frontier Foundation (EFF), a San Francisco-based digital rights group, told PandoDaily that this is an inaccurate reading of the FCC’s actual powers.

The EFF’s legal team notes that “courts [interpret Section 230] when intermediaries are sued and invoke Section 230 in their own defense. But The FCC has no regulatory authority over social media platforms.” According to the EFF, the FCC is only a “telecommunications/spectrum regulator”, and “only the communications infrastructure industry are subject to the agency’s regulatory authority.” If the FCC attempted such an interpretation, a lawsuit would follow to test the actual legal merits of Johnson’s reading.

 

Will Section 230 change? Eicks believes so.

While Trump’s loss weakens conservatives’ ability to pursue a total repeal of the policy, Biden has voiced support for revoking Section 230 outright, if not reforming it. Biden staffers have also said that addressing Facebook’s handling of misinformation would be a priority in the administration’s first days in office. Any such policy would require a cooperative Congress, which hinges on the Georgia Senate runoffs wrapping up in January. And repeal would have far-reaching consequences.

In a memo sent to the Biden administration regarding internet policy, the EFF states that “altering the law to force the removal of so-called ‘disinformation,’ to demand the political neutrality of their decisions, or to broaden platform liability for already-unlawful content would have consequences that reach far beyond the intended targets.”

Recent events may also encourage web companies like Automattic to take further steps toward removing such content. While Facebook and Twitter have faced further scrutiny for their handling of misinformation in the days after the election, they also appear more emboldened to act. Congress’s mid-November hearing on social media revealed that Senate Democrats want more censorship, not less. And Automattic, a mere fraction of the size of the social media giants, would likely get little flak for removing a site like CTH.

Automattic’s decision reflects a growing willingness among web companies to censor or remove content they find blatantly harmful. While future changes to Section 230 are not off the table, the legal challenges that may follow will complicate matters. That is why legal organizations like the EFF recommend competition above all. The EFF’s representative suggested that content hosts that want to handle misinformation in a meaningful way adhere to frameworks like the Manila Principles on Intermediary Liability or the Santa Clara Principles. Both would help companies limit the fallout of moderation mistakes to those actually at fault, and allow users who dislike a particular content moderation style to move to other platforms, as CTH did.

It’s unclear whether this move will slow the spread of CTH’s misinformation, but it is a possible start.

 

For more from Pando, sign up to our weekly newsletter.