San Francisco - The Electronic Frontier Foundation (EFF) today launched TOSsed Out, a project to highlight the vast spectrum of people silenced by social media platforms that inconsistently and erroneously apply terms of service (TOS) policies.
TOSsed Out will track and publicize the ways in which TOS and other speech moderation rules are unevenly enforced, with little to no transparency, against a range of people for whom the Internet is an irreplaceable forum to express ideas, connect with others, and find support.
This includes people on the margins who question authority, criticize the powerful, educate, and call attention to discrimination. The project is a continuation of work EFF began five years ago when it launched Onlinecensorship.org to collect speech takedown reports from users.
"Last week the White House launched a tool to report takedowns, following the president's repeated allegations that conservatives are being censored on social media," said Jillian York, EFF Director for International Freedom of Expression. "But in reality, commercial content moderation practices negatively affect all kinds of people with all kinds of political views. Black women get flagged for posting hate speech when they share experiences of racism. Sex educators' content is removed because it was deemed too risqué. TOSsed Out will show that trying to censor social media at scale ends up removing far too much legal, protected speech that should be allowed on platforms."
EFF conceived TOSsed Out in late 2018 after seeing more takedowns resulting from increased public and government pressure to deal with objectionable content, as well as the rise in automated tools. While calls for censorship abound, TOSsed Out aims to demonstrate how difficult it is for platforms to get it right. Platform rules, whether enforced by automation or by human moderators, unfairly ban many people who don't deserve it and disproportionately impact those with insufficient resources to easily move to other mediums to speak out, express their ideas, and build a community.
EFF is launching TOSsed Out with several examples of TOS enforcement gone wrong, and invites visitors to the site to submit more. In one example, a reverend couldn't initially promote a Black Lives Matter-themed concert on Facebook, eventually discovering that using the words "Black Lives Matter" required additional review. Other examples include queer sex education videos being removed and automated filters on Tumblr flagging a law professor's black and white drawings of design patents as adult content. Political speech is also impacted; one case highlights the removal of a parody account lampooning presidential candidate Beto O'Rourke.
"The current debates and complaints too often center on people with huge followings getting kicked off of social media because of their political ideologies. This threatens to miss the bigger problem. TOS enforcement by corporate gatekeepers far more often hits people without the resources and networks to fight back to regain their voice online," said EFF Policy Analyst Katharine Trendacosta. "Platforms over-filter in response to pressure to weed out objectionable content, and a broad range of people at the margins are paying the price. With TOSsed Out, we seek to put pressure on those platforms to take a closer look at who is actually being hurt by their speech moderation rules, instead of just responding to the headline of the day."