Facebook Treats Punk Rockers Like Crazy Conspiracy Theorists, Kicks Them Offline

Facebook announced last year that it would be banning followers of QAnon, the conspiracy theory alleging that a cabal of satanic pedophiles is plotting against former U.S. President Donald Trump.

It seemed like a case of good riddance to bad rubbish.

Members of an Oakland-based punk rock band called Adrenochrome were taken completely by surprise when Facebook disabled their band page, along with all three of their personal accounts, as well as a page for a booking business run by the band's singer, Gina Marie, and drummer, Brianne.

Marie had no reason to think that Facebook's content moderation battle with QAnon would affect her. The strange word (which refers to oxidized adrenaline) was popularized by Hunter S. Thompson in two books from the 1970s. Marie and her bandmates, who didn't even know about QAnon when they named their band years ago, picked the name as a shout-out to a song by a British band from the '80s, the Sisters of Mercy. They were as surprised as anyone when, in the past few years, QAnon followers copied Hunter Thompson's (fictional) idea that adrenochrome is an intoxicating substance and gave this obscure chemical a central place in their ideology.

The four Adrenochrome band members had nothing to do with the QAnon conspiracy theory and didn't discuss it online, other than receiving occasional (unsolicited and unwanted) Facebook messages from QAnon followers confused about their band name.

But on Jan. 29, without warning, Facebook shut down not just the Adrenochrome band page, but the personal pages of the three band members who had Facebook accounts, including Marie's, as well as the page for the booking business.

"I had 2,300 friends on Facebook, a lot of people I'd met on tour," Marie said. "Some of these people I don't know how to reach anymore. I had wedding photos, and baby photos, that I didn't have copies of anywhere else."

False Positives

The QAnon conspiracy theory became bafflingly widespread. Any website host, whether it's the comments on the tiniest blog or a big social media site, is within its rights to moderate QAnon-related content and the users who spread it. Can Facebook really be blamed for catching a few innocent users in the net that it aimed at QAnon?

Yes, actually, it can. We know that content moderation, at scale, is impossible to do perfectly. That's why we advocate that companies follow the Santa Clara Principles: a short list of best practices that includes numbers (publish them), notice (provide it to users in a meaningful way), and appeal (a fair path to human review).

Facebook didn't give Marie and her bandmates any reason why their pages went down, leaving them to assume it was related to their band's name. It also didn't provide any mechanism at all for appeal. All Marie got was a notice (screenshot below) telling her that her account was disabled, and that it would be deleted permanently within 30 days. The notice said that if she thought her account was disabled by mistake, she could submit more information via the Help Center. But Marie wasn't able even to log in to the Help Center to provide this information.

Ultimately, Marie reached out to EFF, and Facebook restored her account on February 16, after we appealed to the company directly. But then, within hours, Facebook disabled it again. On February 28, after we again asked Facebook to restore her account, it was restored.

We asked Facebook why the account went down, and it said only that these users were "impacted by a false positive for harmful conspiracy theories." That was the first time Marie had been given any reason for losing access to her friends and photos.

That should have been the end of it, but on March 5 Marie's account was disabled for a third time. She was sent the exact same message, with no option to appeal. Once more we intervened and got her account back, this time, we hope, for good.

This isn't a happy ending. First, users shouldn't have to reach out to an advocacy group to get help challenging a basic account disabling. One hand wasn't talking to the other, and Facebook couldn't seem to stop this wrongful account termination.

Second, Facebook still hasn't provided any meaningful ability to appeal, or even any real notice, something it explicitly promised to provide in our 2019 Who Has Your Back? report.

Facebook is the largest social network online. It has the resources to set the standard for content moderation, but it isn't even doing the basics. Following the Santa Clara Principles (numbers, notice, and appeal) would be a good start.