Facebook Users Promoting Harm Reduction Face Bans And Deleted Pages

A new Facebook campaign to combat the opioid crisis appears to have unintentionally targeted harm reduction efforts on the platform itself: ads for fentanyl-testing kits are triggering bans, and pages created by harm reduction organizations are being deleted.

A report by Vice found and interviewed multiple people who have been targeted by the platform in ways that hamper their efforts to prevent overdose deaths.

Facebook recently teamed up with the Partnership for Drug-Free Kids for the “Stop Opioid Silence” campaign, but its effort to fight drug trafficking on the massive social platform appears to have created more opioid-related silence.

This is causing serious problems for organizations such as Southside Harm Reduction Services, which posts warnings on its Facebook page about local batches of illicit drugs found to contain fentanyl, the extremely potent opioid responsible for many of the overdose cases and deaths of recent years. These posts are being rejected or given “reduced distribution,” meaning that those that do go up are rarely seen by the community.

In one particularly severe case, Colin Marcom, the social media manager of BunkPolice, was permanently banned from placing any ads on Facebook after he used the platform to advertise the company’s fentanyl testing kits.

These simple kits test for fentanyl, a tasteless and odorless synthetic opioid that is easily mixed into heroin, cocaine, ecstasy, and other common illicit drugs.

“Facebook banned my personal account from ever being able to place ads on Facebook again, b/c of an ad, with this picture, that they approved for $20 & it ran for 7 days,” wrote BunkPolice in a Twitter post. “7 days, no warning – right to suspension – I submitted a sensible appeal, they said I was promoting drug use.”

While harm reduction efforts like these have repeatedly been found to save lives without increasing drug use, as some had feared, Facebook seems to be treating them like a drug-trafficking scheme. To make matters worse, recent attempts to appeal the bans and deleted posts and pages have been rejected.

After Vice contacted Facebook for comment, several previously flagged and deleted posts from harm reduction pages were restored, suggesting that the problem may be automated. It’s also possible that the vague language in Facebook’s “regulated goods” policy, which allows posts about drug use “in a recovery context,” was misinterpreted by the employees who reviewed the appeals.

An extended report published by The Verge earlier this year found that Facebook moderators are chronically overworked, confused by ever-changing policies, and in some cases have been diagnosed with PTSD from viewing so much extremely disturbing content.

According to the report, moderators spend less than 30 seconds on the average flagged post before deciding whether to allow or delete it.

Facebook is reportedly “still investigating” the cases in which entire harm reduction pages and groups were deleted.

View the original article at thefix.com

By The Fix

The Fix provides an extensive forum for debating relevant issues, allowing a large community the opportunity to express its experiences and opinions on all matters pertinent to addiction and recovery without bias or control from The Fix. Our stated editorial mission - and sole bias - is to destigmatize all forms of addiction and mental health matters, support recovery, and assist toward humane policies and resources.