Tag: harm reduction efforts

  • Harm Reduction Nonprofit Sues Facebook Over Censorship

    “We are fighting for the rights of all users of the Internet to appeal from social media giants’ decisions,” the nonprofit’s rep told The Fix. 

    A Polish nonprofit organization is suing Facebook for allegedly censoring its harm reduction content by deleting the organization’s groups and pages on the social media platform, which were dedicated to helping people who use drugs or are struggling with addiction.

    The Civil Society Drug Policy Initiative (Społeczna Inicjatywa Narkopolityki, or SIN) filed the lawsuit in May and received a favorable ruling by the District Court in Warsaw in June, though Facebook can still appeal. 

    The case is ongoing, but the court made an interim ruling prohibiting the social media company from removing any more fan pages, profiles or groups run by SIN on Facebook or Instagram.

    The ruling also requires Facebook to store backups of the pages, profiles and groups it already deleted so that they can be restored should SIN win the overall suit. Facebook can appeal the ruling, but SIN is encouraged by this result.

    The Bigger Issue

    A recent report by Vice outlined the larger problem of Facebook pages, groups, posts, and ads being deleted and accounts being banned for promoting harm reduction principles and products.

    In one case, the social media manager for a nonprofit organization called BunkPolice was banned from placing any ads on the platform after submitting and getting approval for ads promoting fentanyl testing kits.

    The kits are used to test batches of illicit drugs for fentanyl, the extremely potent opioid responsible for a large percentage of recent overdose deaths in the U.S. However, the ads got caught up in Facebook’s efforts to stop drug trafficking on its platform.

    Fighting Censorship

    In response to this problem, SIN has launched a “#blocked” campaign to speak out against what it considers a worrying spread of censorship and content control by large social media companies.

    “Online platforms such as Facebook, YouTube and Twitter increasingly control what you can see and say online. Algorithms follow users’ activity, while filters and moderators address alleged breaches of terms of service,” the campaign website reads. “Unfortunately, there has also been a number of instances when legal and valuable content was removed, including historical photos, war photography, publications documenting police brutality and other human rights’ violations, coverage of social protests, works of art and satire.”

    The NGO also published a corresponding video on YouTube the day after filing its lawsuit against Facebook. The video warns about social media giants having too much control over the content that everyday people see, and cautions that “you too could end up on their blacklist.” For SIN, this goes beyond the goal of harm reduction to freedom of speech rights for all internet users.

    “We are fighting for the rights of all users of the Internet to appeal from social media giants’ decisions,” said SIN representative Jerzy Afanasjew in an email to The Fix.

    View the original article at thefix.com

  • Facebook Users Promoting Harm Reduction Face Bans And Deleted Pages

    Facebook is reportedly “still investigating” cases of entire pages and groups for harm reduction being deleted.

    A new Facebook campaign to combat the opioid crisis appears to have unintentionally targeted harm reduction efforts on its own platform: ads for fentanyl-testing kits are resulting in bans, and pages created by harm reduction organizations are being deleted.

    A report by Vice found and interviewed multiple individuals who have been targeted by the platform in ways that are hampering their efforts to prevent overdose deaths.

    Facebook recently teamed up with the Partnership for Drug-Free Kids for the “Stop Opioid Silence” campaign, but its efforts to fight drug trafficking on the massive social platform look to have created more opioid-related silence.

    This is causing serious problems for organizations such as Southside Harm Reduction Services, which post warnings on their Facebook pages about local batches of illicit drugs found to contain fentanyl, the extremely potent opioid responsible for many overdose deaths in recent years. These posts are being rejected or given “reduced distribution,” meaning that those that do go up are not being seen by the community.

    In one particularly severe case, the social media manager of BunkPolice, Colin Marcom, was permanently banned from placing any ads on Facebook after he used the platform to advertise BunkPolice’s fentanyl testing kits.

    These simple kits can easily test for fentanyl, which is a tasteless and odorless synthetic opioid easily mixed in heroin, cocaine, ecstasy, and other common illicit drugs.

    “Facebook banned my personal account from ever being able to place ads on Facebook again, b/c of an ad, with this picture, that they approved for $20 & it ran for 7 days,” wrote BunkPolice in a Twitter post. “7 days, no warning – right to suspension – I submitted a sensible appeal, they said I was promoting drug use.”

    While harm reduction efforts like these have repeatedly been found to save lives without increasing drug use, as some people feared, Facebook appears to be treating them like a drug-trafficking scheme. To make matters worse, recent attempts to appeal bans and deleted posts and pages have been rejected.

    After Vice contacted Facebook for comment, multiple posts from harm reduction pages that had previously been flagged and deleted were restored, suggesting that the problem may be automated. It’s also possible that the vague language in Facebook’s “regulated goods policy,” which allows posts about drug use “in a recovery context,” was misinterpreted by the employees who reviewed the appeals.

    An extended report published by The Verge earlier this year found that Facebook moderators are chronically overworked, confused by ever-changing policies, and in some cases have been diagnosed with PTSD from viewing so much extremely disturbing content.

    According to the report, these moderators spend less than 30 seconds on an average flagged post before deciding whether to allow or delete it.

    View the original article at thefix.com