Despite years of strict moderation from the main social networks, the “incel” community remains as influential as it was in 2014, when a British-born 22-year-old killed six people on the streets of Isla Vista, California, motivated by his hatred of women.
The murders were an eerie parallel to the shootings in Plymouth last week. Both killers were radicalised on social media, where they posted extensively about their hatred of women and their despair over their lack of sexual activity.
But in the years since 2014, all the main social networks have acted against the movement. Reddit, which was once home to some of the largest incel communities on the internet, has spent much of the past two years enforcing policies that had previously been only loosely applied.
Subreddits such as r/incels and r/theblackpill have been banned for violating “sitewide rules regarding violent content”. The latter was a gathering point for individuals who described themselves as “blackpilled”, a philosophy loosely linked to the incel community whose adherents believe they have been awakened to the true miseries of modern life.
In other communities that could easily cross the line into violent extremism, volunteer moderators work hard to keep the conversation from veering into dark places. The Forever Alone subreddit, for instance, is “a place where people who have been alone most of their lives could come and talk about their issues”. Its 10 volunteer moderators do not work for Reddit, but enforce a set of rules, which include “be polite, friendly and welcoming”, and a strict ban on “any incel references, slang or inference”.
The Reddit account of the Plymouth shooter was suspended on Wednesday, just hours before the attack, again for breaking the site’s content policy. A Reddit spokesperson said: “We take these matters very seriously. Our investigation is ongoing.”
Other platforms were slower to act. YouTube, where the shooter had an account and regularly posted vlog-style videos, also took down his account – on Saturday, citing the platform’s “offline behaviour” policy. That policy is relatively new: as recently as 2019, YouTube was criticised for not taking down content from users such as Tommy Robinson, who were careful to post only videos that stayed within the platform’s rules, even as their wider behaviour went far beyond what the service would allow.
“Our hearts go out to those affected by this terrible incident,” a YouTube spokesperson said. “We have strict policies to ensure our platform is not used to incite violence. In addition, we also have longstanding policies that prohibit those responsible for attacks like these from having a YouTube channel and have since terminated their channel from our platform.”
On Facebook, the incel movement is not banned outright. Only a small handful of designated “hateful ideologies” are restricted in that way, including white supremacy and Nazism. Many more movements are banned as designated “hateful organisations”, but that restriction does not apply to the leaderless incel movement. Instead, the site’s broader limitations on hate speech largely apply: content promoting hate on the basis of someone’s sex or gender is banned, as is any content promoting violence.
Despite action from large social networks, the incel community remains influential online. Sites with loose or nonexistent moderation policies, such as 4chan and 8kun, have sizeable cohorts, and smaller, dedicated forums are able to set their own moderation policies.