The community is just way too negative, in my opinion, to want to go back to it.
You know, I don’t want to get sucked into that downward spiral again.
An anonymous online community has recently
been thrust into the mainstream.
So-called ‘incels’ are men who claim to be involuntarily celibate, as they struggle to form sexual or romantic relationships with women.
Incels gather on anonymous online forums such as Reddit and 4chan.
Misogyny and dehumanizing language are rife within these communities, with some users threatening to commit acts of violence.
But these words have turned into actions.
In 2014, Elliot Rodger killed 6 people in Isla Vista, California.
He posted a manifesto and numerous videos
online beforehand.
I will punish all of you for it.
In April 2018, Alek Minassian killed 10 people in Toronto.
Beforehand, he posted a message on Facebook referencing incels and ‘Sergeant 4chan’.
For a pretty long time I just kind of saw it as, and to an extent I still think this is true, a lot of people using extremely dark humor and irony as a way to vent, for the shock value of it.
You know, I would say there’s definitely a subset of people in the groups who genuinely believe that stuff.
There’s a pretty big gap between posting about violence and actually being able to carry it out.
When we’re looking at posts online, of course, there’s always a measure of humor, irony, and sarcasm that is really hard to account for.
And unless you are paying a lot of attention over time to how conversations play out in certain online forums, it’s very difficult to know when something is a real threat.
Is it someone saying, ‘Oh man, I hate my job, I wish everybody there would die,’ versus someone really saying, ‘I work at X, Y and Z place and tomorrow I plan on doing X, Y and Z’?
These are the moments that police and researchers weigh: when does a threat start to seem more real than just someone complaining?
When we talk about online radicalization, it’s really about this process of identification with these issues that people go through, where they get validated and then come to commit some kind of atrocity.
Some incels come to the community with low self-esteem or mental health issues, and they are looking for similar people.
It was kind of a relief to finally be able to find a place where you can talk openly about this stuff without fear of being judged for it, you know.
It was a place to just be open and talk about whatever you were feeling.
You didn’t have to worry about the way people would react to it.
For me, especially when I first found it, I found comfort in that place.
Once in the community, they are exposed to misogyny and hate, and to a chorus of other anonymous men confirming how they feel and encouraging extremist misogynistic ideologies.
You find people that are like yourself.
You tend to gravitate towards people that have similar interests, that are suffering similar pains to yours, and it is a good space for you to talk it out.
And what becomes dangerous in these spaces is that there is no way to flag or moderate conversations once they have turned toxic.
And when people are very vulnerable and they’re being responded to by people who are trying to instrumentalize them, or radicalize them towards specific actions, that’s when things get very dangerous.
So it isn’t just that they’re going and being comforted by the conversation; there are also people acting as recruiters in these spaces.
It’s through this engagement process that
people become more and more accepting of violence.
The main subreddit for incels was banned in
November 2017 for inciting violence against
women.
Since then, other related incel subreddits have also been banned, but one large subreddit, Braincels, with over 40,000 subscribers, is still active.
Other sites, such as Incels.me, have sprung up using web domains where they can make their own rules.
When it comes to drawing a line on what people can and cannot post, that is really up to the terms of service of the platform companies.
However, these communities tend to really expand in online spaces where there is little to no moderation.
Unfortunately, the way the internet is designed, it’s very decentralized, and therefore there are lots of unregulated and unmoderated spaces, and that is where we find the most hate, misogyny, and extremism.
In a modern world where disaffected people have access to internet forums and anonymity online, the question remains: how can online radicalization and violent misogyny be prevented?
