JULIA EBNER: Hi, everyone.
Yeah, I'm really glad to
be here at Google today.
I think this is
an excellent place
to present the findings
of the book, not least
because I think Google
also has an important role
to play in countering
some of the threats
that we have seen arising on
all sides of the extremist
spectrum, from jihadist
online radicalization
campaigns to white nationalist
and neo-Nazi campaigns.
I think this is something that I find really important. And the role that Google has been playing, but also hopefully will play in the future, is something I will probably highlight throughout the presentation, where I do see potential.
My goal initially for this
book was to really get insights
into the human dimensions.
Especially when talking about
online extremist movements
and online radicalization, these are phenomena that we often think about almost as if there were a black box mechanism, where someone all of a sudden becomes radicalized on an online platform.
But we don't really think about
what these new technologies,
what social media, do
to us as individuals,
to our identities
on a personal level,
but also on a group dimension.
So I really wanted
to get insights
into the inner workings of such
online movements that sometimes
then also have a presence
offline in the real world,
and also see how
their connections are
between the online world
and the offline world.
And so I started doing this
research around two years ago,
or actually now three years ago.
I went undercover with 12
different movements in total
for two years.
So I set up five
different identities.
I joined, for example,
ISIS hacking groups.
I joined jihadist bride groups.
But I also joined groups on
the neo-Nazi and far right
extremist side.
For example, neo-Nazi
groups in the US,
but I was also in
the organization teams
that were behind the
Charlottesville rally in 2017
in the US, the white
nationalist rally that
ended in the death of one
of the counter protesters.
And I was also in some
of the networks that
inspired the recent terrorist
attacks in Christchurch,
but also in the
US and in Germany
that we've seen
over the past year.
And some more, probably
more fringe communities
that I also joined were the
female misogynist community
of the trad wives, a
community I hadn't even
really heard about, to
be honest, before that.
I hadn't really
encountered them,
having worked for five
years in this field.
That was something new to me.
And it seemed almost like an oxymoron, encountering female misogynists.
I will also tell you a bit
about that community in a bit.
And one last community that I
found interesting to look at
is the conspiracy
theory networks,
especially when looking
at last week's attack
in Germany in Hanau.
The investigations are still ongoing, but from what we can find in the letter and in the documents that the attacker left behind, there is a strong overlap between the conspiracy theory elements in his documents and the far right, racist, white nationalist elements.
So a pattern that I found
across all of these groups
was, the whole process
of radicalization
was split up into
six different stages.
So it always started, of
course, with recruiting.
And then the next step, which
was much more important than I
thought it would be, was
the socialization process,
where actually some
of the indoctrination
and the ideological
manipulation comes
through a form of
group cohesion that
is created by almost
completely new subcultures
and counter cultures that
are taking shape, sometimes
purely online, sometimes
also taken offline.
But where these communities
create their own vocabulary,
their own insider
references and jokes,
and are able to really
appeal to quite a wide range
of different target
audiences by sometimes
gamifying the recruitment.
So this stage of
the socialization
was really important,
and I think
that's also one
of the key points
that the book
tries to highlight.
But then of course,
also once people
are socialized into
extremist groups,
they also start networking
with like-minded groups
across the globe.
We've seen an
internationalization
of the phenomenon,
with online platforms and social media, of course, playing a big role
in giving these fringe
movements, sometimes very much
locally rooted, actually
a global platform
and allowing them to
spread across the world.
Last week's attack, the
perpetrator spoke in English,
addressed an American audience.
Likewise, the Christchurch
attacker last year, he also
addressed an
international community
and had lots of these
insider references
in his livestreamed video of
the attack against mosques.
And then the next stage is
often public communication.
And that's something where I
was really shocked at times,
being for example undercover
with a neo-Nazi trolling
army, how sophisticated the
communication techniques are
that extremists are using.
But of course, we've seen
that on the side of jihadists,
when ISIS has been using really,
really sophisticated propaganda
campaigns, and has
sometimes been tricking
the algorithms to
get their propaganda
campaigns to a
really wide audience,
and where social
media has sometimes
lent these campaigns a megaphone
and made some of the really
small fringe movements appear
much bigger than they actually
are.
And the same is true for the
American alt right, where
they've staged really good--
or good in terms of the
communications strategy,
really successful online
campaigns in the run-up
to the US election.
And then in recent years,
we've seen the European alt right, or the European New Right, copying and imitating some of these techniques,
using memes and using
sometimes satirical content to
spread really vile ideologies.
And then often we also
see the next step,
which is mobilization, a
mobilization to offline
real world protest movements.
For example, we saw the
Charlottesville rally
in 2017, the white
nationalist rally,
but we also saw the riots,
for example, in Chemnitz,
in the German city of
Chemnitz, where migrants were
being chased in the streets.
And actually the
mobilization happened
in encrypted messaging
apps on Telegram
in some of these chat groups.
Likewise, the Charlottesville
rally organizers,
I joined their
organization teams
that all took place on
the gaming app Discord.
So it is a
cross-platform threat,
because we then also
of course see them
sharing links to YouTube, links
to Facebook and the bigger
channels, carrying out some
of the campaigns on Twitter.
But sometimes it
also takes place
on these more fringe either
anonymous forums like 4chan,
8chan, or in the
encrypted messaging apps.
And the last stage, which unfortunately also sometimes happens, of course not in all of the cases, not in all of the groups that I looked at, is attack.
And either in the form
of real world attacks,
like the wave of attacks
that we saw especially coming
from far right terrorists
in recent months,
or on the jihadist side.
And also hacking
campaigns and attacks
that purely take place
in the online world,
including phenomena
like doxxing,
the leaking of private
details and, yeah,
the hacking of infrastructure.
I'd like to give you a
few examples or insights
into these different stages by
mentioning some of the stories
and some of the people that I
encountered in these movements,
both from the online and
offline undercover experiences.
In the first stage, recruitment, I first
was recruited into the
white nationalist network
of Generation Identity.
My goal with this was
to get more insights
into how their recruitment
campaigns worked,
what kind of people
would join that movement.
And this one was quite
an interesting example,
because Generation Identity
is now a pan-European movement
that has been quite skillful
in recruiting very young,
sometimes very educated
members into their networks.
And I had already built
up my online identity
in some of their chat groups.
I was already
present, and could see
that they were planning to
open a channel, an offshoot
in the UK and Ireland.
So I wanted to get
insights into what
their next steps would be for
recruiting new British members.
And so I reached out to them.
I submitted an application,
was invited for an interview.
First in Vienna,
because I was supposed
to be an Austrian exchange
student in London,
so that I could build that
bridge between continental
Europe and the UK.
And then I was invited to join
their first secret strategy
meeting in an AirBnB
in Brixton, and it
was incredible to see how
standardized their branding is.
So they had influencers and
leading members of Generation
Identity offshoots
from France, Germany,
Austria all come together
to this London lunch,
to brief new members about
their communication tactics,
about how to run
slick online campaigns
and do offline stunts that
then can be livestreamed and go
viral on social media.
They're also still up and
running on YouTube, just as
a small remark.
I do think that
they're a movement that
operates in the gray zone, so
that's something difficult,
of course, to challenge
from a policy side,
from a legal
perspective, and also
from the tech company side.
It is also remarkable
that they know
how to rebrand
themselves to stay away
from the traditional neo-Nazi
and Holocaust denial content,
and from any explicit
violence incitement,
and to even brief
their members on how
to respond to tough
questions from journalists.
For example, if they're
asked "are you anti-Semitic?
Are you racist?", they have
answers ready to reply to them.
And this challenge of rebranding we also see, of course, in their online materials, and that makes it hard to tackle their online content.
Maybe another example
from another group
that I joined in the US,
actually a neo-Nazi group
that was operating on
the gaming app Discord.
They had introduced even
more rigid vetting procedures
and recruitment procedures,
especially after they found out
that a lot of investigative
journalists and security forces
had infiltrated their channels
after the Charlottesville rally
and after the threat from the
far right became more evident.
And so they, for
example, asked me
to submit a timestamp
picture of my wrist
to prove that I'm
white, or they also
asked members to submit the
results of a genetic test.
I mean, they're taking
it to the next level.
They are using different,
completely new forms
of online recruitment, actually.
They never met in person, but it
does show how connected, also,
some of these groups are
with the newest technologies,
even with genetic test
results and things like that.
And a lot of it is
also gamified now.
Both recruitment, but also
communication and propaganda
campaigns, and I think one
of the most shocking examples
of this was a neo-Nazi
trolling army.
We see quite a few of those, but this was a neo-Nazi trolling army that I joined in Germany, which had 10,000 members at a certain stage. And in the run-up to the last German elections, it was quite successful in completely manipulating and twisting the online discourse
by coordinating
campaigns in, again,
closed channels, closed
channels on Discord,
where they, for example,
would agree on a time.
For example, say today,
tonight at 7:00 PM,
everyone attacks this
political opponent
on Twitter or on Facebook.
And they were very effective: even if only a few thousand members were participating in those campaigns, they made an impression that actually catapulted
their hashtags, for example,
into the top trends in Germany
in the two weeks' run-up
to the German elections.
And by doing that,
they can, of course,
exercise a lot of
pressure on individuals
that they try to silence,
intimidate their critics
and journalists,
but they can also
twist some of the political
discussions into a direction
that they want to have
or they want to see.
And similarly, I
also experienced more
on a personal level
how much force or power
such campaigns can have.
I published an article,
that was now two years ago,
in "The Guardian," where I
mentioned by name the English Defence League founder, Tommy Robinson,
who you might have heard of.
And I mentioned him as an
example-- or his Twitter
account as an example
of how white supremacist
movements have increasingly
become mainstream.
And on the day after I
published this article,
he came to my then-office.
Back then, I was
working for Quilliam,
the counter
extremism think tank.
And he barged into our
office and livestreamed
the whole confrontation to his
back then 300,000 followers
on Twitter.
And this was, of course,
followed by a massive hate
and harassment campaign.
And in effect, it
led to my dismissal
from the organization,
because I was
asked to retract my
statements from the article,
to publicly say I regret
having written that article.
And I refused to do that, so
I was dismissed a few days
after that.
And this shows how much
power an individual,
even if it's an influencer,
a far right influencer,
can have over entire
organizations.
And this is just one example out of many; journalists that I also interviewed for the book have faced similar campaigns after they criticized extremists.
Extremists, of course,
from all sides,
but the far right
in particular has
been very good at leading these
communication and intimidation
campaigns.
And yeah, it is something where
we've seen quite a concerning
trend that also now increasingly
artists, political activists,
researchers, journalists,
and politicians
are being attacked by such
coordinated campaigns.
And we also saw in the UK that some politicians, especially female politicians, are facing quite a lot of threats, and recently that led some of the female MPs to not even take part in the election, when they were facing these threats in the run-up to it.
So it can have a big
political impact.
Ah, one thing I forgot to mention, but that is quite interesting from the whole gamification perspective: as you can see on the left, Reconquista Germanica, this German neo-Nazi trolling army, structured its own operations in almost military-like hierarchies, where you were rewarded for carrying out a particularly successful campaign, for example, against political opponents.
They even gave
military-like names
to some of their
forms of attack.
So they would call
an online attack
on a political opponent
a sniper mission.
Yeah, they would use
all of these names
to, actually, to
really incentivize
people to take part
in these campaigns
and see it almost as a game.
So here you had the hierarchy
showing the Supreme Commander,
who would then give
orders to the foot
soldiers of this trolling army.
And this is quite
similar to the ways
that other trolling armies
also in other geographies
are working.
Networking, the next stage.
For the next stage,
I looked at the whole
of the cross-platform
ecosystem that we've
seen emerge in recent years.
There's a whole range
of new platforms
that extremists have partly
created by themselves,
or ultra-libertarian
platforms that
have been used for
extremist purposes,
because they wouldn't
cooperate with the authorities.
They wouldn't remove even
the most extreme content
or violence-inciting content.
There's a whole range of platforms that would fall into that category.
For example, Gab or Minds have
become alternative social media
platforms that are now
turning into hotbeds
or have turned into
hotbeds for extremists.
Then there are
also platforms that
have been hijacked by
extremists, like the gaming app
that I mentioned, like
Discord, or also other apps.
In the case of jihadists, for example, the content-sharing platform JustPaste.it, which essentially was a one-man business, was hijacked for their propaganda. So something similar also occurred on the jihadist side.
And increasingly,
they've also been
building their own
in-house tools, where
they create encryption methods,
or their own apps, chat apps.
For example,
Generation Identity has
been trying to create its
own communication tool,
which they called
Patriot Peer, which
would be almost a mixture of a "Tinder for Nazis," as it's been called, and a simple social media app, or a chat app.
And then there are
also other platforms,
like alternative
crowdsourcing platforms.
After some of the extremists had their accounts closed by the crowdfunding platform Patreon, they created the platform Hatreon, which has now luckily been removed, but that shows how innovative they get with creating their own platforms.
There are also alternative
dating platforms.
And this is a whole
new universe that
is, I would say, posing a
completely new challenge.
Because we see that a lot
of the policy responses
right now are focusing on the
bigger mainstream platforms
that are known to everyone.
And I do think that there
is a really important role
to play for these,
because these are often
used as initial ground
to recruit members
into these smaller platforms.
That's also why
it's so important
that extremist material
gets quickly removed
from platforms like YouTube,
Twitter, and Facebook.
But these alternative
platforms sometimes
have been a bit under the
radar of the security forces
and of the policymakers and
lawmakers in recent years.
For example, the NetzDG legislation, the first anti-hate speech law in Germany, only concerns companies or platforms with over 2 million users, which a lot of these platforms wouldn't reach, because they are smaller fringe platforms that have nevertheless attracted a really high amount and a high concentration of extremist content.
The next chapter, the next
part is about mobilization.
So I went to a neo-Nazi rock
and mixed martial arts festival,
which is possibly the
biggest in Europe,
at the border of
Germany and Poland.
And it was one of the experiences where I, for the first time, realized that online platforms and social media now also play a role for these quite traditional neo-Nazi networks, which in my head were not really connected to any of these new technologies.
Because a lot of these people,
contrary to the newer far
right movements and
the jihadist movements,
they are still quite traditional
in the way that they look.
Not everyone, but a
lot of them are still
wearing the typical
skinhead outfits.
They're organized in a more locally rooted way than some
of these international loose
networks of extremists,
whether that's in the form
of ISIS and al-Qaeda cells,
or in the form of alt right
networks on the internet.
And surprisingly, a lot of the people who participated at that event and that I spoke to had actually become aware of this whole scene by watching the rock or mixed martial arts videos, for example, on YouTube.
And only then did I connect this to this very traditional movement, which has now managed to attract higher
numbers at these
European festivals,
with people coming
from Poland, coming
from other countries in Europe
to join this neo-Nazi festival
in Germany.
So also in this case,
I do think that we're
seeing new developments, where
online platforms and tech
firms can play a big
role in countering that,
and in tackling some of the content that is actually trying to hijack hobbies and that sometimes radicalizes people who might not even be politically interested or ideologically tainted.
So they're going
through either the music
angle, or the mixed martial
arts angle, or the gaming angle,
and they're
attracting new members
through these
different gateways.
So I think it's really
important for these gateways
to have a good policy
regarding these.
Of course, not
shutting down content
that isn't explicitly
extremist, or
harmful, or
violence-inciting, but it
is important to watch out for
these manipulative techniques
that extremists are using.
Another example is brands.
Also, something I came
across at the festival
was nipster brands, or
Nazi-hipster brands.
So they've now
also, for example,
created a lot of
online shops where
they sell t-shirts with
slightly twisted swastikas.
In Germany, as you
know, it's forbidden
to have any kind of
neo-Nazi symbols,
and so they're trying everything
to circumvent existing laws
to create an attractive or
appealing counterculture
movement that uses hip and
new elements of existing,
for example, fashion
or arts movements,
and integrates their
ideologies into that.
And finally, the
last stage, attack.
One of the turning points
for me in writing the book,
and I had pretty much finished the book at this stage, was when, almost exactly one year ago,
the attack happened in
Christchurch in New Zealand,
where the perpetrator carried
out two separate shootings
at two mosques in Christchurch
and livestreamed his attack.
And it was really
shocking to then read
his so-called manifesto,
or the documents
that he had left behind,
and finding so many elements
that I had encountered
a thousand fold in some
of these online networks.
A lot of these subculture
trolling elements,
a lot of satire and
jokes that he referenced,
where he really tried to
appeal to this sub community,
to this relatively
small community,
but international community,
and where even the video
itself was shot from
almost like a first person
ego-shooter angle,
and was an attempt
to get the applause
from that audience.
And this also worked, as we could then see in the aftermath, when several versions of the video were created and circulated, in which, terribly, people then gave points for every Muslim that was shot.
And that also is really
dangerous in terms
of inspiring copycat
attacks, because that
made it all seem like a game.
And you could observe how some
people who were then commenting
on this livestream
video couldn't even
grasp that this was
actually really happening.
So the lines have become so
blurry between what is real,
what is just an online game, and
what's actually the real world.
And there is a big
responsibility for any tech
firm, for any online
platform, I think, yet
to also consider these new
forms of online identities,
online communities, and group
dynamics that are taking shape,
and to address some of
these subculture elements.
And for example, the first
comment beneath the livestream
video was saying
"is this a LARP?"
Is this a live action role play?
So you could tell this
person can't really
grasp that this is
actually happening.
And then the next attack happened, I think it was only a few months later, at the synagogue in Poway, and then in El Paso a few weeks after that, and then we had the attack in Germany, in Halle. And now we had the latest attack, in Hanau, last week.
And a lot of these attacks, or
at least the ones up to Halle,
all followed a very
similar pattern,
and even referenced
the previous attacks.
So it was a new form
of copycat terrorism,
very similar to the copycat
or inspirational terrorism
that we saw in
previous years inspired
by ISIS networks, where we also
had one attack after another
with similar elements.
And the gamification elements
were present in almost all
of these far right
terrorist attacks.
After the attack
happened in Poway,
the first comment also
again beneath the post
read "get the high score," again
in reference to video games.
So there is this
recurring pattern,
and there is a real danger of
gamified terrorism occurring
not just on the far right
side of the spectrum,
but also in other forms, in
other extremist movements.
Christchurch was really
the first example
where we didn't just see
gamified recruitment.
ISIS had previously used that in campaigns, or gamified propaganda, sometimes even using language like the "call of duty" when referring to jihad, mixing gaming language with jihadist language.
But this was the first
case of the actual act
of terrorism being gamified.
The other future threat that I want to point to comes from having been in one of these ISIS hacking groups. They were actually not very sophisticated, but there were two teachers in this group who tried to teach us hacking skills, to then hack Western infrastructure.
And everything pointed to the fact that they were also involved in the hacking of hundreds of schools in the US.
Whenever I saw
something threatening,
I obviously forwarded it to
the responsible authorities
to investigate.
But there were reports then
of up to 800 US schools
who had their websites
hacked and replaced
with ISIS propaganda.
And this was quite easy, low-skill hacking.
But there are, of course, so many websites, and also infrastructure devices in general, that still use default passwords, like admin12345, which they were able to exploit.
And in this case, no one died,
nothing serious happened.
It's still bad enough, I think: no one wants to send their child to a school that had just been hacked by ISIS, so
it did create that fear
on the psychological level.
But there could also
in the future of course
be more severe hacks,
hacking campaigns
carried out by
extremists that would
go against critical
infrastructure.
So these are, I would say,
some of the future threats.
Other threats would come from other new technologies, especially deepfake technologies, where it's possible to completely manipulate videos after the fact and make politicians say something they never said,
or AI-based text production
tools, where you can produce
texts that look exactly
like they were written
by a famous journalist,
whereas actually they were
completely AI-produced.
So these things can, of course, really be exploited and used by extremists to lead misinformation and disinformation campaigns.
And finally, I think
in terms of solutions,
I know this is all quite
a pessimistic outlook,
but I am quite positive
in the long run.
I think I'm quite
pessimistic in the short run,
because we're still in the
very early stages of tackling
this problem.
I know that Google and YouTube
have taken steps, especially
in the last year, even just
going on my extremist YouTube
accounts, I can see
a difference in terms
of the recommendations, the
content recommendations.
I can see that there
have been some changes
with some extremist
content being taken down.
Actually, across a lot
of the bigger platforms
this has happened, but
there is still a lot of work
that needs to be done.
And at the Institute
for Strategic Dialogue,
we've also worked with Google, with Facebook, and with policymakers to tackle this challenge.
Ultimately, I do think there
is a responsibility for tech
platforms to really
take proactive action,
and I'd love to see more
of that, also for example,
supporting civil
society-led initiatives,
that might tackle some of what
we find in the gray zone areas,
where we can't
really use policies,
because we don't
want to have policies
that might look
like infringements
on freedom of speech.
So there is a lot of
content that, of course,
can be considered
harmful but doesn't
fall into an
illegal space, where
I think there is a real role
for companies like Google
to play to support some of
the civil society initiatives.
I know that Google is
already doing some of this,
but there are always
new pilot projects
that I think need
a lot of support,
especially new intervention
models that might also
tackle some of these online
subcultures on the fringes
or in the darker
corners of the internet.
But this is such a cross-platform threat: some of it might actually be beneath a YouTube video, where there might be a link shared to a smaller platform that then leads people to go there.
I saw this quite often: people were recruited on the main platforms and then led into these darker echo chambers and into these encrypted forums.
So yeah, on a positive note,
I think in the long run,
this is something
we can tackle, and I
do see lots of great
civil society initiatives.
And we've also worked with you, with Google, on the global Impact Challenge.
And I think that there
are initiatives that
are appearing that are good.
But we still need to put a
lot of effort and, I think,
thinking into how we tackle
these online extremist echo
chambers.
Thank you.
[APPLAUSE]
Yeah, questions?
AUDIENCE: When you have
to immerse yourself
in so much darkness, how
do you maintain who you are
and your own beliefs?
How do you stay
positive, basically?
JULIA EBNER: To be very
honest, it was sometimes
difficult to distance
myself, and also
to switch between the
different identities.
Because, of course,
especially remaining
a presence in these online
chat channels also means
it's hard to then
be away for a week,
or completely step
away from this.
During the time of the different
undercover investigations,
I had to be present
in the groups.
So that was, and still is, challenging.
And I think a lot of people who deal with these dark subjects, also when monitoring propaganda materials, underestimate the long-term creeping impacts and traumatizing effects this can have on our psyche.
In the end, it
also gave me a bit
of hope, because I could also
see the human dimensions.
Even in the most extreme of
the extremist channels and chat
groups and among the very
extremist propagandists,
I could still find some
kind of common ground, where
I think this could
be really used
as a starting point
for new intervention
models in online spaces.
Also one of the
aims of the book was
to essentially also show that
these online subcultures have
a lot of human
dimensions to them,
where hopefully we
can start at these
and also address some of
the fears that are very
present in these communities.
But yeah, I agree, I think at times it was difficult,
and sometimes I also thought,
"maybe I've gone too far.
Maybe I should stop this
whole project right now."
AUDIENCE: Yeah, did
you, in your experience,
ever find the experience
actually seductive,
like you started to
sympathize, or you
started to feel like you
were going down a path?
The reason I ask is, I guess
if you consider yourself
someone who thinks
about all these things,
you wouldn't be swayed
by it, and it's normally
the vulnerable or someone
looking for something,
but is it actually
that this kind of thing
is seductive to everybody,
and you felt sometimes
you might go too
far, as you say?
JULIA EBNER: Absolutely.
Even before starting
this research,
I didn't really believe
in any kind of profiling
or in extremist profiles.
I still think that
anyone can be attracted
to different forms of
extremist movements
in weak or vulnerable moments.
But even more so, I started to really feel that and get that sense when, at one point, I was, I would say, not on the verge, but I could feel how I was being drawn into a community.
And I briefly just mentioned
them in the beginning,
but there was that female
misogynist community.
I mean, I consider myself
quite liberal and quite really
on the other side of the
ideological spectrum,
and yet when I spent a lot of time in this community, they were addressing grievances and topics that I could relate to.
It was not like the jihadist channels or the neo-Nazi channels, where I couldn't really identify with much of what they were discussing; in that channel, they were talking about challenges that matter to me as well on an emotional level.
I was also in quite a
vulnerable position at the time
when I did this
particular investigation.
So they were, for
example, addressing
double burdens in modern
lives of women today,
or the challenges of the so-called hookup culture, or of online dating apps.
These were things that, yeah,
also seemed like a challenge
to me.
And the way that they
provided really easy solutions
and also formed a community
where women helped each other,
I can see how this
can be appealing,
especially at difficult
times, but for anyone.
In all of these
groups, there were
people from different
educational backgrounds,
from different
socioeconomic backgrounds,
also from different age groups.
Of course, some groups
focus more, for example,
on younger audiences.
Some groups try to tap into
the socioeconomic injustices
and would rather address
working class communities.
But overall, there
seemed to be no profile.
AUDIENCE: Thank you
for your presentation.
It was very interesting.
And honestly, I
admire the courage,
because we all have curiosity
but not enough courage
to investigate on this.
My question was
regarding the screenshot
that we saw when you went
to interview with Generation
Identity, I think,
when they discovered
that you were Julia Ebner.
So how did it happen
that they discovered you?
Was it at the moment when you
were interviewed, or after?
And when your name was
exposed on the social media,
did you get any threats?
And the second
question, I assume
you get loads of threats,
but how do you cope with it?
Do you get paranoid when you
get a huge amount of threats?
And did you have situations when
you had to go to the police,
fearing for your life?
Thank you.
JULIA EBNER: I definitely
did get paranoid at times,
and I did have quite a lot
of campaigns in the past.
Whenever there is
a new publication,
I do already mentally prepare
for new waves of threats.
Usually it comes and
goes in waves.
Yeah, usually, I mean,
they can range, right?
They can be harassment
and trolling campaigns,
or they can be serious
death and sexual threats.
And I've had both.
And I think there are quite
a lot of civil society
organizations that
offer help now.
For example, in Germany
there is an initiative
called Hate Aid
that's actually trying
to help vulnerable people.
And that's now journalists,
political activists,
even artists, who might become
targeted by these harassment
or intimidation campaigns.
I do think that there
is a real need for that
in other geographies as well.
But after being
exposed, and that
happened a few times, it's
not a nice feeling, of course,
and there was a campaign.
But I did know that there
was always an expiry date.
In the Generation Identity
case, they actually exposed me
quite quickly after
these meetings,
because I had then published
a report at ISD looking
at Generation Identity
more generally,
and at their mobilization
campaigns.
And they read-- they read
pretty much everything
about themselves, because
branding and reputation
and their image matters to them.
So they matched my name
with my fake identity.
And not all of the
extremist movements
that I looked at or went
undercover with made it
into the book, because for
some of them I was kicked out
of the online channels too
quickly, after they exposed
my identity or exposed that
I was an infiltrator.
AUDIENCE: So just following
on from that point,
can you talk a bit about how
and when you reported activities?
How did you balance the need
to report potentially illegal
activity you came across with
the need to protect your own
anonymity, and also not give
away the fact that there was
a mole in some of these groups?
JULIA EBNER: So in general, I
would report anything that's
a concrete threat.
If I saw someone
plotting an attack
or a hacking campaign
or something like that,
I immediately reported it
to the responsible authorities,
if I could tell which
geography they were in.
That's not always
straightforward,
because of course,
sometimes you just
don't know where these
people are even based.
And I also reported
to the tech firms.
So sometimes I
flagged content if it
was being spread
on a big platform
and I wanted to reduce
the damage.
In general at ISD,
we have a policy of flagging
violence-inciting
content to the tech platforms
on which it appears.
Then, when reporting about
extremists in general,
I try to protect everyone's
privacy, especially
that of non-public figures.
I wouldn't mention
their full names.
I wouldn't mention any
details, any personal details,
because I also want to leave
the option open for all
of these people to actually get
back into mainstream society.
And I do think it can have
a counterproductive effect
if the media reports
their full names,
which gives them an
additional platform
but also prevents some
of the newer members
from actually leaving the
movement, because their faces
are already out there.
AUDIENCE: So it's really
positive to actually hear
you saying you're optimistic
about what's to come
and all the
preventative things that
are being done by companies
like Google and so on.
And you touched on
it, actually, in terms
of people who are trying to go
back into mainstream society.
And there have been instances
in the media of people
who have tried to, and they
face loads of backlash.
So my question is,
is there anything
that you think that maybe
companies or even individuals
or communities can
do to help those
who are trying to leave
these extremist groups
and go back into
mainstream society
without facing this backlash?
JULIA EBNER: You mean the normal
people, people on social media,
like civil society, or users?
AUDIENCE: Yeah, just
in general, or just
even like tech companies.
JULIA EBNER: I'd say one of
the goals with the book
was also to raise awareness
about the different techniques
that are used by extremists,
so that everyone is able
to protect themselves
from buying into a
piece of disinformation,
or from being targeted
or impacted by manipulation,
radicalization, or
intimidation campaigns.
But beyond that, I do think that
everyone has a responsibility,
just like we intervene
sometimes in offline settings
and we show a civil courage
in offline settings,
to show the same amount
of civil courage online.
And I'm not seeing that
yet to the same extent.
In most scenarios,
someone would intervene
if a person is harassed
or attacked on the tube,
for example, on the basis
of their race, religion,
or whatever.
Most bystanders would somehow
intervene, say something.
And that's not really
happening to the same extent
yet on social media
or in online spaces.
So I hope that in the
future we'll see a bit
more online civil courage.
AUDIENCE: Thank you.
This is more sort
of a statement,
but I'm curious in your
opinion, is the problem here
the fact that we have extremist
point of views as a species,
or is the problem here the
fact that they're secret?
So, i.e., if we didn't let
these people be secret,
and actually welcomed them
into society and accepted
that people will have these
extreme points of view
and that's just the way life
is, would it then be that,
because they're
not secret anymore,
it doesn't build up
through each stage,
if you know what I mean,
ultimately into an attack?
Again, it's just a thought.
I mean, yeah.
I welcome your thoughts.
JULIA EBNER: It's
difficult. I think
it's a very tricky relationship.
There have also
been initial studies
that tried, for example,
to look at the correlation
with having these views
in more mainstream settings,
with far right populist
parties sometimes even
echoing these ideologies
and using the same language
in parliaments.
I would say overall from the
observations that we've made,
it usually gives the really
extreme fringes a platform.
It gives them justification,
additional legitimacy.
It usually helps them to
further spread their ideologies
and to further also
incite to violence.
But there are studies that
say the opposite is true,
that providing them with
a political solution
might decrease the risk
of people resorting
to violent means.
I haven't seen enough
evidence to back that claim,
to be honest, but
it's a very, I think,
multifaceted relationship.
In the end, I think
it is about some
of the conspiracy theories
that clearly propagate
an existential threat to
an in-group, which might then
motivate individuals to resort
to violent action, because
they would see it
as a defensive action.
That can take the form
of a defensive jihad, done
in response to a perceived
threat from the West
against Islam, or the form
of the accelerationist
ideologies that the far right
and the white supremacist
movements have shown, where
they try to accelerate what
they would see as a coming
race war.
Or it can take the form of the
existential threat of the idea
of a great replacement,
which was actually driving
some of the latest far right
terrorists, who think
that white populations are
being gradually wiped
out or replaced by people
with migration backgrounds.
I mean, these conspiracy
theories are dangerous,
because they are presenting an
idea of an imminent threat that
must be countered with action.
And I don't think it's
a good idea to give
them an additional platform,
or to have them
normalized within society.
I do think it's important
to tackle those.
But the boundaries
are fluid, right?
It's such a difficult
question of where
to draw the line, of what should
still be considered freedom
of speech.
It was the same within those
female misogynist communities,
where you could argue
that some of their views,
such as being ultra
conservative or wanting
to go back to the traditional
gender roles of the 1950s
or earlier, are views
anyone is free to hold.
But some of them
were also feeding
into the misogynist networks of
the manosphere movement, where
you could see that, for example,
the incels movement has given
rise to violent attacks
and to terrorist attacks
against women.
And some of them were also
endorsing domestic violence.
And these things can have
real world implications,
and can be quite tricky on
a threat level, I think.
AUDIENCE: So as you
continue educating yourself
and become an expert within
this area, I assume--
I don't want to
assume, but I assume
that you're learning how to
digest it and cope with it.
And as you've just been able
to explain it, to a lot of us,
it's very mind blowing,
some of what we've heard.
What has been, or do you
think could potentially
be the most mind blowing
experience as you already
are very familiar with
this and understand
the psyche behind it?
I'm curious, what would
still shake you up,
if you learned about or saw?
JULIA EBNER: Hm.
That's a very
difficult question,
because it's usually the
things that are unpredictable,
in a certain sense, that were
the most mind blowing.
In the case of this
research, for example,
the female misogynist community
was something that I found
shocking, because I'd never
encountered it before.
I thought a female misogynist
sounded like an oxymoron to me.
But then also when Christchurch
happened, at the Institute
for Strategic
Dialogue, we'd been
warning of the
gamification elements that
were used in extremist
movements for years,
and also had been
warning about some
of the hijacking
of trolling culture
and memes by
extremists, but seeing
that really manifest itself in a
real world attack of that scale
was something I found
really shocking.
In that sense, it
was not a surprise,
but it was still
such a shock.
So when I look
at the future threats
that could be foreseeable
or predictable, I
would say that a large scale
hack that really targets
critical infrastructure
would be quite shocking,
and would have a huge
impact, of course.
I don't really see this
happening in the near future,
because I don't think
there are that many people
within those extremist
networks that have the hacking
skills to do it,
but the same could
be true for really
sophisticated use of deep fakes
and really running large scale
disinformation campaigns.
But I'd say that
government actors
are more likely to have the
capacity and the knowledge
to run such campaigns.
I mean, obviously governments'
use of these technologies
is something also that I
wouldn't underestimate.
I actually consulted 10
counterterrorism experts
on what they would see
as the most likely
future threats.
That's almost the last
chapter; the last chapter
itself focuses on the more
positive side, on solutions.
But a lot of them
were also warning
of government-led terrorism
or state-led terrorism.
And there might also be
new forms of terrorism.
To be honest, an
anti-tech movement
is something I consider
increasingly likely, seeing
how much hatred has built
up against big tech firms,
against technology in general,
because so many people have
a grievance, and often
rightly so, of course,
but that could also turn
into a more violent movement.
And yeah, these are things
that we might not necessarily
see coming in the
next few months,
but maybe in the future.
AUDIENCE: I just wonder what
your views are in relation
to whether you think, as
a society, the mainstream
media, etc. might have a blind
spot for right-wing supremacism
and extremism, that
we are focusing excessively
on Islamist
fundamentalism, and that we
might be normalizing some of the
more extreme views of the far
right?
JULIA EBNER: Yeah, I'd
say in many countries
we're seeing a shift.
Over the last year especially,
looking back since Christchurch,
we saw a lot more policymakers,
media outlets, and tech firms
step up their efforts
in terms of countering
white supremacist
and far right extremist
propaganda and campaigns.
There is still a big
time lag, though, and I'm
wondering what we're overlooking.
The problem is that, of course,
it's such a fluid space,
where extremists move
to different platforms
quite quickly, and
build up new networks.
And the problem is also that a
lot of the security-led efforts
and policies usually
focus on national borders,
whereas most of these networks
are so international
that they could rather
be compared to ISIS-style
cell networks,
or to al-Qaeda-linked networks.
And that's a problem,
because most countries'
definitions, actually
all of the countries'
definitions that I'm
aware of, currently
still consider far right
extremism a national,
domestic problem.
And even, for example,
on the terrorist list
of the United Nations,
you only see al-Qaeda
and ISIS-affiliated movements.
You don't see far right
extremist global networks.
But also we're
seeing new structures
of these networks that are more
loose international networks,
not really traditional
hierarchical movements
with local roots,
or not in all cases.
And that requires a
whole new definition
and an international
approach to this.
Because otherwise, it's
a bit like in the case
where National Action was
banned here in the UK.
Actually, now the
home secretary just
announced the bans of two other
movements today or yesterday.
But when National
Action was banned
as the first far right terrorist
group in the UK in 2016,
Twitter blocked all of the
National Action propaganda
for the UK, but people
could still circumvent that
by using VPNs, and people
in the US, from Atomwaffen
Division, for example,
could still access
that material,
because it was a
national response
to an international problem.
AUDIENCE: I just think that
even the language that we use
is important.
And if there is an
Islamist attack,
everybody very quickly
labels the person,
the individual as a terrorist.
And when there are
attacks from the far right,
as there have been
very recently,
we are very hesitant to
label people as terrorists.
We tend to identify them as
lone wolves, or individuals
with mental health problems.
And I view that as
deeply problematic.
JULIA EBNER: Absolutely.
I completely agree,
and that was also
the case in the recent attacks
even last week in Hanau,
in Germany.
Of course, mental illness
might always play a role
for a high proportion
of terrorists,
also on the jihadist side.
But in this case, that dimension
was highlighted in some media
reports, and also, yeah,
by some political voices,
whereas actually, when
looking at the materials,
they were explicitly
ideologically rooted
in some conspiracy theories.
The letter that he left behind
contained clearly racist
and even genocidal language
and references.
And I'd agree that
it's dangerous to have
an inconsistent approach
to naming and highlighting
specific topics and angles
for one kind of extremism,
but not for the other.
And yeah, I'd say
that's a big part
of the problem, one that is
of course exacerbated
by more and more far right
populist views being
represented in leading government
positions and in parliaments.
Thank you.
[APPLAUSE]
