Time to unwind with a little Facebook.
Facebook continues to be under fire
for failing to crack down on fake news.
It's been called a haven for fake news.
Russian trolls used Facebook to exploit racial
tension.
Did you fall for propaganda from a Russian
troll?
Jesus.
Facebook is dead.
YouTube!
YouTube is the latest social media company
under the microscope.
Ads running on YouTube channels that
promote white nationalism.
Trending tab featured a conspiracy video
that attacked survivors of the Parkland shooting.
God.
Never mind.
Time for thirst traps on Instagram.
Russian bot ads appeared on Instagram.
Thousands of ads posted on Instagram
meant to divide Americans.
Okay, that's it.
I'm going back to my true safe space.
The one place where nobody can hurt me.
Pinterest.
OH.
MY.
GOD.
WHAT HAPPENED TO PINTEREST?
There's nowhere to go anymore.
Every social media site has become a dumpster
fire.
Why is this happening?
What's turning all these sites into fever
swamps?
Aw crap.
It's me, isn't it?
Tech’s biggest companies are once again
the platform for conspiracy theories.
Hate speech.
Fake news.
Forcing YouTube and Facebook to apologize.
That was a big mistake, and I'm sorry.
Executives admitted their limitations
but promised they would do a better job.
Let's start at the OG misinformation platform.
No.
No.
What?
No.
The human brain.
Humans are social animals at their root,
and we're constantly looking for reinforcement
signals,
signals that we belong.
Jay Van Bavel researches what kinds of information
humans respond to on social media.
Van Bavel did a study last year tracking
what kinds of tweets were more likely to go
viral
when it came to divisive issues like gun control.
And he found that tweets that used moral/emotional
words,
words like "blame," "hate," and "shame"
were way more likely to be retweeted
than tweets with neutral language.
One of the reasons we think this is happening
is because
when you're using that type of tribal language
it sends a signal about who you are,
what you care about,
and what group you belong to.
This makes sense if you think about it.
If you're trying to signal to others that
you're
a real Ariana Grande fan,
you don't say, "I personally enjoy Ariana's
music."
You say, "Ariana Grande is the best singer
of all time.
Anyone who disagrees is an idiot."
You know, like, hypothetically.
Are you playing with imaginary hair?
That tribal desire to organize into "us" versus
"them"
is a basic part of human nature.
It's why we have hardcore nerd fandoms —
Cuz I am a Gryffindor! —
and lose our minds at sports games.
Yankees suck!
Yankees suck!
But when it comes to politics
that desire can push us toward some extreme
views.
One hypothesis is that when people are sharing
the most extreme forms of political content,
that sends the strongest, clearest signal
about what their identity is, and it signals
very clearly
who the outgroup is.
You can see that tendency when you look at
which US senators have the most Facebook followers.
The further left or right, the more followers.
When we rally around these politicians,
it leaves no doubt about which tribe we belong
to.
If they're sending information that's moderate,
it doesn't clearly signify who their ingroup
and outgroup are.
And so there is this incentive structure potentially
to share more and more extreme information
to signal more and more clearly who you are
and who you affiliate with.
But these loud signals of our group status
are way easier to do online than in person.
And that's because in the real world
there are social costs to being a jerk.
So normally when we're talking about things
like politics
with our friends at the bar or with our family
at Thanksgiving
we have social checks in place that send us
signals that
maybe this isn't landing well with everybody.
Maybe it's resonating with your sister or
your brother,
but your mom and dad are giving you the stink
eye.
We get these signals all the time from people
that we're excluding them or that we're rubbing
them the wrong way,
and if we value those relationships
we tend to tone down our language.
And this is where the problem with social
media starts.
Platforms like Facebook, YouTube, and Twitter,
they're designed to do one thing:
keep you on the site for as long as possible.
The more time you spend on the site,
the more commercials, sidebar ads,
and promoted tweets they can show you,
the more money they make.
If your goal is to get the greatest amount
of engagement with an audience,
you need content that's going to be addictive.
In terms of politics and news,
stories that are laden with emotion,
that connect to our identities,
and that are morally arousing are the types of stories
that are going to get people engaged the most.
The problem is getting the stink eye
is a really unpleasant experience.
We don't like being told that we've
crossed a line or gone too far.
And we are less likely to stay on a website
that makes us feel that way.
So social media sites have been designed
to protect us from stink eye.
To cater to our tribal nature by
figuring out what we like and showing us more
of it.
By identifying what products and people and
politicians you like,
they can identify with some degree of certainty
what your politics are and then
feed you back more and more information
that confirms those beliefs.
It's not just algorithms doing this.
These websites invite us to sort ourselves
into tribes.
We get to follow and subscribe to people we
agree with,
block sources of information we don't like,
and literally join groups of people
who think the same way we do.
You get content that confirms your beliefs
and doesn't challenge you,
and then it's a dissonance-free environment.
You don't have to face up to individuals
and people who disagree with you.
"Dissonance-free environment" is a fancy way
of saying
a place where you don't get stink eye.
That study about how polarizing tweets
got more engagement,
it also found that those tweets
rarely left our echo chambers.
We're getting tons of positive feedback
from people who already agree with us.
And all that positive feedback
can push us further to the extreme.
If I share some extreme political content
and it gets a lot of likes, I realize because
I've been reinforced
that that's what people in my social network
like
or that's what's more likely to go viral.
And then I might try to match it
or make my next post even more extreme
to get more reinforcement.
And if you're looking for something more extreme
to share,
these platforms will help you find it.
Watch a few anti-immigrant videos —
This is totally out of control —
and YouTube's algorithm will start recommending
videos about white genocide.
Join a few pro-Trump groups,
and your Facebook feed will fill up with smear
campaigns
and conspiracy theories.
The 9/11 attacks themselves were orchestrated
by the Bush administration.
Even search for information about vaccines
on Pinterest,
and your homepage will be full of anti-vaxxer
bullshit.
And so you can see people potentially
being led down this pathway of more and more
extreme posts.
There is a social reinforcement system
that would otherwise take a long time
through lots of interactions with people in
our community
that can now be done very rapidly
and at an enormous scale.
All of this makes social media sites
goldmines for con artists,
conspiracy theorists, and trolls
who exploit our tribal mentality to get clicks
and views.
Anybody can write a blog, however incendiary,
and
if it has a catchy title or catchy content,
people are going to share it.
Kaepernick is an attention-seeking crybaby
who takes out his perceived oppression
on the flag and national anthem.
I stand for our service members, our veterans,
our LEOs
and our first responders.
Not for the indulgent a-hole who disrespects
them.
Follow me on Twitter and Instagram.
It's a terrifyingly effective strategy.
In 2016, a Democratic strategist said that
when it came to which kinds of ads performed
best on Facebook,
"ugly and incendiary won every time."
It's a real tough life if you say you are
a liberal.
Trump train moving ahead full steam.
It ain't too late if-
The same is true for conspiracy theories and
fake news stories.
One study looked at 10 years of true and false
stories on Twitter.
The authors measured what they called "retweet
cascades":
chain reactions where the original story
is shared and retweeted to a much larger audience.
And when they compared the cascades of real
stories and fake stories,
the fake ones reached thousands more people.
And it didn't matter that these stories
were coming from small accounts.
Anybody could go viral if the story triggered
enough of a tribal response.
In 2016, it was teenagers in Macedonia
making thousands of dollars publishing
fake election news on Facebook.
After the Parkland shooting, it was random
YouTubers
going viral by accusing students of being
crisis actors.
The Russian trolls messing with our elections?
They're not superhackers.
They’re people posting low-quality,
highly emotional content that they know will
go viral.
The Russian playbook exposed the architectural
flaws
in products like Facebook, Instagram, YouTube.
Anybody can run that playbook.
So far social media companies have responded
to this
by trying to punish bad actors.
Facebook has suspended hundreds of pages
tied to a Russian group.
Social media companies are banning Alex Jones.
Twitter has banned Milo Yiannopoulos.
But punishing individual bad actors doesn't
change
the incentives that brought them to the platform
in the first place.
One fake news writer told the Washington Post
that
if Facebook cracked down on his content,
"I would try different things.
I have at least 10 sites right now.
If they cracked down on a couple, I'll just
use others."
And this is why getting rid of Alex Jones
won't fix the problem.
If you ban him, someone else will realize
they can get rich pushing the same type of agenda:
followers, clicks, advertisers, speaking fees,
and other opportunities that are incredibly
lucrative.
The problem with social media isn't
that a few bad apples are ruining the fun.
It's that these sites are designed to reward
bad apples.
And until these companies decide that
there's something more important than getting
people to watch ads,
we're going to keep seeing the worst of human
nature
reflected back at us.
