So let’s talk about facts.
A fact is something that is actually, objectively true.
But sometimes, you might not want people to
know those objective facts - maybe they make
you look bad, or you have something to hide,
or maybe you’re just trying to protect people
from something that you don’t think
they should know.
Whatever the reason, when you want to filter
the truth to fit your needs, some propaganda
techniques are pretty popular.
I’m Moti Lieberman, and this is The Ling
Space.
People have probably been bending the truth
for as long as there have been people.
As the saying goes, history is written by the victors.
Tweaking the story to make you look good just
comes naturally, and even little kids will
try to lie to get out of trouble.
There’s been some interesting linguistic
research on lying, and we might come back
to that in a later episode, but for now we’ll
just focus on the idea that there are different
kinds of ways to lie.
Sometimes it can be pretty direct: if Katniss
tells people that she’s going to have a baby,
when she knows full well that she’s not
pregnant, she’s saying something that she
knows isn’t true: she’s making a statement
while believing its opposite.
That’s a pretty bald-faced lie, but it definitely
has a place in the propaganda toolkit.
When people talk about “fake news”, for
example, this is usually the type of out-and-out
fabrication that they’re talking about.
The term “fake news” has actually been
around for over a century, but it’s become
a lot more widespread lately, to mean just
this: stories made up of whole cloth and presented
to the public as news.
Satire websites like The Onion do this
as a joke, but fake news often has a less-than-funny
ulterior motive, like making someone look
bad or serving as false evidence for a point
that you’re trying to make.
It can be tricky to tell fake news apart from
real news, especially if you’ve already
got an opinion going in, as most of us do.
But ultimately, checking sources and doing
a little research can take you a long way
towards the truth.
This gets harder, though, when we start to
look at other ways of lying.
For instance, instead of making up stories
entirely and packaging them as truth, you
could take actual facts or events and just frame
them in a way that shapes people’s opinions.
For example, let’s say that every year you
organize a brutal fight to the death between
a bunch of teenagers.
Harsh, right?
So how can you make people see that in a good
light?
Well, spinning it like a fancy high-stakes
reality show, and giving your lucky contestants
celebrity status, could go a long way.
You’re not really hiding the facts - everyone
knows that there will only be one survivor - but
you can sway people’s opinions by framing
it in a positive way.
With enough spin and glitter, your audience
will cheer for the game, rather than be appalled
at the brutality.
But even this kind of spin doctoring is pretty
blatant compared to some of the most effective
untruths that we see every day.
Let’s imagine that my friend Haymitch doesn’t
drink alcohol.
If I say “Haymitch isn’t drunk today”,
well that’s true - he never is.
But does it seem like that’s what I’m
trying to say?
Or do you get the impression that his
sobriety is somehow an exception?
After all, why did I say “Haymitch isn’t
drunk today”, if what I meant to say was
“Haymitch is never drunk”?
The reason you’re probably reading more
into it than I explicitly said is grounded
in conversational rules that we’ve discussed
before, the maxims developed by Paul Grice.
The Maxim of Quality assumes that someone
who wants to have a cooperative conversation
will try to tell the truth.
A sentence like “Haymitch isn’t drunk
today” plays fast and loose with this maxim,
but it doesn’t technically break it.
I’m still saying something that I think
is true.
But I am breaking a different rule: the Maxim
of Quantity, which says to give as much information
as is needed, and no more.
So since “Haymitch is never drunk” is
a perfectly good sentence that I could have said,
which would have been the most concise and
transparent way to express that idea, I must
have had a reason not to say it.
By highlighting the fact that he’s not drunk
today, what I’m doing is implying that he’s
drunk at other times.
And since what I’m implying isn’t true,
this is called a false implicature.
So even though I only told the truth, a lie
about Haymitch’s drinking habits gets generated
by your brain and makes a home there.
You come away from the conversation with lies
in your head that I didn’t even say.
Tricky!
You can imagine that this is a pretty powerful
tool if you’re trying to convince someone
of something without getting in trouble.
Since they’re more discreet than flat-out
lies, false implicatures show up a lot, everywhere
from everyday conversation to advertising
to political speeches.
And since they play on assumptions that people
have about the way that we talk to each other,
they’re a very effective propaganda tool.
Another way to get people thinking what you
want also cashes in on universals about how
our brains work.
Everyone has inherent biases programmed into
them, from society, experience, family, media
- just living in the world.
That’s because our brains categorize information
into generalities, all the time, just to make
sense of all the impressions that come our
way every day.
This leads to what’s called implicit bias:
assumptions about the way things are that
we might not even be consciously aware of.
For example, if in your society being good
at math is often considered a sign of someone
being smart, then you might end up assuming
that anyone who’s smart is also good at math.
In reality, there are all kinds of
ways to be smart, but your brain uses math
as a convenient shorthand, saving you the
time it would take to approach every
new person you meet without any preconceived
assumptions.
And although this is really unfortunate for
human interactions, because your brain is constantly
pre-judging people, it’s the same mental
reflex that does things like tell you that
that fruit growing on a tree is probably okay
to eat, or that a growling dog could be bad news.
Our instincts have an evolutionary basis - they just
don’t work out so well when dealing with other humans.
Unfortunately, because brains are really predictable
in this way, capitalizing on people’s inherent
biases is another common way for propaganda
to try and mess with us.
Let’s say the country you live in is divided
into a bunch of isolated districts, with very
little communication between them.
And let’s say that you, the corrupt leader of
this country, want to keep them separated,
to make sure that the masses don’t
rise up against you.
How do you pull that off?
Well, one way you could do it is make them
compete for resources and recognition in a
fight to the death.
Or you could take it down a notch, and play
them off each other psychologically, by drawing
on something called the Fundamental Attribution
Error.
You can think about it this way: if your friend’s
short-tempered with you, you might assume that
they’re having a bad day, but if a stranger’s
short-tempered with you, you might end up
thinking that they’re just a mean person.
Fundamental Attribution Error happens when
you ascribe someone’s behaviour to their
internal nature, rather than their external
circumstances.
And it’s a lot easier to fall into that
cognitive trap when you’re dealing with
someone that you don’t really know.
You can probably already see how this could
be used for propaganda.
Let’s go back to that hypothetical corrupt
leader who wants to keep the districts fighting
against each other rather than united against
him.
Now, each community has a wide variety of
people in it, with complex dreams and personalities
and skillsets.
But if wealth and labour are divided up unevenly,
and certain districts are said to have certain
traits, then you’re setting the stage for Fundamental
Attribution Error to rear its ugly head.
If your rich-district buddy falls on hard
times, you’ll be much more likely to blame
the circumstances rather than the person.
But if you hear your peers talking about an
impoverished district, you might easily assume
that a lack of ambition or ability is responsible
for their hardships.
And the problem gets even more pronounced if
you prevent people from different districts
from interacting: instead of getting to know each
other as complex human beings, all you have
to work from is your implicit bias.
And it’s hard to challenge your biases when
they’re being reinforced by a propaganda machine.
There’s one more way that people can manipulate
information to specific ends.
Instead of making stuff up or capitalizing
on preconceived ideas, you could
just selectively omit information from existing
material, or prevent unwanted content from
being released.
When this is done by a government, it’s
called censorship.
Censorship is one of those words that gets
used in a lot of different ways to mean a
lot of different things, so we need to be
clear about what it isn’t.
Censorship shouldn’t be confused with boycott
or objection; there’s a difference between
an idea not being given a podium and an idea
being banned.
So it’s important to remember that rules
protecting freedom of speech apply equally
to someone who wants to express an idea and
someone who thinks that that idea is objectionable,
and speaks out against it.
The right to speech doesn’t mean the right
to an audience; anyone is allowed to talk,
but that doesn’t oblige anyone to listen.
What censorship is, then, is the selective
banning or removal of content based on some
criteria, and although a lot of governments
have rules preventing themselves from censoring
content, that isn’t always the case.
So let’s say that you’ve managed to keep your
different districts isolated and distracted
by a bloodthirsty annual battle royale.
Censoring any content that might unite them
could help you keep it that way.
Censorship can have the same aims as propaganda,
like political gain or manipulation, but a
lot of the time it’s actually presented
as a way to protect people, especially youth,
from objectionable ideas.
Even in places where freedom of expression
is a big deal, there are often rules against
hate speech, for example, and governments
that otherwise might not get involved will
censor content that incites hatred and violence.
Some places also choose to shield young people
from content that’s sexual or profane,
like bleeping out swear words on TV or the
radio.
Either way, though, censorship implies someone
with authority imposing their judgment of
what’s acceptable onto other people.
So there are lots of ways we can get messages
out there, some more honest than others.
Even if the odds are stacked in your favour,
you never want to take your eye off the facts.
So, we’ve reached the end of The Ling Space
for this week.
If you didn’t censor our video, you learned
that propaganda can present as lies, like
“fake news”; that false implicatures can
plant untruths into people’s heads while
only saying true things; that propaganda taps
into the biases that we all carry, often using
the Fundamental Attribution Error; and that
censorship happens when governments prevent
content from being released, often in the
aim of protecting youth.
The Ling Space is made by all these amazing
people over here.
If you want to learn more about other ways
language can mislead, check back on our website!
And while you’re there, why not check out
our store?
We’re also on Tumblr, Twitter, and Facebook,
and if you want to keep expanding your own
personal Ling Space, please subscribe.
See you next time! Daraa uulzii!
