MALE SPEAKER:
Welcome, everybody,
to yet another
Authors@Google Talk.
Today with us is best-selling
New York Times author Peter
Singer.
He wrote a very important
book on cyberwar.
Our own Eric Schmidt says,
"This is an essential read."
My dear colleagues, I
will repeat this for you.
This is an essential read.
So we do have copies
of the book on sale.
This is what it looks like.
Please go and buy the book,
even if you've seen this talk.
Peter is a great author
in many respects.
But what he did for
cybersecurity and cyberwar,
in particular,
is expand
on a field that is growing.
And that we know is becoming
increasingly important not only
from the infrastructure
standpoint,
but also for
international relations.
I'm going to let Peter
actually take it from here.
Thank you, Peter.
PETER SINGER: Thank you
for the kind introduction.
So it's a little
bit daunting to be
talking on this topic
at a company like this,
because I remember
the very first time
that I ever saw a computer.
My father took me to a science
center down in North Carolina.
And I got to see a Commodore,
if you remember those.
And I took a class
on how to program,
learning an entire new
language for the sole purpose
of making a smiley face out of
the letter m that printed out
on one of those
old spool printers
that you tore the perforated
paper off the sides.
Remember that?
Now since then, the centrality
of computers to my life,
your life, the
entire world, it's
almost impossible to fathom.
We live in a world where
more than 40 trillion emails
are sent every single year.
The first website
was made in 1991.
Now, according to
your own analytics,
there's more than 30 trillion
individual web pages out there.
Moreover, the
internet is not just
about compiling and
sharing information.
It's also having impact out on
the real world via the emerging
Internet of Things.
According to Cisco, we'll see
more than 40 billion devices
internet-enabled over
the next five years,
as everything from thermostats
to cars to refrigerators
to technologies literally not
yet invented or imagined all
come online and
all start to carry
on conversations without us.
So in short, domains that range
from communication to commerce
to critical infrastructure
to even conflict.
98% of US military
communications
go over the civilian-owned
and operated internet.
All of these spaces
are dependent on it.
So we're in an age
of cyber dependency.
But in the short
history of the internet,
I would argue that we've reached
a critical turning point.
And it's because while the
positive side of cyberspace
is rippling out, so too are
the risks, the negative side.
There's all sorts of ways
you can illustrate this.
You can illustrate it
with the raw numbers.
Every second, nine new pieces
of malware are discovered.
97% of Fortune 500 companies
know that they've been hacked.
And the other 3% have
been, too; they just
aren't willing to
admit it to themselves.
More than 100 governments
have created some kind
of cyber military command, some
kind of military unit designed
to fight and win wars in
cyberspace and beyond.
And indeed, the very first
Pew poll to kick off 2014
found that Americans are
more afraid of a cyber attack
than they are of North
Korean nuclear weapons,
Iranian nuclear weapons,
the rise of China, Russia,
or climate change.
So these fears,
they've coalesced
into one of the most
rapidly growing industries
in the entire world.
They've also driven a
massive bureaucratic growth
at the national
governmental level
not just in the United States.
Just earlier today,
France announced
that it was spending
another $2 billion
in its military on cybersecurity
issues and cyberwar.
But also we see it
at the state level.
And even at the
local level, where
you see cities like Los
Angeles, for example,
creating cybersecurity centers.
What all this together means
is that for all the hope
and promise of the
new digital age,
we also live in an era
of cyber insecurity,
if we're really being
honest about it.
And so before I go
much further, it's
at this point I'm
going to try and do
something that's a little
bit counterintuitive,
but will maybe help make that
point about cyber insecurity.
And a lot like the
challenge of trying
to write a book
about cybersecurity
and make it
interesting, you also
have the challenge of how
do you give a talk about it
and give visuals that
make it interesting.
So what I did-- and
with Boris's help,
hopefully it will
play for us here
--is I've assembled
what I think are
some of the best
illustrations of cyberwar art,
and some of the worst
illustrations of it.
And it's going to
play in front of me.
I'm not going to speak to it.
It's just going to continue
to flash for a couple reasons.
One, to tell that story
of cyber insecurity.
But also because data
has found that you're
60% more likely
to retain what I'm
saying if you look at a picture.
Even if the picture has nothing
to do with what I'm saying,
it's just the way
we humans work.
And that actually goes
to a broader lesson
that the book explores, and
we'll talk about it later on.
Which is that we're humans,
we're strange, we're weird,
but that's what drives
all of these things.
So let's pull back on
all this and wrestle
with the question of why a book
on cybersecurity and cyberwar,
and why now?
There are two quotes
that motivated
me that basically
encapsulate this.
The first is from
President Obama,
who declared that
cybersecurity risks pose quote,
"The most serious economic
and national security
challenges of the 21st century."
The second quote is from
the former CIA director,
who said quote, "Rarely has
something been so important
and so talked about with less
and less clarity and less
apparent understanding."
And you can see, I really
do want to talk to this one,
but we'll keep moving on.
So let's explore this gap.
We see it in all
sorts of fields.
From the 70% of
business executives--
not 70% of CTOs,
CSOs, CIOs --but 70%
of business executives in
general, in any industry, who
have made a cybersecurity
decision for their company
despite the fact that no
major MBA program teaches it
as part of your normal
business management
training and responsibility.
That same kind of
gap in training
happens at the schools where we teach
our diplomats, our lawyers,
our journalists, our generals.
Or anecdotes.
And there's just
an array of funny,
but in a certain way
sad, anecdotes that
populate the book.
From the opening of the book
where a Pentagon official is
telling us how
important this all is,
but he describes it
as "this cyber stuff."
When you can only
call something stuff,
but you know it's
important, that's
not a good place to be in.
Or the former Secretary
of Homeland Security,
the agency that is ostensibly
in charge of cybersecurity
on the civilian side
for the United States--
who has actually now taken over
as Chancellor of the university
system out here in California
--who proudly talked
to us about the fact that
she doesn't use email.
And in fact hasn't used social
media for over a decade.
Not because she doesn't
think it's secure,
but because she just
doesn't think it's useful.
That same phenomenon is happening
in the Judicial Branch.
Where, for example, a
Supreme Court Justice
talked about how they
quote, "Hadn't yet
gotten around to email."
Now this is obviously
worrisome to folks
here working on
the Gmail account.
But there's a broader
question of what
does this mean for Justices
that in the upcoming year
are going to decide everything
from maybe net neutrality
questions to the legalities
of some of the things
that the NSA was doing when they
just haven't yet gotten around
to email.
The cyber stuff problem is not
just an American phenomenon.
We saw the same
thing in meetings
with leaders in China,
UAE, France, Great Britain.
The Head of Cybersecurity
in Australia
had never heard
of Tor, obviously
a critical technology
in this space.
Now the result is
that cybersecurity
is as crucial to areas as
intimate as your personal
privacy, to the security
of your bank account,
to as weighty as the future
of world politics itself.
But it's been
treated as an issue
only for the "it"
crowd, for the IT folks.
In turn, the technical
community that
understands the workings of
the software and the hardware
hasn't dealt very
well with the wetware,
with the human side,
and particularly
the ripple effects of
this into other worlds,
be it policy, law,
war, you name it.
They've often
looked at the world
through a very specific
lens and failed
to appreciate some of the
broader pictures out there.
Now the dangers of
this are diverse.
Each of us, in whatever
role we play in life,
must make decisions
about cybersecurity
that shape the future
of the world well
beyond just the online world.
But too often we do so
without the proper tools.
Basic terms and
essential definitions
that define both what's
possible but also what's proper,
what's right and wrong, are
missed or even worse distorted.
Past myth and future hype
often weave together,
obscuring what actually happened
with where we really are now.
And so the result is that
some threats are overblown
and overreacted to, and
other threats are ignored.
So for example, as
someone who loves history,
it absolutely pains
me when I hear
people-- and people
who have done
this range from senior
government leaders
like senators to generals
to prominent news columnists
--describe how we are in a
parallel to the Cold War.
Or as a cabinet
official told me,
that malware was
"just like a WMD."
And that's why we
needed to approach it
in the same kind of
deterrence theory
that we used in the Cold War.
What these people
fail to appreciate
is the parallel to
the Cold War is not
the one they think
they're making.
If you understand
both the historic side
and the technical side, the
best parallel to the Cold
War is actually
those early days, when we
didn't understand well either
the technology or, even more so,
the political dynamics that it
was driving-- the period of time
when we took the real-life
versions of Dr. Strangelove
seriously.
So as an illustration
in the book,
we explore the episode where
the US Air Force actually
had a serious plan
to nuke the moon
to show the Soviets
that we could
do interesting
stuff in space, too.
Those are not historic lessons
we should be drawing in terms
of the how-to's.
But that's often what
the discourse is.
Let me go into some
of the manifestations
of this disconnect, and how they
play out, and why they matter.
One in particular is that we
often lump things together
that are unlike, simply because
they involve zeros and ones.
So take that idea
of a cyber attack.
General Alexander,
who is simultaneously
the commander of US military
Cyber Command and double-hatted
as the head of the
NSA-- and there
are some very interesting
problems with that.
But let's move beyond that.
He testified to Congress,
quote, "Every day,
America's armed forces face
millions of cyber attacks."
But to get those
numbers he was combining
a variety of like
and unlike things.
He was combining everything from
probes and address scans that
never entered networks
to unsuccessful attempts
to get in that ranged from kids
carrying out pranks to attempts
at political protests to
attempts to get in to carry out
some kind of theft
or active espionage.
But none of those millions of
attacks was what his listeners
in Congress thought
he was talking about,
which was the so-called
cyber-Pearl Harbor
or cyber-9/11, which has
actually drawn over half
a million media and government
speech references.
And that's what his boss
as Secretary of Defense
was warning everyone about.
Essentially what we're doing
is that we're bundling together
all of these activities simply
because they involve software.
Which would be a lot
like bundling together
the activities of a group of
teenagers with firecrackers,
a group of political protesters
in the street with a smoke
bomb, James Bond with
his Walther PPK pistol,
a terrorist with
a roadside bomb,
and a Russian cruise missile,
and saying these are all
the same because they involve
the chemistry of gunpowder.
We've bundled them together
on the digital side,
because they all
involve the internet.
Or take the organizations.
I had a senior US
military official
argue with me that Anonymous and
Al Qaeda were the same thing.
Now, however you come down on
Anonymous-- and I'm actually,
I guess far more
empathetic towards them
than what you'd expect
from people coming from DC.
But the bottom line is,
wherever you come down on them,
they have nothing to do
with Al Qaeda in terms
of their organization,
their means, their ends,
their causes-- basically the
only way they're related
is they're both non-state actors
that begin with the letter A.
But that was the belief.
Now these gaps in
understanding, these disconnects
of policy and reality, mean
that we're not only seeing
growing tension--
and we explore this
in particular in meetings
with US and Chinese officials
who would be negotiating on
core questions of cybersecurity.
And yet, as an illustration,
one State Department official
going off to one of these
negotiations actually
asked us what an ISP was.
Which to make that
Cold War parallel,
would be like going off to
negotiate with the Soviets
and not knowing what an ICBM is.
But the point is, it's
not only driving tension,
it's leading to us being
taken advantage of.
And that can happen at
the individual level
when you get tricked into
sending your mom your bank account
information because
she's stuck in Thailand.
You didn't know she
was in Thailand,
but gosh, you just
need to help her out.
To more serious
illustrations of this.
Like at the G-20 conference,
the most important
international
conference of the year,
diplomats were spearphished--
they received an email that
had a wonderful offer for them.
It said, if you
click this link you
will be able to see nude photos
of the French First Lady.
And many of these senior
diplomats clicked the link.
And unfortunately they didn't
get to see the nude photos,
but they did download
spyware onto their accounts.
Again, senior
government officials
to being taken advantage
of at the business
organizational level.
Whether by not doing
enough to protect the business
or by hiring hucksters
who offer 100% security
with some kind of
silver bullet solution.
Or frankly, being
taken advantage
of at the national
political level.
Which is, I think, behind
a number of the issues
surrounding the current
Snowden-NSA scandal.
This can even happen
to a president.
Reportedly, Obama
expressed his, quote,
"frustration that the
complexity of the technology
was overwhelming policymakers."
Now, our inability to have
a proper discussion of these
issues means that we see a
distortion of threats.
And in turn, a misapplication
of resources to face them.
Perhaps the best illustration
of this is a number-- 31,300.
That's the number of news
and academic journal articles
that have explored the
phenomenon of cyberterrorism.
Zero.
That's the number of
people that have actually
been hurt or killed by an actual
incident of cyberterrorism.
In the book, we joke
that in many ways
cyberterrorism is a lot like
Discovery Channel's Shark
Week, where we obsess
about the danger of sharks
even though you're 15,000
times more likely to be
hurt on your toilet.
Except the difference is that
Jaws actually did get someone,
or the real world version
of Jaws did get someone.
Whereas we've not seen
this in reality yet
other than Die Hard 4.
Now let me be clear,
I'm not saying
that terrorists don't
use the internet.
And in the book we
have several chapters
that explore terrorists'
use of it, much of which
is like how the
rest of us use it.
And I'm not saying that there
is not interest in carrying out
acts of cyberterrorism,
nor that there
wouldn't be impactful
effects of them.
Indeed, our
development of Stuxnet,
a cyber weapon that finally
had physical powers, that caused
physical damage in the real world,
is a great illustration of this.
But in turn, Stuxnet illustrates
how an effective cyber attack
that is real and consequential
is also quite difficult.
To put it a different way, when
it comes to cyberterrorism Al
Qaeda would like to, but can't.
China could, but
doesn't want to yet.
Now my point, rather,
is that strategy--
whether it's at the national
level, at the business level,
at the individual
level, strategy
is about choices and priorities.
And so we need to weigh
the centrality of what
we talk about, what we obsess
about in our discussions
versus what are arguably
not only very real, but more
consequential cyber
threats out there.
It ranges from something
that this organization
is very familiar with--
the massive campaign
of intellectual property
theft that by most measures
you could judge to be
the largest theft in all
of human history, that's
ongoing right now.
And where is it coming from?
If this was a
Harry Potter novel,
we would describe it
as a large Asian power
that shall not be named.
To if we want to think
about the national security
consequences-- not just looking
at the consequences of that IP
theft and how it
plays out, but looking
beyond the sexy cyber-Pearl
Harbor descriptions
and actually focus on how the
military uses this technology
and wants to use it.
And what is the future of
computer network operations
in actual campaigns of warfare?
To, maybe most of all,
the ripple effects, the
secondary effects of all
these actions, which we
should be paying attention to.
Because if we use the
illustration of terrorism,
one of the things
we've learned from 9/11
is it's not merely
the attack itself,
but how we react
to it that really
stakes its place in history.
And so I worry about some
of these secondary effects
that are playing
out, and particularly
how they are hammering away
at that crucial value of trust
that has basically underpinned
the internet.
And we can see
that being damaged
by the massive campaigns
of cyber crime out there.
Whether it's the IP theft
or credit card theft, and the like.
And that's affecting both trust
that consumers and users have
with the network, and in
turn what the operators have
towards consumers.
To trust damaged by our
government's actions seeking
to deal with
conventional terrorism.
And what that has done to
both trust in those agencies,
but also trust in America and
trust in American technology
companies.
To finally, what it's done to
the internet freedom agenda.
And the trust in the
underlying governance structure
of the internet that has
worked so effectively
for our lifetime,
created this thing
that's been arguably
the most powerful force
for political,
economic, social change
certainly in my
lifetime, maybe ever.
And yet over the next year
could be seriously damaged
by some international
negotiations that
are playing out,
particularly pushed
by authoritarian states
like Russia and China.
If you like the idea of Russia's
82,000 blacklisted websites,
or if you like the building
internet wall in China,
this may be the future
if we don't watch out.
Particularly as some
of the core swing
states, the Brazils, the
Indias, the Germanys,
may not be with us the
way they were previously.
Now this gap in the
fields also means
when it comes to
the warfare side,
we act on bad assumptions.
Or don't make connections
across domains
in ways that truly matter.
So take the notion of
something from the field
of war applied here,
which is offense, defense,
the balance between these.
There is an idea that's taken
hold that cyber offense is
inherently privileged.
It's inherently dominant
against the defense.
And not just now, but as
one US military report
put it, quote, "For the
foreseeable future."
So for as long as we can see
in the future cyber offense
will be dominant, is the
assumption that's out there.
This in turn has
driven the US military
to spend roughly
four times as much
on cyber offense
research and development
as it has on cyber defense
research and development.
Now the problem with
this is threefold.
The first is that
cyber offense is not
as easy as it's
too often depicted.
So for example, the former
number two in the Pentagon
described how, quote, "A
couple of teenagers sitting
in their parents'
basement, sipping Red Bull
and wearing flip-flops, could
carry out a WMD-style attack."
No.
They couldn't.
They could do a lot of
things, but not what he was
portraying.
And Stuxnet is a great
illustration of that.
In terms of the wide
variety of skill sets
that were involved in this,
everything from intelligence
analysts and collection to some
of the top technical talent
in the world from multiple
nations, to nuclear physicists,
to engineers, to then another
espionage effort to get it back
in.
It was a Manhattan
Project-style effort.
Again, the barriers to entry
are lowering, but it's not just,
oh I need a teenager
and some Red Bull
and I can carry this out.
The second is history
is replete with examples
showing that every time a military
assumed the offense was
inherently dominant, it
turned out to be the opposite.
And we're on the
100 year anniversary
of probably the best
illustration of that.
Where the nations of Europe,
prior to World War I,
all assumed that the new
technologies of the day
meant that the offense
was advantaged.
And in fact, it
was so advantaged
that you couldn't allow yourself
to be stuck on the defensive.
So you had to go to war
before the other guy could.
So that you wouldn't be
caught at a disadvantage.
And as we saw play
out in World War I,
actually it was the defense
that turned out to be dominant.
But the final
issue with this is,
even if it's true
it doesn't actually
mean that we should be
acting the way we are.
To give a metaphor, the idea
of sitting in your glass house
and looking around
and saying, gosh
I'm worried about all these
roving gangs of teens.
Well, my best answer is to
buy a stone sharpening kit.
That's not the logic that
we should be following,
but that's what we're
doing right now.
So what can we do instead?
The last third of the
book is all about these
what can we do questions,
everything from global level
responses to national level
down to corporate to you
and me: how can we
protect ourselves
and the broader internet itself?
I'm not going to
try and summarize
that 100 pages up here.
So I'll just hit on five themes
that cut through all of it.
The first theme is
knowledge matters.
It is vital that we demystify
this realm if we ever
want to get anything
effective done in securing it.
We have to move past
the situation now
where, for example, a White
House official described this
as quote, "only
understood by the nerds."
Or when the President
himself received a briefing
on cyber security questions.
And at the end of the briefing,
he reportedly asked for it
repeated back, quote, "This
time in English."
That's not to beat up on the
residents of the White House.
That would happen in almost
any major company that's
not in this space, not in
Silicon Valley, but also
even small companies,
a cupcake stand.
It would happen at
the White House.
It would happen at my house.
The second theme
leads from this.
It's that people matter.
Cybersecurity is one of those
wicked problem areas that's
rife with complexities
and trade-offs.
And this is in large part not
because of the technical side,
which often gets too much focus,
but rather the people part.
Now it's useful from a
writer's perspective,
because that gives you all
the fun characters and stories
to populate it with.
My favorite being the time
that Pakistan accidentally
kidnapped all the world's
cute cat videos for a day.
But it also means
that if you want
to set up best responses at the
global level, business level,
all the way down to
the individual level,
you need to recognize that
the people behind the machines
are both part of
every single problem.
And have to be part of
every single solution.
This leads to the next theme.
Incentives matter.
If you want to understand why
something is or isn't happening
in cybersecurity, look
to the motivations,
the relative costs,
the organizations
that people are in, the
tensions at play between them.
There is a reason why, for
example, finance companies
are doing better at
their cybersecurity--
both in terms of defending
themselves, but also sharing
information --versus
how, for example,
critical infrastructure and
natural gas or the power grid,
how they're not cooperating and
not defending themselves well.
It's because they're
incentivized: they directly
understand the costs,
but there's also
a regulatory environment around
them that's driving that.
And this points to the
role that government
can, and frankly
should, be playing.
Everything from being
a trusted information
provider to setting standards
to, in other situations,
creating market incentives, which
is another way of
saying regulation.
The fourth is history matters.
There is a history to how we
got here with the internet.
And too often it's ignored.
And that's when you hear
these sort of silly things
like oh, well let's just build
a new, more secure internet,
which is not a workable concept.
And yet it's gotten a lot of
credence in policy circles.
But more broadly, it
means that there's
a wealth of lessons to learn
from history and other fields.
So if we're exploring,
for example,
how to deal with cyber crime,
but also patriotic hacker
communities that are
linked to states,
we look at the age of
sail as a parallel.
Where you have this
domain in which
commerce, communication,
and conflict all
played out on the open sea.
The conflict actors ranged
from state militaries
to individual criminal
groups, pirates,
to these fuzzy things in
the middle, privateers,
that gave you some
of the advantages of pirates
but were also state-linked.
And that's a lesson
that we can look
to in how we went
after that trade.
To if we want to understand
a good role for government, let's
look at the most successful
government agencies in history.
Like the Centers for Disease
Control, which literally
started with a
couple of scientists
taking up a $10 collection
in a tin cup.
And that agency went on to
eradicate malaria inside the
United States, to fight smallpox
on an international level, and, oh,
by the way, to serve as a critical
back channel to the Soviets
in the worst part
of the Cold War.
This leads to the final lesson,
and it comes from the saying
that Ben Franklin had, that
"An ounce of prevention
is worth a pound of cure."
What's fascinating is that
the CDC did studies and proved
that Franklin's saying actually
is true in public health.
It's also true in
cybersecurity and cyberwar.
Very simple steps
of cyber hygiene
would have an immense impact.
Indeed, one study of
the top 20 controls
found that they would stop
94% of all cyber attacks.
Now some people react to
that, and they go well,
I'm really special.
I'm in the 6%.
Well, statistically we
all can't be in the 6%.
But even more so
they should talk
to their technical
folks, their IT crowd.
And they would quickly
learn how if they didn't
have to spend so much time
dealing with the low level
stuff, they could actually
focus on the more advanced
persistent threats
that are out there.
And what's interesting
is the data shows that senior
executives are actually
twice as likely to be
behind one of these problems
as junior folks, which makes it
even more difficult for the IT
department to deal with.
To give some
illustrations of this--
let me add one more thing on it.
The other challenge to this is
that there's this assumption
that the advanced
threats are all
using very advanced pathways in.
And yet consistently,
they're coming
in through rather
simple approaches.
For example, the most
important outside penetration
of US military classified
networks by a foreign espionage
agency happened
when they conducted
what's known as a candy drop.
Basically they
dropped memory sticks
in a parking lot outside
a US military base.
And while we learn in
preschool don't take candy
from strangers, a US soldier
saw the shiny memory stick
in the dirt.
Thought this was
really cool, picked
it up, wanted to
see what was on it.
So he took it inside
the base and plugged it
into his computer.
And that was actually the
most important penetration
of US military networks
from the outside.
To the insider threat, the
episodes of Manning or Snowden.
Again, wherever you
come down on them,
we can all agree that the
organizations were not
following the kind of
internal security norms
that a cupcake
store should have.
Monitoring, for example,
massively anomalous traffic,
things like that.
Now this idea of hygiene
is important-- again,
when I say hygiene,
picking up a memory stick
that you found in the dirt.
That's basic hygiene,
that's the five second rule,
let alone cyber hygiene.
But this idea of hygiene, I
think, is important not just
because of that idea of
prevention, but even more so
the ethic behind it.
That we need--
again, whether we're
talking about this on a global
level, a national level,
a business level, to
an individual level.
I teach my kids hygiene.
Wash your hands, cover
your mouth when you cough.
I teach them that not only
to protect themselves,
but also that they
have a responsibility
to protect all that they
connect with through the course
of their day.
That's the same kind
of ethic that we
need in the online space.
And we should be
pushing more of that
rather than the fears that
are out there driving us.
So to bring this
story full circle,
in the beginning I talked about
how when I was seven years old
I saw my first computer.
Now if you had told
little seven-year-old me
that one day this Commodore
or its descendants
would allow someone
to steal your money,
steal your identity, even become
a weapon of mass disruption,
I would've begged and
pleaded with my dad
not to turn on the power button.
Don't let us go into this
dangerous, scary world.
Today I wouldn't have
it any other way.
Because that technology
has given me and all of us
literally superpowers that
we didn't imagine back then.
We can ask any question and
Google the answer to it.
Any question, important
or not important.
Yesterday I was looking
up the backstory
of a minor noble in
the "Game of Thrones."
That's actually the
important example.
This technology has
given us the power
to become friends with people
that we've literally never met.
All of these great
steps forward.
And so the same as it
was back then, I think,
is the way it will
be in the future.
We have to accept and manage
the risks of this world--
whether it's the online world
or the real world, so to speak,
--because of all that
can be achieved in it.
And to steal the
title from the book,
in the end, that's really
what everyone needs to know.
Thank you.
[APPLAUSE]
MALE SPEAKER: We'll
do a short Q&A. Please
wait for the audience
mic to arrive to you.
And I wanted to mention one more
thing that Peter told me about.
And this is, there's a website.
It's called
cybersecuritybook.com.
And there's a cybersecurity
song playlist there.
I'm curious myself
now what that is.
Questions?
AUDIENCE: It seemed like
one of the big problems
you mentioned was a
problem of leadership.
And the people who
are empowered just
don't have the sophistication
to talk about these issues
and make decisions.
And I just was
wondering what you
thought was the minimum
level of competence
required by these people?
Because realistically, they
seem to be pretty entrenched.
And I don't think it's
realistic to expect
a whole new breed of
people to come in and make
these decisions.
And on that point, as
well, how likely do you
think it is to be able to get
these people to that level
of sophistication, given the
fact that these people don't
know how to use email?
PETER SINGER: It's
a great question.
And one part of it,
sometimes people say well,
isn't this just
a digital native,
digital immigrant issue?
A digital immigrant is
someone who grew up in a world
without computers
and then has
moved into this world, versus
a native who was born into it
and to whom it all seems
natural and intuitive.
And so this problem, won't
it just solve itself,
is how they sometimes
reference it.
First, there's a long period
of time before the immigrants,
so to speak, move out of
the positions of power.
To put it a different
way-- there's
a quote in the book
from a guy that
talks about how the folks that
are sitting in the big boy
chairs, is how he phrased it.
The big boy chairs in
government or as CEOs
of a lot of different companies
or the like, many of them
didn't see or use
their first computer
until they were in
their 30s or 40s.
And point one, they're
going to be in those positions
for a long time.
And so we've got this
gap, this period of time.
We can't wait it out.
The second is a lot
of digital natives
don't have this intuitively
the way it's assumed.
In large part because of how
we've stovepiped these issues.
That's for the IT
folks to handle.
Or the IT folks saying, oh,
well that's for legal to handle,
that's not for us.
And so to your question,
what's the level of expertise?
I don't think there's a common
test that everyone has to pass,
or something like that.
I actually-- and this may be
a little bit controversial.
I don't think it's
even about people
knowing how to do things like
computer programming-- maybe
it's controversial in this room.
It's instead having familiarity
with the key concepts,
the key terms, so
that frankly, they
can have a good
argument about it.
You can see this in what's
playing out with the NSA issues
recently, where
both the mass media,
but also both sides in
Congress that are arguing it,
it's just so factually
disconnected.
And so they're not able to even
have a good argument about it.
To use that illustration
of offense, defense theory.
It's a great way
of showing this.
Where on one hand
the people that
understand the
technical side don't
know that there's actually
a very rich literature
in international relations
of offense, defense,
that doesn't lead you to
one conclusion or the other.
And they were sort of-- they
picked one part of it and said,
this is the conclusion
of what we should take.
In turn the IR crowd doesn't
understand this all that well.
The bigger thing is not a level
of knowledge, it's an attitude.
There's too much Ludditism
out there that's celebrated.
A senior government official
who held responsibility
for this literally
said she doesn't
think it's all that useful.
And she did the same
thing the SecDef did,
where if their
email came in it's
printed out by the assistant.
They write their answer on it.
And then they hand it back.
You can't be effective if
that's the kind of attitude
that you have, both
internally and in thinking
it's OK to talk
to others about it that way.
And so for me it's,
again, there's
some base level of knowledge.
But it's more about changing
the attitudes around it.
And frankly stop looking at
this as just a highly technical
issue for, again, the IT
crowd, or for the nerds.
AUDIENCE: What
principles have we
learned from the behavior of
immune systems and biology,
and from the resilience
of biological networks
all the way from the
metabolic networks
up to ecologies, what principles
have we learned that we are not
yet applying in cybersecurity?
PETER SINGER: That's
actually a great question
to bridge back to
the prior question.
Because that all-important
word that you used,
resilience, is
what I think should
be at the centerpiece
of our approaches
and our discussions
and the like.
And you see this, again,
on the government side.
But also on the business side.
Basically there's this
mentality of offense, defense.
And defense, it's build
higher or thicker walls.
And then the offense
side is, weirdly enough,
coming back into
the private sector
with the emergence of the
potential hack back industry,
of oh, the best way to
protect yourself is not just
to build a high wall but we'll
go after the bad guys for you.
It's basically a business
version of vigilantism.
It raises major concerns for
international relations,
because it could
quickly escalate things
in a way that's unplanned.
It's also a horrible business
model for the client.
Vigilantism only worked
for Charles Bronson.
This idea of the best way
to defend yourself-- I'm
going to go after this guy.
And then oh, you're attacking?
I'll go after this guy,
this guy, this guy.
And so at the end of
the day all you're
doing is paying someone to
go after others for you,
not actually making
yourself secure.
Instead of this
mentality, it goes
to what you asked
about, resilience.
And you can think about this
in the physiological way.
And that turns on everything
from the notion of it being
not a Cold War-- you
know, this idea of we're
in a new Cold War is
literally a quote from folks.
One, malware is not like the
physics of a nuclear weapon.
Second, there's not the bipolar
relationship of two powers.
The players in
cybersecurity are just
like the players in cyberspace.
It's everything from the
100 cyber military units
out there to
non-state collectives
interested in everything from
cute cats to online protest
to corporations that range
from Google to Target
to the cupcake store.
And so it doesn't fit.
The online battle of ideas
is not the ideological Cold
War battle it's framed as.
As for those online battles
of ideas-- go on YouTube
and you can see the
diversity of them.
And so instead it's this
ecosystem of players.
And then it goes to the idea
of the physiological approach
of your own body.
Our bodies are probably the most
resilient thing ever created.
They're designed for a world
that's incredibly hostile.
They expect that bad
things are going to happen.
They have a really
great exterior line
of defense, your skin.
But they fully plan that
that skin, at some point,
will definitely be penetrated.
And it has all sorts of
systems to react to that.
Everything from stemming the
flow, to internal systems
monitoring for infection,
to the way your body triages
between what's
important and what's not.
And guess what--
your body itself
operates on the assumption that
something external is already
inside.
When you look
at cell counts,
there are 10 times
as many bacteria
and the like in your body
as there are human cells.
And again compare that to the
typical oh, just buy my widget,
or if I have a better, stronger
password I'll keep them out.
But there's another
idea of resilience
that I don't think we
pay enough attention to.
And that's psychological
resilience.
There's 3,000 books on
psychological resilience
of some sort.
Resilience in your job,
resilience in your love life,
et cetera.
And it's all built
around the idea
that you can't go
through life thinking
that bad things
will never happen,
or they can all be
deterred or defeated.
Instead, your success is
dependent on your assumption
that bad things will play out.
But it's all about how will
you power through them?
How will you recover
quickly from them?
How will you not allow them
to knock you down in the way
that they could?
All these different
ways-- and again, you
can think about it in your love
life or your job, whatever.
We need that same mentality
when it comes to cyber.
So take cyberterrorism,
the central discussion
of, oh my god, the power
grid might go down.
And in fact, you've seen all
of these false news reports
about times that cyber
attacks caused it.
In one situation, the power
didn't go out at all;
it's a false story
that-- guess what,
"60 Minutes"
unsurprisingly covered.
In another situation, things
are described as cyber
attacks when they're not.
So two dudes with a rifle,
that's not a cyber attack.
But that was recently
covered in the news as one.
The bottom line is that
squirrels have taken down
more power grids than the
zero times that hackers have.
Again, it could play out.
But it's all about how
will we react to it.
Where I live outside
Washington, the power
went down multiple
times this summer.
But if it had been a cyber
attack that caused it,
we would have had a
congressional commission
investigating who to blame.
And we would have had
mass hysteria around it.
And so what I would prefer-- and
I go back to that echo of 9/11
and how you react --is the
British mentality to terrorism,
keep calm and carry on.
Rather than the
American model, which is
we try and out-escalate the
hype and the fear around it.
Because we're seeing more
gains in the fear and hype.
And my worry is that's
carrying over to the cyber side.
AUDIENCE: The cyber crime that
really matters in the Snowden
story is not what Snowden
did, but what he revealed.
Alexander has two mandates,
both offense and defense.
And as we've seen
and as you've said,
the offense has dominated
in his activities.
But as to whether offense
inevitably dominates--
as they seem to think, and as is
premised behind their actions--
they've loaded the dice.
Part of what they've
done is rather than
also act on their
defensive mandate,
they have purposely gone out
and inserted vulnerabilities,
worked with vendors
of security software
to purposely insert
vulnerabilities,
making us more vulnerable.
And as you said,
incentives matter.
Let's take a look-- in
the absence of Snowden,
let's take a look
at the incentives
on secret intelligence
agencies themselves.
What is the bureaucratic reward
for successfully carrying out
an attack?
And what is the
bureaucratic reward
for successfully preventing
attacks that aren't visible
because they couldn't happen?
The second is invisible.
The bureaucratic
reward structure
has no means to reward it.
And in the absence of Snowden,
the first is cost-free.
PETER SINGER: I'm in agreement
with you on a couple of areas.
One, on the notion
of incentives.
And again, you can see
that whether you're
talking about within that
intelligence agency to why
we see on the defense side
certain industries cooperate
or not.
And it all turns on that.
But then there's the
broader point-- essentially
you pulled off the
Snowden bandage.
And so we've got to go at it.
And you began by hitting
one part of his activity.
And I think this is
the challenge right now
in the discussion and
debate around him,
the NSA, is he a traitor, is
he a whistleblower, should he
get clemency or the like,
is that essentially he
gathered material that is now
being released-- actually not by him.
This is one of the
myths that's out there.
It's not him pulling
the strings right now.
The journalists, they're
actually going through it.
And the challenge for them
is because there's so much,
it actually involves,
again, a very different set
of expertise.
So someone who understands
the technology will see
a name pop up and not get it,
while, say, the Latin America
beat reporter will go whoa,
whoa, whoa, that name--
that guy's now the Deputy
Foreign Minister of Brazil.
And in turn, an acronym that's
meaningless to the Latin America
reporter will be obvious to the
person who knows the technology.
And then in turn you need
the spy and the like.
So they're actually having
these teams go through it
and figuring out what's
newsworthy or not.
But the bottom line is there's
such a mass of information,
and the wide variety of
stories that have come out
and will continue to come
out essentially
falls into three very
different buckets of activity
that has been disclosed.
The first bucket of
activity is frankly
what I would describe as
smart strategic espionage
against American enemies.
And you hit sort
of the mentality
that drives some of that.
Now there's an issue
of-- you said they
when you're talking about
the NSA versus Cyber Command.
And they're the top military
intelligence agency.
But the bottom line is
one bucket of activity
was things that we would expect
and want an agency to do.
Going after monitoring
terror rings in Pakistan,
Iranian nuclear research,
China, et cetera.
Bucket number two is what
I would term questionable.
Activities that there
is a debate around
because it involves US citizens
in some way, shape, or form.
Either through legal approaches,
front door to back door;
to running with an
authorization in a way
that the policymaker who
authorized it didn't understand
what was authorized;
to essentially deals
made with foreign
intelligence agencies,
where they were able to
collect things in a way
that we couldn't-- an exchange
of information and the like.
But basically the debate around
involvement of US citizens.
Category three is
the bucket that I
would describe as unstrategic,
or more directly stupid.
And that is targeting
of close American allies
and American
technology companies.
And the resonance of
that is everything
from the hammering I mentioned
in other kinds of
international negotiations that
may matter more, to,
as you mentioned,
the undermining of
cybersecurity for all of us.
Particularly based
on this assumption
that they were the
only ones smart enough
to find the vulnerability,
but then more broadly what
it's done to that
critical word, trust,
trust in American
technology companies.
And the resonance
of that, at least
according to
Forrester Research, is
that your industry will lose
approximately $180 billion
worth of revenue.
That's why people
here are pissed.
The problem, though, is
that in the debate around it
we pull from whichever
bucket we care most about.
So if you care most about
classic national security,
you go, this guy disclosed
things that are important.
He is a traitor,
dada, dada, dada.
If you care about the privacy
Fourth Amendment questions,
you only talk about
those, and he's
a whistleblower, and
clemency, and the like.
We see it also in
how we defend it
from the narrative on
the government side.
So these kind of activities
are to prevent another 9/11.
Which may describe bucket two
and the metadata and the like,
but that doesn't make
the Germans feel better
about why you were going
after Angela Merkel's
messages, or the like.
And so the problem is it's
all of these things at once.
And it's muddied the
water of the discussions.
And we can even see
this most recently
in the President's speech,
which, again, focused primarily
on one of the buckets,
mostly the privacy side.
Because that's what matters
most in the American political
debate, but actually
may not matter
the most in the long
term national security
and economic prosperity
of the nation, which
is weird and scary to say.
AUDIENCE: Just a question on
how Silicon Valley companies can
partner with each other and
with the government to actually
have better government
surveillance policies, right?
Recently, we saw the Reform
Government Surveillance effort,
where seven companies
got together.
And again, it's going back to
the notion of us versus them
where instead of
partnering, it's
now they're pushing
for like reforms,
and wasting lobbying
dollars and stuff while it
could be a better partnership.
So what are your thoughts
on what we could do?
PETER SINGER: There's
steps that can be taken.
But one of the underlying
things is attitude.
And it's funny, I was out
here a little over a year ago.
And there was sort of
an attitude towards,
DC is so dysfunctional.
Nothing could get done there.
You guys are so problematic.
We don't want anything
to do with you.
And we don't need
anything to do with you.
And then now we see the flip
side of that of actually
what you do matters to us.
You're still dysfunctional.
But it matters to us.
And in turn, you saw that
approach from-- again,
this is from the stovepiping--
individuals pursuing
a certain political goal, and
within just a limited circle,
not understanding the ripple
effects of what they were doing
on lots of other
areas including one
of the cornerstones of
American prosperity, which
is our technology industry.
So the problem is first
knocking down that attitude
that neither side
matters to the other
and doesn't need to
understand the other.
Too often, Silicon
Valley-- and even sort
of the reaction when I
said this in the speech
--will offer a seemingly
technological solution
to a problem.
There are far more
engineers out here than in
almost any other specialty.
And so there's
often-- you know, we
can engineer our way out of
it some way, shape, or form.
And we even see that now in
this discussion over privacy
where it was OK, we can't
figure out what to do.
But Attorney General and Director
of National Intelligence,
you've got 60 days to
figure out this solution.
And we see different
sort of things
offered that are sort
of a technical solution.
It's not going to be
a technical solution.
It's going to be an awful,
painful grind of policy
and votes and court
decisions and lobbying
and all these other
things that go
into the nasty sausage
of political process.
But in turn, what I'm
getting at, too often
we fail to look at the human
side of what can be done.
And that would be
another aspect of it.
But the bottom
line is we clearly
have a shared stake in it.
And I hope we can raise
the level of discourse
and raise the level
of cooperation.
AUDIENCE: Hi, Peter.
Great to see the book
finally come out.
What are you optimistic about?
PETER SINGER: I
thought-- I mean, look,
I'm actually hugely
optimistic about-- I mean,
the possibilities of this
technology, what it's allowed
to accomplish, and in turn
the people who misuse it,
and what they're
costing themselves.
And that misuse is
everything from-- there's
a very real danger of the
balkanization of the internet.
On the other hand, the cost
of that to those nations,
it will be staggering.
A flip way of putting
it is, there's
one nation that has
really, really great
cybersecurity
protections-- North Korea.
There's a cost to that.
And we can see this in everything
from the debate around the NSA
to businesses.
I gave all of these anecdotes of
how they're not doing it well,
but now they're
facing cost to it.
The recent examples-- be it
Target or Snapchat or Neiman
Marcus-- show that there's
an ebb and flow.
And people that
mishandle it face costs.
And so to me, that's
where we'll see reactions.
The incentives will drive it.
If there's any
message from the book,
it's that this is
seemingly scary stuff.
And some of it should be scary.
But on the other hand, we
can't have a good discussion
if it's like Spinal Tap and
the volume's always at 11.
Which has been how
we've talked about it.
And so the goal of
the book was basically
to fill this kind of sweet
spot where you either
had this highly
technical discussion that
was exclusionary or you
had the histrionic side.
And instead, I think
this can be a topic--
I think it has to be a topic
that we're all better equipped
to talk about.
And I'm optimistic that
when we do understand this,
we can go much further than
where we're at right now.
MALE SPEAKER: And on
this optimistic note,
please give a hand
to Peter Singer.
