[MUSIC PLAYING]
CORY DOCTOROW:
Thank you very much.
It's a pleasure to be back here.
Thank you for coming.
SPEAKER: Why don't you start
by telling us about the book.
CORY DOCTOROW: Yeah, I
mean, they're, in some ways,
my 2017 therapeutic writing
Trump mishegoss stories.
I didn't plan any of them.
They all just sort
of blurted out.
So "Unauthorized
Bread" is a story
about people in refugee housing,
where all the appliances are
designed to have DRM, to lock
them into vendor ecosystems.
So you can only toast authorized
bread in your toaster,
you can only store authorized
groceries in your fridge,
and so on.
And that is bad enough.
But then the hedge fund
that owns the back end
for all this stuff
financially engineers itself
into bankruptcy.
And so everything stops working.
And so they learn to
jailbreak their appliances,
out of necessity, and then
out of sheer joy of seizing
the means of information.
But then they learn that the
companies are being rebooted
out of bankruptcy, and that very
soon their telemetry is going
to detect that the
devices were jailbroken,
and then they're going to
face DMCA criminal liability.
And because they're
all in refugee housing,
that will mean being
deported and possibly killed.
And so it becomes this
very high-stakes fight,
where this woman who has
created this kind of Youth
Brigade of kids who go round
and jailbreak everyone's
devices now has to convince
these kids who are completely
foursquare against it to
go and restore everything
to factory defaults without
scaring the pants off of them.
They end up working
with friendly techs,
and trying to figure
out if there's
a way they can use VMs to
cheat the telemetry, and so on.
So it's a novel about the class dimension
and the surveillance dimension of vendor lock-in,
and the way that that
plays out depending
on what kind of privilege
you have in the world.
And then the next story
is this story called
"Radicalized," the title story.
And it's about super-entitled
middle class white dudes
who watch their loved ones
die of preventable illnesses
because their insurers
won't cover therapies,
and who find themselves on
these dark net message boards
where they're radicalized
into being suicide bombers who
kill health care executives.
And a lot of it is about
whether or not America will ever
call a white dude a terrorist.
And some of it is about
how just the cause actually
turns out to be, and so on.
And also, this very toxic
observation that many
have made about
the incel movement,
that in normal support message boards--
if you're a recovering alcoholic and you're
in an online community,
all the elder statespeople of that community
are people who beat the thing you're struggling with
and are now staying around to guide people.
But people who recover from being incels
don't hang out in incel message boards.
So all the elder statespeople of the incel
communities are the most toxic, most broken people.
And they're the ones
saying, yeah, get in a van,
and drive it
through the streets,
and kill as many people as
you can, they deserve it.
And so these communities
don't get better over time.
They get worse.
And that's the kind of
community that I'm exploring.
The third story's
called "Model Minority,"
and it's about a thinly-veiled
analog to Superman
intervening in a beating
by the same cops who
killed Eric Garner on Staten
Island, who then discovers
that the fact that he's viewed as both white and human
is super-contingent,
and that there are some things that America won't tolerate
from the people who are honorarily white and human.
And it takes the
form, in large part,
of Socratic dialogues
with Bruce Wayne, who's
a military contractor
who provides
predictive policing
software to the NYPD,
and who's in fact basically
responsible for all of this.
And then the last
story is a story
called "The Masque of the Red Death,"
named after the Edgar Allan Poe story.
And it's about
preppers who build
a luxury bunker just outside
of Phoenix and repair to it
as soon as the catastrophe
hits, and fancy themselves
living out a kind of boy's
own adventure of the elites
surviving while all
the useless takers die
in Phoenix because they didn't
have the wisdom to prepare
these bolt holes, but who in
fact end up dying of cholera
because the useful
people are the people who
stay behind in Phoenix and got
the sanitation working again.
And these ubermenschen
have to discover
that you can't shoot germs.
It was originally going to be
called "You Can't Shoot Germs."
But I think "The Masque of the Red Death"
has got the appropriate gravitas.
So it's these four novellas.
They're all linked.
They wrap around a lot
of the same themes.
They touch on a lot of the
same geographic locations.
Some of the details recur.
So some of the
OPSEC that Superman
uses to avoid having his secret
identity outed by the NSA--
he's got a randomizer
that tells him
where to take off from so
that you can't draw a map
to see where Superman is
always sighted by ground radar.
So he runs very quickly
to somewhere else,
and then takes off.
And this prepper has also got a randomizer
that makes him get in his disguised, armored-up, like,
F-150 with a camper bed that is full of all of his prepper gear,
and put some fishing rods in the front,
and just drive it out of his gated community
at the randomizer's intervals, so that there isn't someone who
goes, hey, that's weird, why is that guy driving his F-150 out
of this gate-guarded community now?
He must be going to his prepper hidey-hole.
We'll follow him there, and kill him, and take his stuff.
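The shared randomizer gag can be sketched as a tiny bit of hypothetical Python (the function and figures are invented for illustration): drawing departure gaps from an exponential distribution makes them memoryless, so an observer can never infer a schedule from past trips.

```python
import random

def decoy_departures(mean_gap_hours, horizon_hours, seed=None):
    """Departure times with exponentially distributed gaps.

    A fixed schedule (every Saturday at 09:00) leaks a pattern that
    an observer can correlate with trips to the bolt hole; memoryless
    gaps carry no such pattern.
    """
    rng = random.Random(seed)
    t, departures = 0.0, []
    while True:
        t += rng.expovariate(1.0 / mean_gap_hours)  # memoryless gap
        if t > horizon_hours:
            return departures
        departures.append(round(t, 1))

# e.g. decoy drives averaging one every three days, over a month
print(decoy_departures(mean_gap_hours=72, horizon_hours=24 * 30, seed=1))
```

The same trick serves both characters: the takeoff-point randomizer and the decoy drive are both ways of decorrelating observable behavior from the thing being protected.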
So a lot of the same things kind of repeat and repeat,
and the themes wrap around each other.
We're not using the word
"collection" for it.
It's just a fiction book.
And it's been likened by my publisher to the book
that Stephen King wrote, "Different Seasons"--
"The Body," the story that became "Stand by Me," comes out of it--
that's just four thematically-connected but not
continuity-connected tales.
SPEAKER: Had any of the stories
been published previously?
CORY DOCTOROW: No, this
is all original stuff.
And as I say, it wasn't planned.
"Unauthorized
Bread" originally--
I sent it to my editor because
it was such a weird and awkward
length that I didn't think--
I didn't know what to do
with it when it was done.
It's 30,000 words long.
And I thought that he was going to say, well,
we don't know what to do with it.
And instead, he sent
me an email, like,
the next day saying, oh my God,
this is amazing and timely,
we want to publish it in two
months as a standalone book.
And I said, well, that's
great, let's do that.
And we want to pay
you more than what
you got for your first three
novels for it, which was also
great.
And then I sent him the next one, "Model Minority,"
and he was like, this is also awesome,
we'll do it the next month as a standalone.
I'm like, I've got two more in the works.
So that's how we came to wait about seven months
and publish them all in one go.
Topic, the TV and movie studio associated
with the same company that owns "The Intercept,"
has bought the TV rights for this,
and is developing it now.
SPEAKER: And you
mentioned timeliness.
And reading
"Unauthorized Bread,"
it seemed very much of the now.
And the book also struck
me as being more political,
which is kind of a
funny thing to say.
Your work has always
been political
in one way or another--
freedom of access to
information, doing it yourself.
But this one seems a little
more pointedly political.
CORY DOCTOROW: Two
things have happened.
One is that politics have
just become more salient.
We are just in a moment
where you can't not
talk about politics,
for better or for worse.
The other thing is that
the political dimension
of information has now
become much more obvious.
I don't know if any of you
work in like UX or UI design.
There's this thing that happens
with UI design and UX, where
when you start, you're trying
to convey a bunch of ideas
that are really novel to the
people who you're conveying
them to.
If you think about it,
early GUIs, just the idea
that there is a file,
and then that a file is
a thing that you save, right?
And that you have reversion,
and you have Control-Z,
and all of those things
are kind of novel ideas.
Undo is not a
thing that you just
know about if you've
never used a computer.
Undo is a weird idea.
And over time, we get
better at conveying
what those ideas mean.
But also, the urgency
of conveying them
is diminished, because people
are meeting you halfway.
Now, just everybody,
people who don't
know what a floppy disk is know
that a floppy disk means Save.
And just that icon, it
doesn't need a tool tip
anymore the way it used to.
And in the same way,
writing political fiction
about information
doesn't require
nearly so much handholding.
You can just jump right in.
Because in an age of Black Lives
Matter, and mimetic warfare,
and questions about
whether or not
our power grid is vulnerable
to cyber attack and so on,
talking about
information as being
this important dimension
to our politics
is not a radical idea anymore.
And that means that you
can dig into more nuance.
The people can be centered
more because the information
dimension is just obvious.
SPEAKER: That's interesting.
It also reminds me that it struck me, while reading the story,
that it seems slightly more educational
than your past fiction has been.
There are several sections--
it's not overly
heavy exposition,
but you go into pretty fine detail on digital copyright
and, kind of as an aside, how to jailbreak DRM.
It seemed like you were
educating through the story
as well.
CORY DOCTOROW: So I think
there's two dimensions to that.
One is like, I think people
are tolerant of exposition when
it's news you can use, right?
Like, oh, that's
how that works, now
a whole bunch of stuff in my
life suddenly makes sense.
Like, here's what a
VM is, or whatever.
The other thing,
though, is that there
is a longevity that comes
from writing fiction where
the technology is rigorous.
We've had a certain amount
of technological breakthrough
in all of our careers and lives.
But there are some fundamentals
that remain pretty fundamental.
Like, you know, complexity and
our understanding of complexity
theory and things that
scale, and order n squared,
and things that scale
in polynomial time,
and so on, those are things
that are just semi-immutable.
And so talking about the
kind of problems we can solve
and the kind of
problems we can't solve,
if you make the
plot turn on them,
the plot doesn't get
old in the same way
that it does if the plot turns
on how much RAM your phone has.
That's a moving target.
But like Turing completeness,
not a moving target.
Writing about the
fact that if you
try to design a
computer that can
run all the programs
except one, you
are trying to do something
that runs counter to the--
really the only functional
widespread architecture we have
for computers, which is the
one that runs all the programs.
And so it will always be this
weird, mashed-up Rube Goldberg
under the hood if you've
got a computer that
runs everything except one.
And it's always going to
have certain recurring
characteristics.
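One way to see why "all the programs except one" is unstable, sketched as hypothetical Python (the names are invented for illustration): a blocklist can only match the form of the forbidden program, not its behavior, and deciding behavior in general is undecidable, so trivial re-encodings slip through.

```python
BANNED_SOURCE = "print('forbidden')"

def run_unless_banned(src: str) -> None:
    # A naive "computer that runs everything except one program":
    # it can only compare the *text* of the program, not its meaning.
    if src == BANNED_SOURCE:
        raise PermissionError("that program is not allowed")
    exec(src)  # runs everything else

# The same behavior, trivially re-expressed, sails through the check.
run_unless_banned("print('forbid' + 'den')")  # prints: forbidden
```

Deciding whether an arbitrary program exhibits a given behavior is equivalent to the halting problem, which is why these filters always end up as lopsided arms races bolted under the hood of a general-purpose machine.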
If you design a
computer that has
a mode that sits
below super-user mode,
like ring minus
one, where programs
execute that even the
administrator of the computer
isn't supposed to be able
to inspect or terminate,
then that will always be a zone where, if an attacker
gets access to it, they'll be able to operate
with total impunity and undetectably against you,
because it's a computer designed not to introspect
about some of the processes that are running.
And so this just
keeps coming up.
We keep designing
computers that have,
as part of their
security model, oh, there
are programs that the user--
even if the user is
the administrator
and owns the device--
can't terminate or inspect.
And inevitably, someone
who's a bad actor
starts running code in it.
Maybe the Politburo orders Apple
to run processes that seek
out and terminate VPNs
that it can't spy
on for iOS devices
in China, which is a thing
that happened, right?
You know, this is the world's
most predictable outcome
of putting that gun on
the mantelpiece in Act 1.
But we're going to
continue to put that gun
on the mantelpiece in Act 1.
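The "ring minus one" flaw can be caricatured in a few lines of hypothetical Python (a toy model, not real hardware): once the process table itself is allowed to hide entries below the administrator's ring, whatever occupies that ring is invisible by construction.

```python
from dataclasses import dataclass

@dataclass
class Proc:
    name: str
    ring: int  # lower ring = more privileged; -1 sits below the admin

class Machine:
    def __init__(self):
        self._procs = []

    def spawn(self, name, ring=3):
        self._procs.append(Proc(name, ring))

    def ps(self, viewer_ring=0):
        # The design flaw: even ring 0 (the administrator/owner)
        # cannot see anything running below its own ring.
        return [p.name for p in self._procs if p.ring >= viewer_ring]

m = Machine()
m.spawn("browser", ring=3)
m.spawn("hidden_agent", ring=-1)  # vendor code, or an attacker who got in
print(m.ps(viewer_ring=0))        # ['browser'] -- the owner can't see it
```

The point of the sketch is that the invisibility isn't a bug an attacker exploits; it is the advertised security model, working exactly as designed for whoever is running down there.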
And so if you write
science fiction that
turns on the actual
characteristics of computers,
like our theoretical understanding of computers,
that science fiction remains
futuristic for so long
as we continue to make dumb
policy choices about computers.
Which to my great
dismay, probably
means that the fiction will
remain current in some way
forever.
"Little Brother" is still
being taught and read--
I wrote it 13 years ago now.
And the question
of whether or not
subjecting whole
populations to surveillance,
or using machine learning to
make inferences about guilt,
or any of these other things
creates a bunch of pathologies,
those facts are still
totally in evidence.
They show no sign of going away.
We have totally failed to
learn the lessons of them.
The stakes only get higher.
And every year, there's
another news hook
that makes people
read "Little Brother"
and then come back to
me and say, oh, this
is just like that thing.
How did you anticipate, 13 years ago,
that we'd be doing this?
And I'm like, I didn't.
We were doing it 13 years ago.
We just are literally
still beating our heads
against that wall.
We are figuratively
still beating our heads
against that wall.
We will never stop.
And so "Little Brother" will
remain current and relevant
for so long as we are stupid
about technology policy.
SPEAKER: You mentioned
the word, root.
What was the initial
idea for the story?
Did it start with the
phrase, unauthorized bread,
or did it come from
somewhere else?
CORY DOCTOROW: Yeah,
it actually started
with a short story I wrote as
part of my Guardian column.
So I'd been arguing
for a long time
with people about iOS and
the App Store business model.
And there was this "it just
works and I trust them"
element to people saying, I want
to work within an ecosystem,
and I trust Apple, and I don't
care if they have made choices
about what I can and
can't use, because I think
those choices are good ones.
They're good proxies
for my interest.
And I was trying to tease out
the difference between having
a checkbox in your OS that says,
I would like to do something
that the manufacturer
hasn't approved,
and having legal liability
attached to figuring out
how to reconfigure
your OS to do something
that hasn't been approved.
And what kind of
bad things crop up
if it's actually a
felony to modify your OS
to do things that the
manufacturer doesn't like.
And so I wrote a
little story called
"If Dishwashers Were iPhones."
And it was in the
form of an open letter
from a Steve Jobsian CEO of a
next-generation IoT dishwasher
company called Disher, who make
an appearance in the story.
And he was explaining
that food-borne illness
has killed more people than any
other killer in human history.
And how can you
expect your dishwasher
to be truly effective
at keeping you safe
and making your dishes clean
with the least amount of water
in an age of scarce
resources if you
are able to put any
old dish that you
want in your dishwasher?
And that is why you
shouldn't be bending
the prongs of your dishwasher
or trying to take the RFIDs out
of the dishes that you
bought from the Disher store
and putting them in
grandma's China, and so on.
Because you could buy
a different dishwasher
if you wanted that.
And the deal that you got when you bought this dishwasher
was that you would only wash dishes from The Kitchen Store.
And The Kitchen Store works
only with licensed partners
who make high-quality gear.
And anyone who wants can buy
a $99 Kitchen Store license.
And provided they comply
with the terms of service,
they can make pottery that you
can put in your dishwasher.
But the fact that it says dishwasher-safe
doesn't mean it's Disher-safe.
It's only when they're
in our developer program
that we can tell
you that you truly
won't die of listeria
if you use their dishes.
And these are all,
taken individually,
not unreasonable statements.
But they gang up to make everything
into an inkjet printer.
Everything is tied into some
ecosystem where you can't--
it's not that you do trust them,
it's that you must trust them.
And if they ever make a
decision that you don't like,
you are out of luck.
And the longer you go inside the ecosystem,
the more sunk cost you have, the more you have to give away,
and the higher the switching cost is if you decide
that you no longer trust them.
And so you end up being
beholden to a series
of commercial decisions being
made in boardrooms that you
have no insight into.
And you just have to trust
that no one in the firm
will ever do anything that
runs counter to your interests.
And I'm not an
anti-Apple person.
I actually have a sad
Mac tattooed on my arm
from when I used
to be a CIO and I
used to order a million dollars
worth of Apple gear a year.
But as someone who's bought
a fair number of Apple lemons
and written POs for a fair
number of Apple lemons
over the years, and seen how
the company can be wildly
imperfect, and will be
wildly imperfect again,
I think the idea that you are trusting them
in a way that you can't transfer your trust out of
actually invites future leaders of the firm
to make choices in the knowledge that they can betray your trust
and you're not going to be able to leave,
that they've got a lot of headroom in terms of betrayed trust
before you get to the point where people are going to give up
a whole ecosystem of devices and replace all of it.
And for some customers, maybe never.
And I saw this not being
looked on with horror
by the rest of the world,
but being looked on as kind
of an excellent idea.
So you have, for example,
Johnson & Johnson
getting approval for
an artificial pancreas.
This is a closed loop: a continuous glucose monitor
and an insulin pump with some machine learning
to try and time the insulin dosing
to keep your blood sugar within a safe range.
And it uses proprietary
insulin cartridges.
And making a tool to
refill those cartridges
is a potential DMCA violation
with a $500,000 fine
and a five-year prison sentence.
And revealing a defect in it, if that defect
would help someone bypass the DRM,
is also a potential felony under the DMCA.
And one of the things we know
is that security researchers
are chilled from coming
forward with reports of defects
when there's a potential DMCA
overlap or a Computer Fraud
and Abuse Act overlap.
And so now you have this
spreading attack surface
of devices that,
because they're designed
to be extractive of
their users, are also
off limits to
independent scrutiny.
And they're more and more intimately connected
to our bodies.
And so your car is a robot that you put your body into,
which then goes down the road at 60 miles an hour.
Or if it's like the 5, that goes
down it at five miles an hour.
And I'm not talking
about a self-driving car.
I'm just talking about a car.
Because you take all the
informatics out of a car and it
becomes inert.
The most salient feature of
that car is its informatics.
Compromising those informatics
allows the compromiser
to wreak great havoc on the
person whose body is trapped
inside this fast-moving robot
and on the people around it.
And so I wanted to illustrate
this idea that as our property
interest in the things that
we own is being eroded,
and as our ability to independently
scrutinize them is being eroded, we are also
magnifying all of the imbalances
and inequalities in our world,
and that it happens
first to the poorest
and least powerful among us,
but it spreads to everybody.
The user adoption curve
for controlling technology
is like refugees, prisoners,
children, poor people,
blue collar workers,
white collar workers.
That's the adoption curve.
And if you want to know
what your life is going
to look like in 20
years, just look
at what we're doing
to refugees today,
and that's the technology that
people will expect you to use.
In our lifetimes, people didn't have CCTVs
recording them non-consensually,
operated by third parties,
unless they were imprisoned.
Now it's ubiquitous.
Illustrating that and using
this specific group of people
to illustrate it was an
intervention in that, a way
to try and interrupt that.
SPEAKER: Well,
let's dig a little
deeper into that
idea of inequality.
It's a major part of this story.
The main character, Salima,
is an immigrant, a refugee.
The people around her are largely poor and underclass.
What do you think that brings to the work?
CORY DOCTOROW: One of the most
salient facts of our moment
is inequality, in part because
it breaks apart the story
that we've told about markets,
which is that markets are
a dynamic way of finding people
whose ideas would create more
general prosperity, and
allocating capital to them
so that they can do it.
And when you see widening
inequality and stagnation
in our social relations,
you're left with one of two conclusions, right?
Either markets aren't working
the way they're supposed to,
or eugenics is
real, and there is
a 1% of people who are so
smart that they should just
own everything.
And if we would just
give them all the stuff,
they will allocate
capital so wisely
that we will get richer
and richer and richer.
And you know the
history of antitrust
is full of counterexamples, right?
Like when we broke up the phone
company into six companies,
they got bigger in aggregate
than they were as one firm.
When they broke up the railroad monopoly
into two companies, each
one within a few years
was as big as the
parent company had been.
The diseconomies of scale are actually
pretty well understood,
but as Upton Sinclair said,
it's impossible to get someone
to understand something
when their paycheck depends
on them not understanding it.
And if you are
someone who benefits
from the concentration of
wealth in a single firm,
the fact that we as a society would benefit
more if that firm were broken up into smaller pieces
and each of them was allowed to compete and grow
is not very relevant to you, because it
means that you get a lot fewer
ivory-handled back scratchers.
And so I think that one of the
corrosive effects of inequality
is that it drives rich
people into believing
that they have good blood,
into believing in eugenics.
You actually heard this in
the last election cycle.
Trump repeatedly talked
about his good blood.
For people who remember the
horrors of the Holocaust
or Tuskegee or the
eugenics movement,
good blood is a scary thing
to hear someone talk about.
It's like the smoke from a horrible fire that is smoldering,
and I think that now we're blowing on those coals.
And so by writing
about people who
would be really efficient
capital allocators, who
are doing really interesting
dynamic, exciting things that
benefit other people, but who
are denied access to capital,
one of the things
that you do is you
start to change the
narrative we have about
whether capital
allocation is efficient,
and whether inequality
means that we are actually
just taking random
lottery winners
and heaping enormous
riches on them
to the detriment of their
soul and to the detriment
of our society.
SPEAKER: Your Twitter
handle currently
is "son of an asylum seeker,
father of an immigrant."
CORY DOCTOROW: Yeah.
SPEAKER: Do you feel a personal
connection to that experience?
CORY DOCTOROW: I do.
You know, I lived in England
until three years ago.
So I was in the UK in
the run up to Brexit.
And I would end up
in a lot of cabs,
and London cab drivers
are notoriously right-wing
and notoriously gabby.
And given that the impending
crisis was about migration
and this debate was going
on about migration, at least
on a weekly basis
someone would start
talking to me about asylum
seekers and migrants.
My dad was born when
his parents were
living in a displaced
persons camp in Azerbaijan.
They were Red Army deserters.
My grandmother had
been a child soldier
in the siege of Leningrad.
They came to Canada
as asylum seekers,
and then I'm an immigrant.
I'm a Canadian who then
lived in the United Kingdom,
and now I live here.
And my daughter is an immigrant.
And I found that
it gave me a place,
a way to enter that
conversation and say
I'm the person
you're talking about.
That's me, one generation.
And they'd say, oh, but
you're the right kind.
You're high skilled.
You're whatever.
You're white.
And I would say,
so my grandparents
were not high skilled.
My grandmother stopped her
formal schooling at 12.
My grandfather
stopped his at 14.
They came to Canada
without marketable skills.
They learned skills
when they got there.
And they had to display
an awful lot of pluck
and get up and go to get
from Azerbaijan to Canada.
The winnowing function
they went through
was not passing a
standardized test
or displaying a credential.
It was crossing Europe.
That is an awful
lot of gumption.
By the time you get
to the port of Hamburg
and you present yourself to a
Canadian immigration official,
you have already demonstrated
an enormous amount of pluck.
And they were
traumatized, and they
had lots of problems
in their lives,
but they were also people
who came to contribute
and whose families contributed
through the generations.
And so as the debate about migration and asylum seekers
and immigrants took off here, I felt my passing privilege.
I have an accent that sounds
like it could be from America.
I have a skin tone that makes
me look like I could be white,
even though the
people I come from
were not thought of as white
when they got to Canada.
They certainly are white now.
And so you could think
that you were talking
to someone who wasn't
an asylum seeker, who
wasn't the father of a migrant.
As they say, Canadians
are like serial killers.
They're everywhere, and they
look just like everybody else.
I wanted to wear it on my
sleeve to make people confront
that they were among people
who were the kind of people
they were demonizing.
SPEAKER: In the
beginning of the talk,
you mentioned how the stories in
"Radicalized" aren't connected.
CORY DOCTOROW: Yeah.
SPEAKER: But you said
that the role of place
mattered in the stories.
This story takes place
in suburban Boston.
Why?
Why Boston?
CORY DOCTOROW: Although
it touches Phoenix, right?
Phoenix is in all the stories.
So Boston in part because
I knew it reasonably well.
I'm an MIT Media Lab
research affiliate.
You and I hung out in Boston.
I stayed in your living
room in Boston once.
And Boston's a really
interesting town
in that it's like
it's a crossroads
for a bunch of different kinds
of industry and activity.
So it's obviously
an academic hub.
It's a tech hub.
It's now a biotech hub.
But it's also a
light industry town.
It's a port.
It has all of these
different contradictions.
And it's similar to LA.
It's not like the Bay Area, which,
although it has a bunch of industries in its history,
has been completely eclipsed now.
There's just one thing that it's known for now.
Whereas Boston still is
this very diverse place.
And Arizona I'm
really interested in,
because on the one
hand, it's a place that
is not going to survive
climate change gracefully.
It's like it's really in the
crosshairs of climate change.
But it's a red state, so it's also a state
where climate change is officially denied.
But it's also a state
that is majority minority
and but for a little bit
of gerrymandering here
and a little bit of
voter suppression there,
it would be a place that would
have a very different politics.
And it's also a retirement hub.
So it's a place that's full
of people who aren't Arizonans
making claims about
their native rights,
their indigenous
rights as Arizonans,
even though they're
all transplants.
And so the contradictions
of Arizona are so vivid.
And I'm an advisor to a center at ASU
that uses science fiction to talk about other disciplines.
And so I go to
Phoenix periodically,
and also it's my hub, because I fly out of Burbank Airport.
I live in Burbank.
And so everything starts in Phoenix
and then goes somewhere else.
So I really was
interested in the role
that Arizona plays in the
future of our politics.
SPEAKER: And it has an interesting leveling effect,
if we talk about redefining who we presume to be
a refugee or an immigrant.
Both of those places
are also places
that, despite their diversity,
we don't necessarily think
of as an immigration hot zone.
CORY DOCTOROW: Yeah.
SPEAKER: I think we'll
see more places like that.
I mean, Greenpoint, Brooklyn is still a predominantly
Polish entry point into the country.
There's a growing Hmong population in Wisconsin.
What do you think that means for the sense of place
and the diversity of those communities?
CORY DOCTOROW: So this is a
really interesting question.
We have a story about America, about something
between assimilation and ladder-kicking.
Once there were Poles and Italians
who were not thought of as white or American.
And then they assimilated, and
then they became respectable,
and then they kicked
the ladder away,
and they turned on the
people who came after them.
And there's a certain
amount of truth to that.
Some of that,
though, is associated
with economic dynamism.
So one of the things
that made people
go from being other to being
accepted as fully paid up
Americans was an extraordinary
degree of social mobility
at various times in
America's history.
And Thomas Piketty in his book
"Capital in the 21st Century"
talks about America's
dynamism, and he chalks it up
to these reset events
that America had in terms
of its wealth distribution.
So before manumission, the vast majority
of American wealth was in people who
were claimed as slaves by the people who had enslaved them.
The gross national
wealth of America
was primarily in human bodies.
And so manumission, in addition to doing
a lot of other things politically,
was this huge economic moment,
in that the greatest concentrations of wealth in America
ceased to be considered assets
and became human beings, at least as a legal fiction,
notwithstanding Jim Crow and whatever.
But one of the
effects of that is
that the grip that
wealthy people had
on political outcomes completely
changed after the Civil War,
because they just didn't
have as much money to spend.
Their wealth was
radically diminished.
And then it happened
again during the crash.
And so the war years and also
the crash after the Gilded Age
completely leveled out
the wealth distribution.
So the amount of wealth controlled by the top decile
of Americans just nosedives twice
in American industrial history,
and it only happens
once in Europe.
It only happens
with the war years,
with the two wars and
the interwar years.
And when the capital is
more widely distributed,
you have more
pluralistic choices,
more pluralistic decisions,
which creates things
like better social safety net,
better quality of schools,
cheaper and wider access
to education, and so on.
And this lifts up
lots of people.
It helps people enter a kind
of middle class respectability.
And as we know from
the crisis of 2008,
the first people
who get jettisoned
when the economy starts to sink
are the last people who got in.
And of course there were still historic problems
with this, in terms of redlining that denied
African-Americans access to that shared prosperity.
But nevertheless, you got people
who became officially American
instead of others, instead
of perpetual immigrants.
And so there's this story that
goes, if you come to America
and you just hang out,
your kids or their kids
will be Americans.
They'll cease
thinking of themselves
as Polish Americans
or Italian Americans.
But really what we're saying
is, they'll become middle class.
And right now, it's
looking like that's
ended, that you don't become
middle class if you show up
in America with nothing but
the shoe leather on your feet.
You just stay outside
of wealth accumulation,
outside of social mobility,
because all of the policy
levers that were used to allow
people who had aptitude to gain
access to education,
and capital,
and to start businesses,
and so on, those
have been snuffed
out one after another
as the concentration of
wealth at the top has grown,
and as they have more
policy levers to yank on
to increase the
concentration of their wealth
at the expense of everyone else.
And if you think
about it, the law
of large numbers, the
law of small numbers,
making the wealth of someone
who owns almost everything grow
by even 1% is hard.
Making the wealth of someone
who owns almost nothing double
is easy.
That's why startups
have incredible growth.
We went from one user to two.
That is a very
impressive growth number
expressed as a percentage.
It will take a lot
longer for Google
to double its users
than the startup that
goes from one to two.
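The percentage arithmetic behind this point is easy to sketch. This is an editorial illustration, not from the talk; the function name and figures are made up:

```python
def pct_growth(old: float, new: float) -> float:
    """Growth expressed as a percentage of the starting value."""
    return (new - old) / old * 100

# A startup going from 1 user to 2 posts 100% growth...
startup = pct_growth(1, 2)  # 100.0
# ...but an incumbent with a billion users needs a billion
# NEW users to post the same percentage.
incumbent = pct_growth(1_000_000_000, 2_000_000_000)  # 100.0
```

The same percentage point of growth costs the incumbent a billion times the absolute gain, which is the asymmetry the talk is describing.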
And so if you're making the
rich, the super rich richer,
even a little bit richer,
measurably richer,
it has to come at
the expense of nearly
the total net worth of
a huge number of people.
That's why that Oxfam number,
the busful of billionaires
it would take to represent
half the world's
wealth, is so shocking,
because it doesn't just
mean that they got richer.
It means that a giant number of
people had to get much poorer.
And so anytime
you see the wealth
of the top decile or
the top percentile
or the top 1/10 of a percentile
increasing measurably,
you know that it's not
coming from growth.
I mean, there's some
growth, but it's not
coming primarily from growth.
It's coming through
redistribution.
And that redistribution
is upwards,
and it's vastly asymmetrical.
Everything you own is going
to make a tiny difference
to someone who already
owns nearly everything.
SPEAKER: We've long
thought that technology
will help wear away the
differences between class,
between race.
And in the story, the poor
have access to technology,
but they don't have a
choice of the technology
that they can access.
And there's a moment
in the piece where I laughed,
when, asked what kind of toaster
he had, the startup employee
said, well,
not that kind of toaster.
And so the better off could
choose what tools they use.
CORY DOCTOROW: Sure.
SPEAKER: Could you
talk a little bit
about that element of choice?
CORY DOCTOROW: Yeah,
I mean, I think
that there is a common pathology
among technologists, especially
those who work on restrictive
technologies, which
is that we never imagine
ourselves using them.
We imagine them being
deployed on others.
We will always have root, right?
We will always-- we
were just talking
before we came in
about this story
today about YouTubers being
extorted by fraudsters who
register fraudulent copyright
strikes against their
YouTube accounts and then say,
give me $150 or I'll put
a third strike on you,
because everybody knows
you can't get anyone
at YouTube on the phone
when you've had a copyright
strike against you, because
literally anyone who
gets a copyright
strike against them
thinks that it's
illegitimate, right?
So YouTube doesn't have
any great way to triage it
and you just get stuck in
these support email loops.
And there isn't a
support option
for "I'm being extorted
by a petty grifter who
wants a millionth of a
bitcoin to get my copyright
strikes removed."
And since that option doesn't
exist in the email loop,
you can't get help.
But if you are me and
you know googlers,
that would never
happen to me, right?
I would just call
a googler friend.
Fred von Lohmann, who
runs copyright for Google
with a couple of other people,
used to work with me at EFF.
I could just call him, and
he would sort it out for me.
So when we're in
these situations, when
we design these
systems, we always
know that they're
not going to bind us.
I worked on this DRM standard
for digital television
called the Broadcast Flag,
trying to fight it.
And it's a crazy
idea, and I don't
want to go into the whole
thing, but part of it
was that all general
purpose computers would
have to only have outputs that
were approved by movie studios.
But there was a professional
tools exemption.
And so there were people from
all the big tech companies,
and people from all the big
movie studios, and people
from all the big
consumer electronics
companies in the room.
And as soon as they
said, "there'll
be a professional
tools exemption,"
everyone's like "yeah,
of course there'll
be a professional
tools exemption."
You think I'm gonna
use this stuff?
No way!
Right?
I'm gonna have the version
of this that isn't terrible.
The terrible version
is for other people.
And so trying to
tease that out: this
idea that when we
make technology
that we would never
want to use, we're
dooming lots of other
people to using it.
A recurring version
of this is every time
people say, "well, you
know all those people who
work in Silicon Valley don't
let their kids own phones
or watch YouTube or whatever.
Or they all have ad
blockers installed."
It's all the other things
that we know about
technologists, and
I'm guilty of it too,
that civilians don't have
access to or don't get to do.
SPEAKER: It also
reminds me of what
we were talking about
before the talk in terms
of just how beautiful it is
when something just works.
CORY DOCTOROW: Yeah.
SPEAKER: The book also
addresses a kind of functionalism.
There's a couple of
neat lines in the story.
"This was a new kind of toaster,
a toaster that took orders
rather than giving them."
And then also "if someone wants
to control you with a computer,
they have to put the computer
where you are and they are not.
So you can access the
computer without supervision."
Can you talk a little
bit about functionality,
and how we might be moving away
from that or closer to that?
CORY DOCTOROW: So the nexus
of control is the key thing.
The difference between
utopia and dystopia
is who's got their
finger on the button.
Having a Fitbit that tells you
how many steps you've walked,
leaving aside the problems
with the sensor and the fact
that it might think you've
walked 1,500 steps before you
even get out of bed, but
having a Fitbit that
helps you track your own
fitness levels, maybe
you could have a pathological
relationship with it,
but there's nothing wrong
with you knowing more
about yourself.
Having your boss
put a Fitbit on you
and say, "if you don't
get your 10,000 steps in,
we're gonna take you out of
the company health plan"?
It's the same technology.
It's just a different
nexus of control.
And if there's a
mechanism in the Fitbit
that detects spurious
sensor events in order
to stop the Fitbit from telling
you that you've walked 10,000
steps when you've really
only just gotten out of bed,
that's great.
If that same sensor is
used to stop you putting
your Fitbit in a sock and
sticking it in the tumble dryer
to fool your boss into thinking
that you've done your 10,000
steps, that's terrible.
So the best example
of this: Doug Rushkoff
just wrote this column about
going to some hedge fund
conference.
And one of the
panels was on what
to do when the collapse comes
and the poors come to eat you.
They were talking
about this problem
that they would have when
currency collapsed, which
is how do you stop your
guards from killing you
if they don't need you
to pay their paychecks.
They said, well, we've
got it sorted out.
We're going to have these
two-factor biometric secret food
lockers.
And so they'll have to keep me
alive to unlock the food locker
so they don't starve to death.
And that's going to be the thing
that keeps my guards in line.
And I know people, and I
have been on many diets.
And sometimes those
diets involved
not eating certain things.
And if I put a lock on my
fridge that I could control
that didn't open until
it was lunchtime,
it's literally the same technology;
the only difference
is who's pushing the button.
The difference between
utopia and dystopia
is who gets to decide.
It's where the choice is.
And so these people are
moving from a situation
where the technology that
they're in is a stricture
to one where it becomes an enabler.
And I think that there's a
story that, like going back
to the dishwashers and
the Steve Jobs figure,
there's a story
that we tell about
technological inevitability.
We have to make it impossible
and illegal for you
to reconfigure your
devices, or they won't work,
or they won't keep you safe.
And we talk about
it as though it's
like someone came down off
a mountain with two stone
tablets that said that.
When we talk about online
surveillance, ad tech,
and predictive markets,
and machine learning,
I am old enough to
remember when we rotated
our server logs
instead of mining them
for market intelligence.
There's no reason.
Nobody will take
away your CS degree
if you start rotating your logs
again instead of saving them.
That is a choice someone
made, but we disguise it
as a technological
inevitability.
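And the rotate-instead-of-retain choice is still roughly one stanza of configuration away. As an editorial illustration (filename and limits are invented, not from the talk), here is a minimal sketch using Python's standard logging module, which caps the log and silently discards older generations instead of archiving them for mining:

```python
import glob
import logging
import logging.handlers

# "Rotate instead of retain": cap the log at a fixed size, keep a few
# old generations, and delete anything older -- nothing accumulates
# to be mined for market intelligence later.
handler = logging.handlers.RotatingFileHandler(
    "access.log",        # illustrative filename
    maxBytes=1_000_000,  # start a new file after ~1 MB
    backupCount=3,       # keep access.log.1 .. .3, discard the rest
)
logger = logging.getLogger("webserver")
logger.setLevel(logging.INFO)
logger.addHandler(handler)

logger.info("request handled")  # logged, rotated, eventually discarded
```

The same effect is traditionally achieved with a `logrotate` stanza; the point is only that retention is a setting someone chooses, not a law of nature.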
And so by showing that
when the nexus of controls
changes, the nature of your
relationship to the technology
changes, and the thing
that you didn't like
about the technology
turns out to be a thing
you didn't like about the
nexus of control, that is,
I think, again,
a powerful lesson
to encourage people
to understand STEM
and to become masters
of their technology.
We talk about STEM education
as though it's just
part of the normal
grift of, if you
don't get your kids
to do this and give me
money to teach your
kids to do this,
your kids will be
economic roadkill.
And there's certainly a
certain element of that,
but again, I think the origins
of the STEM Education Movement
are about digital self-defense,
not about creating
an industrial workforce.
Program or be programmed.
Learn how to use the technology,
or the technology will use you.
And that empowering
message of technology
is one that I really
firmly believe in.
I don't want to get
rid of computers.
I just want to change
how we use them.
SPEAKER: I want to talk a
little bit about the audio book.
CORY DOCTOROW: Yeah.
SPEAKER: So this novella
came out yesterday.
The other stories are
coming out March 19.
And the voice actor
who did this one's
pretty interesting just herself.
And I want to ask you about her.
Lameece Issaq is a Palestinian
American actor and playwright
and founded a production
company dedicated
to Middle Eastern theater arts.
How did you connect
with her and get
her involved in the project?
CORY DOCTOROW: I'm lucky enough
that I've been really involved
with the production of my
last several audio books.
We record them at
a studio near me--
they're in Hollywood--
called Sky Boat Media.
So they consult me on
casting choices and so on.
And my hard and
fast rule with them
was that I wanted a voice
actor who was of Arab origin.
And I wanted a woman,
because the narrator
is a woman of Arab origin.
And I have a friend,
Lexi Alexander,
who's a very interesting person.
She's a director now, but she
was a world champion kickboxer.
She's a German
Palestinian woman who
was brought here by Chuck
Norris to help train the army.
She became a stunt woman,
then became a director.
She directed "Punisher: War Zone."
She directed a couple
of "Supergirl" episodes.
She does all kinds of stuff.
And one of the things that
she's always on about on Twitter
is all of the talented
Arab-American women
actors and other
actors of color who
could be doing the roles that,
for reasons that completely
baffle me, are not
going to the people
they would be suited for.
And so I asked her
for some names.
None of them could do it.
But then I went to the
directors, and I said,
we really need to cast an
Arab-American woman for this,
or a woman of Arabic origin.
And we got the demo reels
for four or five of them,
and Lameece was the best
one, and she was available,
and the rest happened from there.
I was in the studio every day
when we were recording.
She did an amazing job.
SPEAKER: It's a wonderful listen
as well as a wonderful read.
CORY DOCTOROW: Yeah.
SPEAKER: Questions
from the audience.
CORY DOCTOROW: If we can
alternate between people
who identify as
women and people who
identify as men or non-binary.
Anyone can go at any time,
but that way it's
not just dumb dudes.
AUDIENCE: So a lot of
science fiction writers,
I'm thinking of like Ray
Bradbury and Isaac Asimov
and so on, write
a lot of stories
that come to imagine a
shared world of the future.
And they build up this kind of
consistent world of the future.
Ray Bradbury did that
with "Martian Chronicles,"
and Asimov with the
"Foundation" and so on.
I'm just curious if you see
yourself stumbling toward that,
or you might be doing that on
purpose, or that kind of thing.
CORY DOCTOROW: I like
the way Bradbury did
it more than the way Asimov
did, because they're not
all in the same continuity.
There are things that
happen on Mars in one story
that, if they had happened, the
next story wouldn't make sense.
AUDIENCE: Right.
CORY DOCTOROW: But where it
makes sense to overlap them
he does.
And another writer
who does that really
well is John Varley, who
has this whole long cycle
of stories that he's been
writing since the '70s.
They're amazing.
I think they're called
the Eight Worlds stories.
Earth becomes uninhabitable.
We're living everywhere
except Earth.
And when continuity makes
sense, he's got continuity.
And when it doesn't, it doesn't.
And I love that because these
aren't alternate history.
These aren't history.
They're not instructions.
They're not predictions.
They're artistic works
whose effect is--
yeah, some continuity is
part of the effect of it,
but the themes of
the characters,
and the kinds of
characters they are,
and the kinds of
things they get up
to are more important than
making sure that if someone
does something in
one story and then
they appear as a big
character in another story
that it still crosses over.
I think of the way that
reboots of the shared
worlds of the comic
book companies
have worked, where when it
makes sense, it makes sense,
and when it doesn't, it
doesn't, and they can have
multiple parallel timelines.
Does anyone think
Batman would be
better if we were
all in continuity
since the first
Detective Comics?
The reinvention is
actually super important
and part of the way that
we tell stories anyways.
If you look at, say, all
the different versions
of the Icelandic stories
or the Norse stories,
there are multiple
irreconcilable versions
of what the Norse did.
There's two versions of
Genesis in the Bible.
The first two
chapters of Genesis
tell completely opposite,
non-reconcilable, mutually
exclusive stories of
where the earth came from.
[LAUGHTER]
AUDIENCE: So you see
yourself as interested
in a thematically
consistent future
but not necessarily a
logically consistent future.
You might say it that way.
CORY DOCTOROW: And I'm
a pantser, not a plotter.
I'm not someone who does a
lot of card tricks in the dark
to try and figure out
how the plot's going to work.
I figure it out as I go, so
it doesn't really lend itself
to that kind of continuity.
Who's next?
AUDIENCE: What you were saying
about digital self-defense,
I was wondering, do you see
that as actually being doable?
Because I'm pretty tech savvy,
and at home I run Linux,
and I have my hard drives
encrypted with locks and so on.
But there's no way
I could possibly
run down all the attack
vectors, even on my own machine.
CORY DOCTOROW: Sure.
Yeah, and I think that there
is a story we tell about what
the early internet liberation
movement, the techno-politics
movement, was, that I think
is wrong, but it's got a
little kernel of rightness.
The wrong version is, we used to
think that technology couldn't
be possibly used to do harm,
and so we wanted everyone
to use technology,
and we didn't care
how it was used because we
were sure that it would just
make everyone's life better.
That's clearly not true.
You don't found the Electronic
Frontier Foundation
because you think everything's
going to be great.
You found it because
you're worried
about how terrible it will go
if it doesn't turn out great.
You see on the one
hand, the promise,
and on the other
hand, the peril.
But the cypherpunk
movement particularly
did have this idea
that you could,
through encrypted
communications,
create a parallel universe
where even if you lived
in a totalitarian or
unaccountable system
where people could
operate with impunity
and do terrible things to you,
because the ciphers worked
and the keys couldn't
be brute-forced,
you could in some way
resist the state indefinitely.
You could have a demi-monde
that existed alongside of it.
And I think that not every
cypherpunk felt that way,
but a lot of them did.
And I think that that's wrong.
But I think that what we've
learned about encryption
and its relationship to
unaccountable authority
and illegitimate authority
is that encryption
is a stopgap.
The technology is
a stopgap that you
can use to organize
and to resist coercion
while you figure out
how to make the power
structures that you're
organizing about
and that you're worried about
coercion from more accountable.
But technology
alone can't do it.
But if you try to
imagine the inverse,
imagine creating a
political movement
that holds power to
account and demands
accountable and legitimate
exercise of authority,
but that doesn't use computers.
You find each other
by, I don't know,
stapling photocopied
posters to telephone poles.
And imagine how easily you would
be outmaneuvered by the force
that you're trying to resist.
It seems obvious to me that
the power relations dynamic
that we're in now
is using technology
to open a space to make a
political change that gives us
the space to make technology
that opens a space
to make political change.
Lather, rinse, repeat.
And you don't
always win, but the point is not
just to create this demi-monde.
And the good news is that
people's direct experience
of the way technology can
be a force for liberation
and the way that technology when
abused can be a force of terror
means that people
are actually caring
about the problems a lot more.
You can make an analogy
here to climate change.
If you were the world's
greatest recycler,
it wouldn't stop climate change.
There is nothing
you can personally
do that changes climate
change, that will change
the facts of climate change.
But you and everyone
else you know
and everyone they know
all working together
can do something
about climate change,
including collectively
recycling and making
a lot of other
choices, including
choices about how
we invest and maybe
a Green New Deal and so on.
And for a long time,
our biggest problem
was convincing people that
climate change was a problem.
And we see today that even
among the people who've
been historic climate
deniers, that's not really
a problem anymore.
We are past the point
of peak indifference.
But the reason we're past the
point of peak indifference
is that the number of people
for whom the reality of climate
change is undeniable because
they or someone they love
has had their lives harmed or
destroyed by climate change
is only growing.
Which means that we've
moved past the problem
of convincing people that
climate change is a problem.
Now, we have to convince
them that it's not too
late to do something about it.
And I think we've gone
through that same trajectory
with technology,
convincing people
that how we regulate
technology matters,
that's taken care of itself.
The largest petition
in internet history
is the one to overrule
the electoral college
and make Hillary president.
Trailing it by 1% is the
petition in the European
Union to not pass the
copyright directive that
would mandate content ID
for all public platforms.
And that's crazy.
The number of people
who pay attention
to American presidential
politics versus people
who care about
whether or not we're
going to have upload filters
is now nearly at parity.
So now we just have
to convince them
that it's not too late
to do something about it.
And doing
something about it means
making the power
accountable and so on.
Not that we should ever be
storing passwords in the clear
because everyone's accountable,
but that we can store passwords
in an encrypted
form and also assume
that the secret police aren't
going to show up and make
you put a backdoor in to
decrypt those passwords
so they can man-in-the-middle
or otherwise disrupt
the communications
of your users.
And that's the
cycle that we're in.
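The "not in the clear" practice he gestures at is, in its standard form, a salted and deliberately slow hash rather than reversible encryption, so even a stolen database doesn't yield the passwords. A minimal editorial sketch using only Python's standard library (function names and iteration count are illustrative, not from the talk):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor

def hash_password(password, salt=None):
    """Return (salt, digest); store both, never the password itself."""
    salt = os.urandom(16) if salt is None else salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

The random per-user salt means two users with the same password get different digests, and the iteration count makes brute-forcing a leaked database expensive.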
AUDIENCE: Thank you.
AUDIENCE: So
obviously this country
has corporations organized
based on a profit motive.
Corporations have
to be accountable
to their shareholders and
increase profits, et cetera.
We've been lucky and unlucky
with the results of that
in the grand scheme of things.
Obviously, from my
perspective, and I'm
sure a lot of people here,
Google's on the good side
of that, hopefully.
We'll see, I guess.
CORY DOCTOROW: A lot
of the time it is.
I think you're right.
AUDIENCE: But there's
nothing stopping
the way things are going from
going badly as well, even
for this company.
CORY DOCTOROW: Sure.
AUDIENCE: Do you
think there's a way
to realign the incentives
at a high level
to make that less likely, to
make the good outcomes more
likely, let's say?
CORY DOCTOROW: I want to talk
about that European Union
copyright directive,
because it's
a good example of firms that
are generally good firms,
or firms that have
done a lot of good,
that are doing bad and doing it
in a weirdly dysfunctional way.
So one of the things that's
weird about the copyright
directive is that
the record labels are
super in favor of it, and
the movie studios are super
opposed to it, but they're
the same companies.
Literally, Universal
Music wants it,
and Universal Pictures doesn't.
And they're both
sending open letters
to the commission of
the European Union,
demanding that they be heeded.
So this is actually a
pretty common problem
in the theory and
history of antitrust,
that beyond a
certain scale, there
are massive
disefficiencies of scale.
And you've probably encountered
them working in a large firm.
There's a thing that's good for
the firm as far as you can see,
but it gores the ox of someone
who's got a lot of power
within the firm.
I had a personal
relationship with Flickr
when it was founded.
I was an alpha tester for
the game that it came out
of, "Game Neverending."
And I was carrying on a
long distance relationship
with a woman who's now my
wife, who lived in London.
I lived in San Francisco.
We were both alpha testers,
and Stewart Butterfield
and Caterina Fake, who created
the game came to San Francisco,
and we had lunch.
And they said, how's it going.
And I said, it's
great, but we have
trouble sharing our images.
And they said, oh, we've
got that coming in the game.
We'll just accelerate it
in the product roadmap.
And three months later,
they shut the game down
and relaunched the company as
Flickr, built around that one
photo-sharing feature.
So I feel really
close to Flickr,
and I watched really
carefully what
happened when Yahoo bought it.
And Flickr was the first mobile
social photo sharing app.
And so it had the power
to be a really big money
spinner for Yahoo.
But it gored the ox of Yahoo's
nascent mobile division,
Yahoo's nascent social division,
Yahoo's nascent photo division,
and so on.
And the great
beasts of Yahoo who
had the ear of the
senior management,
who had assembled power
structures around them,
were able to head off
and starve Flickr,
so that now it limps along.
It's just been
bought by SmugMug.
It remains to be seen
what its future is,
but it represents
this failed promise.
And this is a really common
disefficiency of scale.
These firms get bought
up by other firms,
who then just poison them or
do terrible things with them.
As I say, I think
Google has done
a lot of good in
this world, but I
think that effectively
burying the Deja News
archive of early
Usenet, it's something
between a crime and a shame.
That's a really important
piece of history,
and it's like a
disefficiency of scale.
Someone just felt
like it just wasn't
important to the core business.
"Well, then why did you buy it?"
and so on.
And eventually it was just
starved off and vanished down
the memory hole.
And so one of the ways
that you make firms better
is by making the amount of
harm that they can do less.
Because then the bad things
they do aren't as important.
And one of the
ways that you make
the harm that they
can do less is
by mandating that
they be smaller.
And historically,
any time we had
an industry that was
dominated by a few firms,
or any time a firm tried
to buy a competitor,
or any time a firm tried to buy
people in its vertical supply
chain, to dominate its
vertical supply chain,
we looked askance at that.
We subjected it to
very close scrutiny,
and we often blocked it.
Sometimes we overblocked it.
There's an argument that one
of the ways that we got here
was that there was a
constituency for this story
that antitrust had been
overused because it had been,
and they were willing to
hear arguments for why
it should be dialed down.
But I think that--
so I have a friend who
went to work for Facebook,
and then he quit a year
later and came to talk to me
and said, "you know,
you were right.
I didn't like it very much.
And one of the things that
I realized when I got there
that maybe makes me sleep
better is that no one there
is any smarter than I am.
They're just as dumb as I am.
They're not superheroes."
And I'm like, but
of course they are.
They're just like you and me.
And the reason that Facebook
is a bad custodian of 2 billion
people's social lives is because
nobody is a good custodian of 2
billion people's social lives!
How do you minimize
the harm that Facebook
has in the toxic ways that it
enables people's social lives?
Don't make it in charge of 2
billion people's social lives.
Break it up.
Make it into different pieces.
Make it sell off Instagram.
Make it split out the
two functions it has.
One is helping you find
people to talk with,
and the other one is
helping you talk with them.
Make it split those
into two pieces,
because it really sucks at
letting you talk to people.
And it's really good at helping
you find people to talk to,
whether those are
people who want
to carry tiki torches to
Charlottesville or people
who want to form a Little
League with you or whatever.
It's really good at finding
those people, people who have
the same rare disease as you.
It's really good at that.
It just sucks as a place to
carry on the conversation
afterwards, mostly because
there's not a lot of engagement
to be had if you
have a rare disease.
You check in everyday.
Things are OK.
Things aren't OK.
There isn't
blockbuster news that
keeps you hanging out on
the rare disease message
board all day.
So to up your engagement
level, Facebook's disefficiency
of scale is, they
take people who've
gathered to talk
about a rare disease,
and they throw
click bait at them
so that they stay
engaged, because their KPI
and their bonus is on
engagement minutes,
not whether or not
you're successfully
managing your rare disease.
Make them split those
two functions up,
and you solve the problem.
Now you have a business that
just gets monotonically better
at helping you find
people to talk with,
and a business
that rises or sinks
on its ability to get you
to talk with them there.
And then you limit
a lot of the harm.
AUDIENCE: But as a bit of a
counterexample, so at Google,
for example, the
primary income is ads.
CORY DOCTOROW: Sure.
AUDIENCE: Was historically,
now it's changing.
But that has resulted in
things like Gmail, and Docs,
and Drive essentially
operating at a loss,
and being subsidized
by the successful parts
of the business.
CORY DOCTOROW: Yeah, but Google
is not a break-even operation,
right?
You probably have some stock
options as an employee.
So you've noticed
that the company
pays dividends, and also
declares a profit every year.
And so that tells you that
even if the ad business were
curtailed by a breakup, it
would still not necessarily mean
that the company was unable
to run those other loss
leaders, right?
It just might mean that the
shareholders took a haircut.
And there are lots of formal
definitions of corruption,
but one of the formal
definitions of corruption
is when you have privatized
gains and socialized losses.
It's cheap for me to pollute.
It's expensive for you
to get the pollution out
of your tap water.
But that expense is
diffused across everyone
who's putting filters
on their tap water,
and the gains are
concentrated in my hands.
Well, if the costs of
surveillance, which are real,
are widely diffused and
the gains are concentrated,
it may be that making
the firms internalize
some of those costs dials
down some of the other things
that they can do.
But it also reduces this
drag that the rest of us
are feeling.
So maybe with the surplus that
we gain from being lifted out
of the costs of surveillance,
or market domination,
or all the other things
that come as a result of it,
that that surplus can
be allocated to make up
for the losses that we get.
It's totally true
that iPhone locked
ecosystems allow us to
get some benefits that
would be eroded if we unlocked
that ecosystem by force
majeure.
But I'm willing to
make that trade.
It's totally true
that our printers
are cheaper because our
inkjet cartridges are designed
to charge us more than Veuve
Clicquot prices for
water and pigment, right?
And I'm willing to roll
the dice on that one
and find out what
happens if it turns out
that we no longer
charge for carbon toner
as though it were
plutonium toner.
AUDIENCE: Do you have
an eyeball on your sock?
I was trying to figure
out what that is.
CORY DOCTOROW: Oh, no,
it's "Clockwork Orange."
AUDIENCE: Oh, OK.
CORY DOCTOROW: Yeah, yeah.
We're at like peak
bookstore sock.
So I don't know if you've
been into a bookstore,
but like that's where
all the margins are now.
You talk about ad subsidy,
the entire literary world
is being subsidized by
the fact that we can now
programmatically map bitmaps onto
socks using weaving machines
in China.
And I went on a couple of book
tours in the last two years.
I'm about to go on another one.
I spend a lot of time
in indie bookstores,
and I have all the socks.
AUDIENCE: Are there
Doctorow socks?
CORY DOCTOROW: No,
someone should make those.
And then I can sue them
for my right of publicity.
[LAUGHTER]
AUDIENCE: Thank you.
CORY DOCTOROW: Thank you.
SPEAKER: And if you
like socks, there's
a great sock store not far from
here near Angel City Books.
CORY DOCTOROW: Oh, yeah, that's
in the Venice Beach Sock District.
SPEAKER: It's in
the old Elks lodge.
CORY DOCTOROW: Yeah.
SPEAKER: All right.
Thanks again, Cory.
CORY DOCTOROW: Thanks, guys.
[APPLAUSE]
