DANA HAN-KLEIN: Hi, everyone.
Welcome to Talks at Google.
We're very excited today
to welcome Alex Garland.
[APPLAUSE]
He is the director
of the-- depending
on when you see this-- upcoming
film "Ex Machina," which
was awesome.
We screened it yesterday.
And we are very excited
to have you here,
because technology and AI
are something that we're
kind of interested in.
ALEX GARLAND: Yes.
DANA HAN-KLEIN:
Just a little bit.
I should probably have said
"Welcome to Blue Book,"
but a lot of people here
haven't seen it yet.
So let's talk a little bit
about first, your background.
You started mostly as a
writer and then producer,
and director of some things.
ALEX GARLAND: Yeah.
No, I'm a writer.
DANA HAN-KLEIN: You're a writer.
So what was it about
this subject matter
in this film that
made you go, this
is the one I want to direct?
ALEX GARLAND: Oh.
Nothing.
DANA HAN-KLEIN: All right, then.
ALEX GARLAND: It was that I
keep having this conversation.
And I realize why, I guess.
And it's because we dignify
the role of a director,
and it seems like
a really big deal.
I've been working in film for
about 15 years, a lot of it
with the same group of people.
To me, this is just
part of a continuum.
It's just another film with the
same group of people, largely.
And there was no really
significant difference
between this, and
the one before,
and the one before that.
And it's because the
question stems from the fact
that we overstate the
role of the director.
That's what I would say.
DANA HAN-KLEIN: But at
the same time, I think,
depending on the relationship
you have with someone
who's directing
your own material,
you have a lot of
trust in them to--
ALEX GARLAND: No.
DANA HAN-KLEIN: --be
true to whatever
your original intention was.
And that applies to the editor
and the actors, as well.
But by being both the
writer and director,
you do have an extra
level of control.
ALEX GARLAND: But now, you've
shifted the presupposition
onto the other
directors, and assumed I
gave them that level of
control, if you see what I mean.
And really, what it is,
the key thing you said
is that the editor, the DOP,
the actors, film is broadly
a collaborative process.
Having come from books, which
are not a collaborative process,
for the most part,
I can state that film
is, I think it would
be fair to say.
And all the people in that film
have a real and significant
role.
And it's not just paying
lip service to it.
They really, really do.
That's why production
companies fight like crazy
over who the DOP is or the
production designer is.
If it was just observing
the vision of the director,
what difference would it make?
DANA HAN-KLEIN: Fair enough.
OK.
Well, then, what were
some of your influences
for this particular story,
both cinematically and, maybe,
visually?
The history of artificial
intelligence in cinema
has definitely been
something that's
been portrayed many
different ways.
ALEX GARLAND: Yes.
DANA HAN-KLEIN: So what
were some of the things
that you drew from?
ALEX GARLAND: Well,
although it may not
look like it from the
stories and zombie movies
and stuff I've
worked on previously,
I try to draw as much from
real-life observation as
possible.
I had got involved with
a long-running argument--
a good-natured argument,
but still an argument--
with a neuroscientist
friend of mine
who comes from that very
respectable position that
exists within the theory
of mind about humans,
but also AI research, which
basically says machines
are never going to be conscious.
There's something particular
about human consciousness
that hasn't been understood yet.
When we do understand
it, we will
see why machines are precluded
from ever being sentient.
And I think on an instinctive
level, I disagreed with him.
And then, we argued over years.
And mainly, this film
comes out of that argument.
I finally came
across a book written
by the professor of
cognitive robotics
at Imperial, which is
like our version of MIT.
And it was about
the relationship
between consciousness
and embodiment.
And he has a really beautiful,
elegant argument in it
which combats, I think
quite effectively,
some of the arguments
I would hear.
And while I was
reading that book--
and I have to stress, very
hard for me to read that book.
I could only read
sections of it.
It's like mountain
climbing for me.
But while I was reading that
book, the idea for this film
arrived.
DANA HAN-KLEIN: So
you definitely delve
into some of the headier
issues of what is humanity?
What is personality?
What is a sentient creature?
And so what were some of
the challenges of getting
this heady topic material,
subject matter-- and it
has a lot of scientific
implications--
to translate to a
cinematic story,
as well as something that
is also a human story,
and not getting too bogged
down in the science,
but still conveying--
it was a challenge.
You picked a big challenge
for your subject matter.
ALEX GARLAND: Yeah.
It was.
Yeah.
I guess there was a
challenge involved.
I was just trying
to be fair to the
subject matter,
I guess, and tried to
be respectful about it,
and to really
try to understand it
within my own limitations
as best as I could
before writing it.
Partly, because I
think it's interesting.
And it's an area
where-- though this
is true of lots
of areas of science
at the moment--
there's a kind of increasing
vacuum between the people who
are actually doing this stuff
and the rest of us who are
trying to understand it.
So some of it's about trying to
bridge that vacuum as best as
possible.
But it's really
tough to do that.
The thing about it is
that any look at strong AI
also becomes a look at
human consciousness.
The two are related.
The problems of one are
the problems of the other,
to an extent.
Only to an extent.
I don't personally think that
when a strong AI arrives,
it will necessarily be
like us in many ways.
It may be completely different
from us in almost all ways.
But the issues and
the conversations
in this current state of not
understanding are very related.
DANA HAN-KLEIN: Let's talk
about the more human factors.
You had a really wonderful cast.
I thought that Alicia?
ALEX GARLAND: Alicia, yes.
DANA HAN-KLEIN: She brought
just this really great quality
of inhuman but human to
the character of Ava.
And I'm sure that's exactly what
you wanted her to bring to it.
Were there things
that you worked on
to get her to telegraph
that human but not?
ALEX GARLAND: She
arrived with it.
I'd seen her in other movies.
And we had various
conversations and stuff.
But when she arrived,
she had this idea.
She's a ballerina by training.
So she has a terrific
control over her physicality,
as well as being a
very, very good actor.
DANA HAN-KLEIN: Yes.
She was very graceful.
ALEX GARLAND: She's
very graceful.
DANA HAN-KLEIN: Almost
unsettling in a character way.
ALEX GARLAND: Well, exactly.
So do you know the VFX?
You know the VFX, because
you used to work in VFX.
DANA HAN-KLEIN: Yes.
ALEX GARLAND: Uncanny valley.
DANA HAN-KLEIN: Yes.
ALEX GARLAND: So what she did
was a kind of uncanny valley
version of human motion.
She said, I'm not going
to act like a robot.
What I'm going to
do is human actions,
but I'm going to
do them perfectly.
And that perfection will
create a sense of otherness.
Because when I'm sat on
this chair-- actually,
you're sort of graceful.
I'm not.
And I slouch.
And if I get up, I
have to shove myself
up and combat my middle age
and all that kind of stuff.
So whereas she does it with
this amazing dancer's poise.
And it's hard to
look at it and say
anything she does is sort
of, quote unquote, wrong.
But together, it creates
this uncanny valley,
slightly unnatural quality.
DANA HAN-KLEIN: Yes.
So you brought up
the uncanny valley.
I was almost afraid,
seeing the trailers.
All the footage was beautiful.
I was like, OK, I don't think
they're going to hit it.
And you didn't fall into
that, which was amazing.
But it's a challenge,
especially considering
the anatomy of
the main character
for the majority of the film.
We see her in this
robotic hybrid synthetic,
with a human face.
And it's a little unsettling.
At times, she's just
so perfect-looking.
But you managed to still
capture a human-ish element,
so it didn't tip into the too
cartoonish or too humanoid.
Yeah.
Congratulations on that.
ALEX GARLAND: Thanks, sort of.
The funny thing
about people is
that our urge is to project
humanlike qualities onto almost
everything.
It's almost an effort
to stop it happening.
There is this incredibly
beautiful VFX,
and there's beautiful
production design,
and it's beautifully shot,
and it's beautifully lit,
and all those kinds of things.
But that thing of making you
believe this strange machine
has the qualities
that we have, that's
actually really quite easy.
That is to say, you can
test it with a child.
It's probably quite
hard to find children
who don't attribute sentience
to their cuddly toys,
right, in some ways?
And that's where,
I guess, it begins.
Well, in fact, it
probably begins
because it's semi coded
into us in some respects.
I was driving not that
long ago with an adult who
was feeling pissed off with
their car and critical of it,
but didn't want to say
so in front of the car,
in case it hurt the
car's feelings, right?
Now, that's a grown-up.
And cars don't look much
like us, more or less.
So yeah.
I know you're being
kind about the film.
So I'm not trying
to be combative.
DANA HAN-KLEIN: It's your film.
ALEX GARLAND: It's a collective.
DANA HAN-KLEIN:
I think there are
a lot of times when we see these
representations of humanity
but not humanity, and you
really want to project onto it.
And what you had to do is
combat that a little bit.
A lot of the plot of it
hinges on us accepting
this artificial element to it.
ALEX GARLAND: True, true.
So initially what you do is, in
a very up-front way, you say,
this is a machine.
And there's no ambiguity.
It's a machine.
And there's all sorts of things
to force you to think that way.
There's cavities where
there shouldn't be cavities.
And you can see
through her in a way
that removes her being like
us, a girl in a suit, say.
And then, the task of
the film is to gradually
have that machine
quality fall away,
even though she looks the same.
And to show just by behavior,
we will forget the thing
that's right in front of us.
DANA HAN-KLEIN: So the VFX team
behind this did an amazing job.
It was fantastic.
And a shout-out to them.
Can you talk about working
with them, especially having,
basically, one of your
protagonist characters
be mostly CG for a good
portion of the movie?
ALEX GARLAND: Yeah.
Well, it's a company called
DNeg, or Double Negative,
which is based in
London, where I live and work.
And our team was run by a
guy called Andrew Whitehurst.
And in the course of my
life, every now and then,
I've met some really,
really smart people.
And I think he may be the
smartest guy I've ever met.
He may or may not.
But he's certainly going
to be in the top tier.
The thing about him
that's so interesting
is he's got a real gift for
poetry, in a way, the beauty,
and things like that.
So I remember one
day he said, look,
I've got this great
idea, which is
to hang plastic strips inside
her form which will diffuse
the light in a
particular kind of way,
and make her slightly
more mysterious,
even though you can
see the machinery.
And so that's a
lovely poetic idea.
And then another day, we ran
into this massive quality
control issue.
It's kind of
technical, and I will
fail to explain it properly.
But basically, we were
getting, because of the way
the images on the Sony cameras
we'd use were being processed,
or something-- I don't
know-- we were getting
these weird stray pixels.
And in big flat areas of
color, suddenly there'd
be these stray pixels.
And we had a problem,
because we were
going to fail a quality control
check for the distributor
when we handed the film over.
And he said, don't
worry, I'll fix that.
He wrote some piece of
code over, like, a week.
And it reprocessed
all of the imagery.
And it fixed it and
made it go away.
So I'm thinking, I don't
know how to talk to this guy
any more.
He's just got a
range of abilities.
It's just unreal.
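Garland doesn't describe what Whitehurst's code actually did, so any reconstruction is guesswork. But purely as an illustration, isolated stray pixels in flat areas of color are commonly suppressed with a selective median filter: replace a pixel with the median of its 3x3 neighborhood only when it deviates sharply from that median, and leave everything else untouched. A minimal sketch in Python (the `despeckle` function, the threshold value, and the NumPy-based approach are all assumptions, not the film's actual pipeline):

```python
import numpy as np

def despeckle(img, threshold=40):
    """Replace isolated outlier pixels with their 3x3 neighborhood median.

    img: 2D uint8 array (one channel). A pixel that differs from its
    local median by more than `threshold` is treated as a stray pixel.
    """
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    # Stack the nine shifted views so axis 0 holds each pixel's 3x3 neighborhood.
    stack = np.stack(
        [padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)],
        axis=0,
    )
    med = np.median(stack, axis=0)
    # Only touch pixels that deviate sharply from their local median.
    mask = np.abs(img.astype(np.int32) - med.astype(np.int32)) > threshold
    out = img.copy()
    out[mask] = med[mask].astype(img.dtype)
    return out

# A flat gray frame with one stray bright pixel.
frame = np.full((8, 8), 100, dtype=np.uint8)
frame[4, 4] = 255
clean = despeckle(frame)
```

Because the filter only rewrites pixels that fail the threshold test, flat regions and genuine detail pass through unchanged, which is roughly why a batch reprocessing pass like this can run over a whole film's frames without visibly altering them.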
DANA HAN-KLEIN:
That's impressive.
ALEX GARLAND: Yes, it is.
DANA HAN-KLEIN: What
sort of mixture of it
was working with that team,
and working with the actors,
in terms of integrating
this character?
Because if it had gone
wrong, we would've
been lost as an audience.
You would have been
snapped out of it.
And it would have, probably,
hit that weird uncanny valley--
ALEX GARLAND: In the wrong way.
DANA HAN-KLEIN:
In the wrong way.
Yes.
ALEX GARLAND: Yeah.
My approach to filmmaking-- and
my approach very much played
out in this film--
is about-- this
is going to sound like
corporate speak, all right?
But it's about these
different departments
having a lot of communication
between each other,
and also a lot of autonomy.
And that would
include the actors.
And it would include the
VFX team, and the composers,
and so on.
So it's not separate groups.
It's roughly like
what you'd imagine
anarchy would be like ideally,
where you have autonomous
groups, not chucking bricks, but
agreeing about a common goal.
I think that's the key, is
agreeing about a common goal.
And then, having a
lot of independence
and autonomy within it.
And any department
you could mention,
that would've been the approach.
DANA HAN-KLEIN: Well, speaking
of capturing the subject
matter, that, and working with
technically challenging things,
a lot of your subject
matter is this futuristic,
I want to say a little bit
apocalyptic, in terms of, like,
"28 Days Later" and
"Sunshine" was--
We have to restart the
Sun; apocalypse.
They're very challenging
things to accomplish
visually, I think.
You need to have
a lot of trust in the teams
that are taking these to task.
Has the challenge of
it ever influenced you?
Have you ever thought, oh,
maybe, I want to do this?
Or you just go with
the creative process,
and trust your teams
to make it work?
ALEX GARLAND: Well,
you've got to aim high.
And I think one of the things
is that broadly speaking,
if people feel like
they're aiming high,
and they're having
to work slightly
outside of their comfort
zone, they raise their game.
And that's just what happens.
And the key is the
vibe, in a way.
It's like the atmosphere
of it, where--
it's very easy,
with collaboration,
to pay it lip service.
You say, yeah, yeah,
we're collaborative.
Just would you mind
doing it this way?
And then we'll be really
collaborative slightly later.
And so on.
But I think if you
almost enforce it,
provided you've got the
right people, it works great.
I have to say, I've
worked on films
where it's been just the
most miserable experience.
And it's been
completely horrible.
It is down to the people
you're working with.
The collaboration
can turn into mush
really easily if you're
working with a-- no,
I'm going to stop
talking, actually.
DANA HAN-KLEIN: Well, it's
who you want to be in the trenches
with, when it comes down to
editing and stuff like that.
Who is going to be a positive
influence on the creativity, as
opposed to just--
ALEX GARLAND: Just decent,
typically, is the main thing.
DANA HAN-KLEIN:
That's a good metric.
Well, what are some of the
experiences that you've
had previously that you brought
to this filmmaking experience,
that you thought really--
it sounds like you've worked
with some of the team before.
ALEX GARLAND: Oh, yeah.
DANA HAN-KLEIN: What were some
of the things that you thought,
I really need to bring
this to the table,
when approaching this film.
ALEX GARLAND: Well,
actually, a lot of it,
I don't know if in a way
it's of interest to anyone.
But I'd worked in
science fiction that
would've stemmed
originally from some science
conceit like entropy death,
or something like that.
And had sacrificed at a certain
point anything resembling
rationality for story
concerns about adrenaline,
or perceived story
concerns about adrenaline.
And what I really felt my
main goal with this film was,
I may never get
to do this again,
for all sorts of reasons.
I may never get
to do this again.
So I'm just going to try,
in my sphere of this table
with many legs, to
just do it right.
To do it as right as I can.
Not compromise on
anything at all.
Literally nothing in my sphere.
And so, I guess, that
was the main thing.
Film has terrific
pressures to compromise
built into the fabric
of how it gets made
and how it gets
distributed and financed
and all sorts of things.
In this instance, it was
to step away from that.
DANA HAN-KLEIN: It's great
that you were able to do that.
I think that's something that
a lot of directors strive for.
And then, when it comes
down to making the decision,
they end up having to
compromise through whatever
set of circumstances.
ALEX GARLAND: They might be the
guys requiring the compromise.
DANA HAN-KLEIN: That's
also very possible.
And actually, speaking of the
collaboration element of it,
can you talk about
working with Rob
Hardy, your cinematographer?
Because the film was
very beautifully shot.
And I think it just all
tied visually together
with this beautiful
artificial creature, and then
this beautiful setting.
How did you work with him?
ALEX GARLAND: Rob's
just an artist.
It's very simple.
It's quite interesting.
You can put any
camera in his hands,
and he will just find a
way to frame the thing.
And it's like not knowing
how to play guitar,
I guess.
You don't know
how he's doing it,
but you can see how good he is.
It's truly mysterious.
There's something
really weird about it.
He's just a very,
very talented DOP.
And he's known in the
scene, I guess, for
being very good anyway.
And he and I had been
close to working together
before on a movie, "Never Let
Me Go," about four years ago.
And he was one of the DOPs in
the frame for that film.
But it didn't work out,
for various reasons.
But he was just the
perfect choice for this,
because he's sensual and warm.
And sci-fi can be quite
antiseptic and clinical.
It goes to the hospital
end of the sci-fi.
And his instinct is about
almost like dropping
a gauze over everything,
and being soft and Zen-like.
He was perfect for it.
DANA HAN-KLEIN: I think
Zen is a great adjective
to describe the
setting of the film.
It's this beautiful
retreat, woods
and waterfalls and rivers.
And it's somewhere
I'd want to vacation.
ALEX GARLAND: You can.
It's available to rent.
DANA HAN-KLEIN: Good to know.
It might have extra
guests in it, still.
ALEX GARLAND: It's a hotel.
DANA HAN-KLEIN: Oh.
ALEX GARLAND: We shot
in a hotel in Norway.
DANA HAN-KLEIN: Wow.
ALEX GARLAND: It's a
really cool eco hotel.
And the landscape is
absolutely stunning.
And one of the funny
things about film
is it both loves and
hates familiarity.
And finding a landscape that has
not been well-used in film
is quite difficult.
I've noticed there's been
tons of movies recently
that have used
Iceland as a location.
Once you get zoned in
to that Iceland vibe,
you think, oh, we're
back here again.
You think it consciously
or unconsciously, actually.
And Norway had something
special about it.
There's a kind of
bleakness in it.
It's beautiful
and very majestic.
Stunning skies and mountains
and waterfalls, as you said.
But there's something
a bit hard, too.
Bit bleak.
DANA HAN-KLEIN: I'm
going to assume it
was a pointed juxtaposition,
the nature factor
with the unnatural elements.
Maybe not unnatural.
Next evolution elements.
ALEX GARLAND: Highly,
highly contained,
controlled environment,
created by a man.
Completely uncontrolled
environment created by no one.
And those two-- arguably.
DANA HAN-KLEIN: Yeah.
ALEX GARLAND: And our
thoughts in detail.
DANA HAN-KLEIN: What, speaking
of the control element,
everything did feel
very-- even plotwise
and behaviorwise the
characters-- everything
felt very planned.
In a good way, keeping the
audience on their toes.
And who is in charge of what?
Who's making the decisions?
Who is planning?
Who's the puppetmaster,
in a lot of ways?
And was it tough to keep
the audience's trust
for some of that?
And say, no, go with me,
you'll figure it out.
Or did you want to keep
people a little disoriented
the whole time?
ALEX GARLAND:
There's a funny thing
about film, which is that
you can assume literacy
on the part of the viewer.
That's not necessarily
true in books.
You could write a book which
alludes to "Heart of Darkness,"
but you would not
be able to assume
the reader had read Conrad and
read "Heart of Darkness."
Whereas with a film viewer,
you can pretty much assume,
for example, they've
seen "Blade Runner."
Right?
And in fact, you might
even be able to assume
they've seen "Apocalypse
Now," which is based on "Heart
of Darkness" in some respects.
But so that's this funny thing
about the way film works.
And it's a free gift in terms
of what you're talking about,
in terms of that relationship
with the audience.
Because some of these things
have been well established.
And then, you can use
them to your advantage.
So in the case of this film,
a smart, literate film viewer
is very quickly--
almost immediately--
going to be thinking,
she's not the robot, he is.
It's just going to be
an automatic assumption.
They're just going to do it.
And then what you can
do is use misdirection.
And nudge them a little
bit further towards that,
because he's got oddly
symmetrical scars
on his back, which
have a justification
within the narrative
about a car crash.
But you're thinking,
yeah, that's not why.
And so you use that
stuff to your advantage.
It's very useful.
But the basic
function of this film
is it's intending to set
up a series of questions.
Some of the questions
it's setting up--
like where does gender
reside, for example--
it doesn't then necessarily
have an answer to it.
And, in fact, it
might be saying it
is impossible to present
a clear answer to some
of these questions about
consciousness or gender or AI
or whatever it happens to
be at this moment in time.
We're not able to do that.
But that doesn't negate a
reason to ask the question
and have the conversation.
And it's basically
an ideas movie.
It's to provoke conversation.
And to, hopefully, do it in a
respectful and thoughtful way,
which would be the intention.
DANA HAN-KLEIN: Yeah.
I think it absolutely
accomplished that.
And you mentioned the
scars on the back.
And that was one of the
things that immediately
after the movie I
was talking about.
Wow, I wonder if it was
part of something else?
Or are we meant to
think he's a robot?
But it's exactly where
you wanted the audience
to go with it, which is--
ALEX GARLAND: Misdirection.
DANA HAN-KLEIN: Right.
But it still helped frame--
I'm trying to talk about
without spoiling
anything-- it helps
frame later events in the
film, and create this tension
until the end, which-- and
I won't say exactly what
happens in the end.
But the ending was
just fantastic.
ALEX GARLAND: Cool.
Thanks.
DANA HAN-KLEIN: Thank you for
not giving us what we expected.
I think it kept true
to the feeling and plot
you'd set up prior
to the very ending.
And I think if you'd
gone the cliche route,
we would have been a
little disappointed.
ALEX GARLAND: You'd have pushed
me off the train or something.
DANA HAN-KLEIN: No.
It still would have
been satisfying.
But I think the way that--
oh, can I just say it?
ALEX GARLAND: Say what you like.
I'm not a censor.
I'm going to spoiler
alert warning here.
No, at the end--
SPEAKER: [INAUDIBLE].
DANA HAN-KLEIN: Oh, yeah.
Sorry.
But the way that the robot--
ALEX GARLAND: Maybe don't.
DANA HAN-KLEIN: Sorry?
ALEX GARLAND: Maybe don't.
DANA HAN-KLEIN: Maybe I won't.
ALEX GARLAND: He's shaking
his head, saying, don't.
DANA HAN-KLEIN: Oh.
Never mind.
I won't spoiler alert.
It's great.
Go see it.
OK.
Either way, the
ending is fantastic.
ALEX GARLAND: Thank you.
DANA HAN-KLEIN: And
it's unexpected.
And I think it
really does service
to your audience in
trusting them and saying,
here are all the points, and I
trusted you to come up with this.
And now, I will reward that with
keeping true to the characters.
ALEX GARLAND: Thank you.
DANA HAN-KLEIN: Sorry
for almost spoiling it.
ALEX GARLAND: No.
DANA HAN-KLEIN: I want to talk
a little more about the actors.
What was the process?
Because it's very limited cast.
Domhnall Gleeson was
great and charming
in this almost innocent way.
ALEX GARLAND: I'd worked
with him twice before.
So I knew Domhnall would
be really good for this.
And actually, just simply
called him up one day and said,
hey, I'm going to
send you a script.
Will you do it?
Obviously, if he didn't want
to do it, I couldn't make him.
Try as I might.
So that was Domhnall.
And then with the other
two, the thing about actors
is-- and I really think
this is a fair statement--
is that there's no mystery
to good acting.
You just know it
when you see it.
It's hard to find somebody
who would say Philip Seymour
Hoffman was a bad actor.
He was evidently an
incredibly good actor.
That tends to be the
case with good ones.
You don't need to be an expert.
This film required the
casting of good actors.
There is a kind of actor who
isn't necessarily a good actor,
but they have an enormous
amount of charisma.
And that can be fantastic
for certain kinds of films.
Charisma's just what you
want to make it function.
But this particular
film, they needed
to be actors of a
certain type, really.
And so that was the pool
we were drawing from.
Oscar Isaac, I'd seen him
in lots of stuff, actually.
And he's one of those guys
who, within the industry,
is incredibly buzzy, and
very, very well respected.
And Alicia Vikander,
I'd seen her
in a Danish movie
called "A Royal Affair."
I'm going to guess she was in
her early '20s, or maybe even
late teens, when she shot that.
She's acting opposite
Mads Mikkelsen, who's
a fantastically experienced,
charismatic, and terrific actor
as well.
And despite that, your
eye just goes to her.
She's got that magnetic
ability that some people have.
And so she was just perfect.
That was, essentially,
the route, I suppose.
DANA HAN-KLEIN: I think she
did such a lovely job of--
ALEX GARLAND: Yeah,
she's extraordinary.
DANA HAN-KLEIN: --shouldering
this interesting hybrid
character.
ALEX GARLAND: They all are.
We were dead lucky
with that cast.
DANA HAN-KLEIN: And going
back to Oscar Isaac--
ALEX GARLAND: The dancer.
DANA HAN-KLEIN: Oh, really?
Yes, the dancing was just so
unexpected and phenomenal.
But he had a sort of
charisma of his own.
And not necessarily
big, boisterous
Hollywood personality charisma.
But the character itself had a
very interesting, draws you in,
makes you want to find
out what he's doing.
And I think that was sort of
integral to the plot of it,
was having this guy who makes
you want to find out more.
And is that part of the
reason you cast him?
Or is that something
you saw in him?
Or is that something that
came out later in the process?
ALEX GARLAND: No, no, no.
That was the intention
of the story.
He's a tricky
thing, because he's
the CEO of a big tech company.
And immediately, that
leads people to think
he's aimed at somebody.
But it wasn't
really that at all.
In a way, I was thinking more
about people like Oppenheimer.
Because AI, we
know, because we're
informed by Stephen
Hawking and Elon Musk,
and people like that,
has latent potential
for being extremely dangerous.
And I think that has to be true.
It is potentially dangerous.
But I also think it's reasonably
analogous with nuclear power,
which is also dangerous, but
doesn't stop us using it.
And I'm basically
in favor of AI.
I think it's terrific.
And I'm fascinated by it.
And all power to it.
Well, maybe not all, but
a lot of power to it.
So the thing with
Oscar's character
was to be a kind of
Oppenheimer-like character
who's conflicted about what
he's doing as he's doing it.
And it's up to the
audience, in a sense,
to understand where
he's really coming from.
Because he presents
himself as being
incredibly misogynistic and
predatory, and then actually
violent.
And then, you have
to decide, is this
an act he's putting on for the
purposes of this experiment?
Or is it real?
Or is it something in
himself that he's helplessly
amplifying for the experiment?
There's all sorts of
things that it could be.
And Oscar's a very liquid actor.
And that kind of
set of challenges
of where are you at
this moment in a scene
is exactly what kind of
turns him on, really,
and gets him going.
DANA HAN-KLEIN: I think
it's also a testament
to the writing of the character
and the portrayal by the actor.
But it was one of the
more complex characters
that I didn't know if I
wanted to identify with him
or support him or just be
like, whoa, you're horrible.
ALEX GARLAND: Well,
you're invited,
clearly to think
that he's horrible.
There's several stages of
the film where it is almost
instructing the
viewer to, say, feel
deeply suspicious and
uncomfortable with this guy.
However, there's another
thing as well-- or at least
I hope there is-- which is
that sometimes the things he's
saying, even though he
sounds like he's wrong,
are actually true.
And so it's the ability
to hear past what
something sounds like it
is to actually what it is.
DANA HAN-KLEIN: I think the
vehicle that's delivering it,
as opposed to the
actual message.
He's an interesting
vehicle to deliver.
ALEX GARLAND: Portray, yeah.
DANA HAN-KLEIN:
So you say you're
in favor of artificial
intelligence, to a degree.
ALEX GARLAND: Yeah.
By which I mean strong AI.
AI conflates so many
different things.
There's AI in phones
and video games.
And we're talking
about strong AI.
DANA HAN-KLEIN: Right.
ALEX GARLAND: To use that term.
General AI.
Whatever.
DANA HAN-KLEIN: What was it
like working with the science
advisors on this?
Just because you mentioned
this conversation,
ongoing discussion,
that led to a movie
with your neuroscience friend.
I'm going to go ahead
and guess that there
were a lot of science advisors
involved, because the--
ALEX GARLAND: There were
three particular advisors.
There was a lady
called Gia Milinovich,
and there was a
geneticist called Adam Rutherford,
who also fronts a BBC Radio 4
show that tries to disseminate
scientific discussion.
But there was this
guy, Murray Shanahan,
and he works at Imperial.
And he was the guy-- all
of these people, what
I said is, look, I've
attempted to write
this thing as best as
I can understand it
from the literature and from
YouTube videos of lectures
and stuff like this.
Check it.
Be really, really tough on it.
If it seems wrong, if this is
an inaccurate representation
of Mary in the black
and white room,
or whatever it happens to
be, as a thought experiment
or whatever, tell me.
Now, obviously, it's got
two big conceits in it.
There's a robot that
has a level of robotics
that clearly doesn't exist.
DANA HAN-KLEIN: Yet.
ALEX GARLAND: Yeah, sure.
Yeah.
But doesn't.
And there's also a
machine that really
does seem to be sentient.
And that also doesn't exist.
And, again, probably "yet."
So there's a limit to
how much you can advise
on something which is fiction.
But to be reasonable about
the subject was the key thing,
I think.
DANA HAN-KLEIN: And have
you applied the same thing
to your past works?
They've all sort of--
ALEX GARLAND: No.
No, I haven't.
I haven't.
And that's why I
did it with this,
because I felt like I had
let down the subject matter
previously.
And so when I'm talking about
earlier, when we were chatting
about not compromising,
that's basically
what I'm talking about.
I feel frustrated with
some of that stuff.
So yeah.
DANA HAN-KLEIN:
But I think it puts
you in a tough position, where
you might be-- if they'd
come back and said,
[SKEPTICAL NOISE],
XYZ doesn't work
because of this.
It puts you in a tough
position, where you might have
to compromise on your own work.
ALEX GARLAND: No, because
then I wouldn't have done it.
But the key was you
make it cheaply.
If you want creative freedom,
make it for less money.
DANA HAN-KLEIN: Oh, I meant
plotwise, and stuff like that.
ALEX GARLAND: So do I.
DANA HAN-KLEIN: OK.
All right.
All right.
Never mind, then.
ALEX GARLAND: In any
terms you can think of,
the less money you make it for,
the more freedom you've got.
DANA HAN-KLEIN: Yeah.
So what other subjects
would you like
to delve into going forward?
What are some of the other
things that interest you?
ALEX GARLAND: Well,
I'm trying to-- I've
finished a script of a really
fascinating novel called
"Annihilation," written by a
guy called Jeff VanderMeer.
So I've tried to adapt that.
And I'm going to wait.
I'm in the process, now, while
I'm out here in the States,
of trying to set that up.
And, hopefully, we'll succeed.
And I came across this
very interesting argument
that talked about how all life
on the planet is cellular.
And there's an
argument which has,
actually, some evidence
attached to it,
incredibly, which is
all of those cells
are derived from one cell.
Which actually makes
logical sense, when
you stop to think about it.
But I never had stopped
to think about it.
And that is a truly
extraordinary idea.
That's one of the reasons
I like science and I
like science fiction,
is because I think
it puts these really
fundamental, fascinating ideas
into your head.
And so, I guess, that's what
I'm fixated on at the moment.
DANA HAN-KLEIN: That's a
very interesting subject.
You approach these
large macro things
that make us, I
feel like, examine
what it is to be human, almost.
ALEX GARLAND: That's what
science does, isn't it?
DANA HAN-KLEIN: Yeah.
ALEX GARLAND: Makes you think
about the future and the past,
and where you are in it.
DANA HAN-KLEIN: But
there's an approach
to it, also, that
is looking at it
from the human standpoint
on it, as opposed
to just a purely scientific.
Because there's a way
to tell those stories,
and explain like, oh, well,
we came from one cell,
as opposed to being like,
think about the ramifications
of this.
And exploring that, as
opposed to presenting
scientific evidence.
ALEX GARLAND: Yeah.
DANA HAN-KLEIN: Well, I'll
ask what's one piece of advice
that you wish you could
have given yourself
before coming into the project?
ALEX GARLAND: What, this one?
DANA HAN-KLEIN: Yes.
And then, going
into your next one.
ALEX GARLAND: Nothing.
DANA HAN-KLEIN: Nothing?
ALEX GARLAND: It was all cool.
It was great.
It was a good bunch of people.
And it worked out the
way it was supposed to.
I've never been able
to say that before.
But I can say it about this.
DANA HAN-KLEIN: What's
one thing in particular
that you draw from
this experience
that you'd like to
bring into the next one?
Is it the not compromising?
Is it--
ALEX GARLAND: Work
with nice people.
DANA HAN-KLEIN: Work
with nice people.
I think that's a great thing.
I think--
ALEX GARLAND: We cool?
DANA HAN-KLEIN: We cool.
Thank you so much
for joining us.
"Ex Machina" will
be out in theaters.
Catch it.
Thanks so much.
ALEX GARLAND: Many thanks.
Cheers.
Thanks a lot.
