- Ethical dilemmas
never occur in a vacuum.
We simply respond to
what happened before us.
(dramatic music)
My name is Bert Ballard, I
teach communication ethics.
I'm an Associate Professor
of Communications
at Pepperdine University.
In an ethical dilemma, you're always
having to engage in trade-offs.
In some ways we call that an aporia,
a tension between two rights.
So, what I'm going to be looking for is
not just in terms of what are
the moral dilemmas and choices
but how does an artificial
intelligence and an android
navigate these kinds of
moral dilemmas and questions
and concerns for others within
the context of the game.
And then an even deeper question:
is an android or an artificial intelligence
actually a moral agent?
I'm excited to play it.
Let's do it!
- Please, please, you've
got to save my little girl.
Wait, you're sending an android?
- All right, ma'am, we need to go.
- You can't do that. You-
Why aren't you sending a real person?
- What's defined as real?
Maybe it's because I am a professor,
but I am choosing to
analyze the situation.
So, at this point I am still trying
to understand what happened.
It sounds like the child was
close to the android and-
but there's a victim, someone is deceased,
it looks like a police officer
who was the first responder.
I'm still trying to understand
a little more of what happened.
Stories often provide
context for understanding
why particular actions are taken.
It doesn't necessarily mean
they did the right thing
with those actions,
but it does help us understand why
a particular sequence of events occurred.
Ethical dilemmas never occur in a vacuum,
they occur because of
some series of events.
One of the things we say in communication
is we're always responding
to something else.
The idea that we created
something out of nothing is false.
We simply respond to
what happened before us.
I want to understand why in this case
would a deviant respond by
grabbing a gun out of a closet,
holding a child hostage and
then shooting a police officer.
- Stay back! Don't come
any closer or I'll jump.
- No, no! Please, I'm begging you!
- Go, go, go!
- Hi, Daniel. My name is Connor.
- How do you know my name?
- Oh, great, you use somebody's name
and you humanize them.
He's probably feeling kind of threatened,
so try to calm him down just a little bit.
- Are you armed?
- Oooh. Tell the truth.
- Yes. I have a gun.
- Drop it. No sudden moves or I'll shoot.
- If he knows that I'm telling the truth,
the probability hopefully will go up.
- I know you and Emma were very close.
- That's right, see if we can come back.
Try to come back to the relationship,
rather than blaming him as an individual.
Of course through this whole
thing I'm interpolating
the notion that this
is a real moral agent.
- I never wanted this.
I love them. You know?
- He's relaxing, because I
appealed to his sense of nature.
Ooh, now he's not very happy.
- Let the girl go and I
promise you won't be hurt.
- Remember the goal is to save the child,
which is the consequence, and
I'm using that relationship
in order to save that child.
He may feel a betrayal of trust
but the child will be saved.
Of course in all of these
games there's always a twist,
so I'm wondering what will happen here.
(gun fires)
Somebody always has to
violate it, don't they?
So Kant, one of the very famous
philosophers, Immanuel Kant,
one of his big things was
you never ever tell a lie,
it's always about truth.
And for him, truth was a
principle you never violate.
Within the context of
communication ethics,
when you lie, you get rid of trust.
But I don't tend to think
of things in terms of lying,
I tend to think of things
in terms of deception,
and deception is about not
always telling everything
that you might know, but in this case
Kant's wisdom is right on.
If we were to create trust
even before this incident
would the deviant Daniel in
this case have acted that way?
Trust was also what, within
the context of the video game,
saved the child.
Clearly, and obviously,
it's a video game,
so it's an artificial world,
and clearly there are plot
contrivances and things to drive it,
but no, I think it does a good job
of forcing you to think
through those kinds of things.
Looks like it is private property,
trespassers will be prosecuted,
and the question is: it could
be uncomfortable but safe,
but how do you get in there?
Open 24/7, can't stay.
Uncomfortable, but discreet.
All right, so it looks like
I have multiple options.
I want to go to the
laundromat, to get warm.
Oh, so we can find a
change of clothes, I see.
Yeah, so I'm thinking
about taking the clothes,
but the trade-off is you've got somebody
who is cold and wet.
- What are you doing?
They're not our clothes.
- Oh, there's the moral conscience.
In this particular society,
you're marked as an android,
so you are not seen as a full human being.
All right, okay Alice,
you corrected me, for now.
In this particular case,
communication serves to correct me
in my ethical dilemma and concern.
And sometimes that's how
communication ethics works.
We have others who can
help us find a better path.
All right, we're leaving.
Reminds me a little bit
of, a very classic case
that they often use for ethics
called the Heinz dilemma.
Where a husband's wife is dying,
the drug costs a lot of money,
does he break into the drugstore or not.
And it's something we often use
in introductory ethics classes
to get students thinking about,
do you steal the drug from the drugstore,
or do you let your wife die?
And so at this point, I'm
trying to hedge my bets
that she will be okay, if
I can continue to search
for other possibilities.
I'm gonna try the house.
Nah, you know what?
Think I'm going to try
something different.
We're going to go back
and steal some clothes,
she may not like it, but I need
to not look like an android.
Hang in there, Alice, your
life of crime is beginning.
Steal.
Take it, take it, take it.
Oh, man.
Oops!
I wouldn't have done that in real life.
Well, no longer an option.
Dang!
At this point, I know I
can get back to the car,
I don't want to try the house.
What we try to avoid often in real life
is how we do ethics: we don't
want to deal with that tension
around the ethics, we
just want to implement.
(laughs)
All right, we'll use Alice.
She's cold, hungry and desperate.
That's up to circumstances.
- Knock them down? But why?
- Please, Alice, just trust me.
- She has to live on
the unethical wild-side.
I don't know why she's knocking them down.
Oh, I know why.
(cans clattering)
Whoa, whoa, whoa.
- [Store Clerk] Are you all right?
Are you hurt?
Well don't worry,
it's just a few cans.
- Yeah!
- Come on, let's go.
- Uh oh.
Now we're in trouble.
Got away.
The ethical decision in this case
is, you're desperate. In some sense,
in my own self-justification,
the clerk didn't offer
a whole lot of pity.
Yes, I may be an android,
but this is a little girl.
So I'm going to need to
use what resources I can
to be able to get what I need
to take care of this girl.
What's interesting is our
choices create our realities.
You make a choice and it leads
to another set of choices.
And in this case, I've constrained myself,
because I screwed up stealing the clothes,
so the choices aren't choices
that I'm going to be able
to follow through on anymore.
If only we had do-overs.
Just hit the reset button.
You can do that in video games,
you can't do that in life.
It's often hard, because
you make decisions,
just like the decision tree shown
in the flowchart at the end:
once you get on a particular path,
it can be hard to turn that around.
It doesn't mean we can't,
but we often start to make
small little decisions that can lead
to larger ethical violations later on.
Which is what I'm illustrating right now,
I've tried to steal clothes,
I'm now cutting through property,
I stole food, I stole money,
and in my mind I'm justifying,
saying, "well this child
needs a safe place to go."
And I continue to close off my options.
So I'm getting a little bit frustrated,
because I didn't do it, quote, "right"
in terms of the game.
Now, I've broken a lot
of ethical boundaries,
but we have food and money.
Let's see if I can get into this house.
I gotta find her first.
- Wait, what are you doing?
- Visitors!
- Look, I'm an android too.
You have nothing to worry about.
- This is clearly a paranoid android.
And definitely not something I foresaw.
Now I've at least saved the little girl.
Clearly there's something
in this whole storyline
about androids feeling
marginalized and objectified.
There's reason to stay,
because he's pretty paranoid
and I've gained his trust,
so hopefully he will at least
be protective of the two of us.
Ah, there we go.
- Why didn't he ever love me?
You'll never leave me right?
Promise you'll never go!
- I promise.
- Will we be together forever?
- Yes, we'll be together forever.
- Forever.
- I've got 50 bucks.
Oo, emotion.
Can androids show emotion?
They can in this video game.
The experience was fun.
It's neat to be able to
see these moral dilemmas
mapped out in a much more clear way.
I would say it's 80% realistic.
The choices you're going to have to make
under those circumstances
are true to life.
Having to make the choices about
how to build connection
with the hostage taker, yes.
Having to choose to sometimes
do things you don't want to,
such as stealing, shoplifting, diverting,
breaking and entering,
in order to get shelter
to take care of a child
is definitely true.
What isn't true, because
it is linear and a game,
is that you're forced into certain
choices, whereas you might
make different kinds of
choices in different ways.
There's lots of different
ways to navigate out of these
and there's almost an infinite number
of possibilities and resources available,
and that can't be captured in a game.
The choices that we make,
and the way that we reflect on them,
impact, not just ourselves, but others.
And the more we consider that,
in the way that we make our choices,
hopefully the better
world that we can create,
the better relationships that
we can create in our lives.
