[MUSIC PLAYING]
I actually believe that the
work we're doing in this class
is essential to
creating and bringing
about the kind of democracy
and kind of society
that we want to live in.
The class is called Computers,
Ethics, and Public Policy.
And the idea of the
course is to take
three different disciplinary
perspectives, of an engineer,
of a social scientist,
and of an ethicist,
and integrate them
to engage Stanford
undergraduates
in thinking seriously
about new technologies
and their economic and
social impacts on society.
We're seeing the
consequences of technology
that was built a few years ago
have much broader ramifications
in our society, the
results of our elections,
the kinds of information that
gets shared and how it gets
shared, autonomous vehicles,
autonomous weapons.
The kinds of decisions that
happen every day in people's
lives that are ruled by
algorithms, whether they
know it or not.
These are all issues that have
large societal implications.
We're at a moment where
a conversation is needed.
And for that
conversation to work,
you need people who speak
each other's language,
people who understand
the technology,
but also understand
the political, social,
and economic issues
that are being raised.
So what we tried
to do was come up
with a set of different
experimental educational
opportunities ranging from
interactive case study
discussions to debates
to coding assignments
to philosophy papers
that would stimulate
different ways of
thinking about a set
of technological frontiers.
There was a module on the power
of the great platforms on which
so much of our online
lives are conducted,
the Facebooks, the Twitters,
the Googles of the world.
If we take a very basic
optimization mechanism
that a lot of
social networks use,
which is to try to show
people materials that they're
more likely to click on
or more likely to read,
very quickly, you
bifurcate the network
into the left-leaning users
and the right-leaning users.
And they're no longer
reading each other's stuff.
But all you were trying
to do was optimize something
like click-through
rate or revenue.
What did you actually do?
You created political
polarization.
The exercise that
we had students do
asked them to inhabit
the role of an engineer
at a company where
they had the power
to decide how the
algorithm should work
with a trade-off between
increasing polarization
amongst the people
on the platform
and increasing monetization by
feeding them the things that
they might wish to see.
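The engagement-versus-polarization trade-off described here can be illustrated with a toy simulation. Everything below is a hypothetical sketch, not material from the course: users and items get a made-up political leaning in [-1, 1], and a simple click model assumes users click what matches their leaning. A greedy click-through optimizer then produces feeds with very little cross-leaning content.

```python
import random

random.seed(0)

# Hypothetical toy model: users and items each have a
# political leaning drawn uniformly from [-1, 1].
users = [random.uniform(-1, 1) for _ in range(100)]
items = [random.uniform(-1, 1) for _ in range(200)]

def click_prob(user, item):
    # Assumed click model: users are more likely to click
    # items whose leaning is close to their own.
    return max(0.0, 1.0 - abs(user - item))

def recommend(user, items, k=10):
    # Greedy click-through optimization: show the k items
    # with the highest predicted click probability.
    return sorted(items, key=lambda it: click_prob(user, it),
                  reverse=True)[:k]

# Share of recommended items from the opposite side of the
# spectrum, a rough proxy for cross-cutting exposure.
# A random feed would give roughly 50%.
cross = 0
total = 0
for u in users:
    for it in recommend(u, items):
        total += 1
        if u * it < 0:
            cross += 1

print(f"cross-leaning share of feeds: {cross / total:.2%}")
```

Under this toy model the cross-leaning share collapses to a few percent, even though the recommender never mentions politics: it only maximizes predicted clicks, which is the point the speakers are making.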
In some sense, their
most important role,
regardless of what career they
choose, is that of citizen.
And it's as citizens
where we are
going to make decisions about
how to govern technology going
forward.
Technological work
is hugely important.
It's had a profound impact on
our society, on our politics,
our relationships
with other people,
and even our views of ourselves.
And I want students who engage
in the endeavor of building
technology to think
more broadly about
the implications
of the things
that they are developing.
How do they impact other people?
I think we'll all be better off.
For more, please visit
us at stanford.edu.
