Thank you. Hope everyone can hear me. Yeah,
so, I want to take a minute and thank Selenium
for inviting me to speak about ethics because
it's not something that many conferences do.
So, let's give them a round of applause.
[ Applause ]
It's one of the most important topics of our
time. And we don't spend enough time thinking
about it or even discussing it. It's actually so urgent
that only last week you could see what happened:
Google contractors supposedly targeted
homeless people for facial recognition, and
recordings surfaced of Mark Zuckerberg saying
he would go to court against the US government
if they try to break up Facebook. So, it's
a really difficult thing to talk about and solve.
But we need to start talking about it.
But most importantly, I want to make this
talk because being unethical is starting to
become the norm in this industry. And if you
think about it, this is beyond dangerous.
I'm not here to talk to you about a new framework
or Selenium or anything else. There are other
folks who can do that better than me. I'm
here to talk to you about how we messed up,
how we messed up really hard, and what we
can do to fix it.
So, here's the current state of ethics in
tech through a single GIF. And here is how
the tech companies think about being ethical.
So, we need to change that. But how did we
get here? How did we get from a point where
we started building products to help people's
lives and improve people's lives to stealing
data, to using evil tactics to sell data to
third parties among other things?
Since I started researching this topic
and presenting it at conferences, I found
30 million Google results about unethical
companies. But even more striking, 90-plus incidents
happened in the previous year alone around
unethical practices by companies and users'
data. And this is a big number. But let's
take a look at some of the events. Here are
a couple of events I found from just last year.
But the thing is that every week something
new comes up, even worse than the previous
week. All these examples are already outdated
because new things came up and I couldn't
keep up with the pace, which is devastating.
But for a start, I want to break these incidents
into three big categories. The
first one is technology and election interference.
So, you may have heard of Cambridge Analytica.
Cambridge Analytica used people's data to
influence elections. And I say elections,
plural, because it happened in the US in 2016,
and it happened with Brexit, using fake news
and misinformation targeted at people.
But you can find misinformation in the news
in a lot of different forms, like articles,
edited videos, deepfake videos and clickbait.
And they usually target people with ads using
all of these tactics.
Now, the second example is Brexit. Brexit
is deeply connected to Cambridge Analytica
and what they did with the US election as
well. Brexit happened inside Facebook. If
you don't know what I mean, I highly recommend
watching a talk by a reporter from the Guardian,
who has been reporting on Facebook, Cambridge
Analytica, and election interference for the
last couple of years, and on how it is all connected.
I highly recommend it. What I said, that Brexit
happened inside Facebook, is a quote from
her talk. And it struck me right away because
it was so true. And we couldn't see it. It
was in front of our eyes and we couldn't see
it.
What do I mean? Brexit happened inside Facebook.
Facebook sends you ads. Everyone has their
personal news feed. Nothing is searchable.
No one can see what you are seeing. So, if
you see an ad and it passes by on your newsfeed,
no one can ever report it, because no one else
sees the same thing. That's an evil
tactic that companies use under the banner
of personalized experiences: to serve you things
that many times are not real. On top of that,
you cannot even get a log of all the ads
you are seeing on Facebook. So, whatever
happens on Facebook stays on Facebook.
Nothing is permanent.
The second category I want to talk about
is smart devices, and many of you may disagree
with me here, because with all the devices
and connected phones, many people are super
happy that they can say a word and something
happens, like the lights turning on and off.
But I don't trust these
companies. And I did a little research about
why they're doing what they are doing. So,
they are trying to build all of these devices
and sell them to people mostly for advertising
purposes, so they can know more about you and
serve you with better ads and better products.
But they are doing that in a way that's not
familiar to you: through voice. And here we
land on surveillance. So, what they do is, believe
it or not, they are listening to you all the
time. They're not listening just when you say,
"Hey, Siri" or "Hey, Alexa"; they're doing it all
the time. And they send the data up to their
servers and they're doing surveillance. If
you don't believe me, you can believe them,
because they have already admitted that's happening.
So, Google Assistant, Amazon, and Apple are
already listening to all of our conversations
if you have any of these devices at home. And
the thing I want to focus on is how they do
it. How they have all of these devices connected
in our homes and what they are getting from
these patterns. So, what they want is for you
to live your life while they take your daily
patterns, like what time you wake up, what
time you go to sleep, what time you leave for
work, stack it all up, and serve you with better
ads. And they have all this information to
sell to third parties or to keep on their own
servers. So, they do that through patterns.
And one of the scariest things is that they
recently started, or will start, using machine
learning to analyze your voice. If you are sick,
or crying, or having a depressive episode while
saying something to someone, they will serve
you with ads about antidepressants and all of
these things. Which is pretty scary because
you don't know what's happening. And that's
actually what's happening. That's the scariest
thing.
Another thing that struck me during my research
is that they are planning to move into buildings
and apartments before you do. They are working
with developers and contractors to build
Alexas into apartments. So, yeah, I
don't find that ethical, to be honest.
And the worst thing is that all of this is
invisible. The next tech revolution is going
to be invisible. Part of it is the smart
home. It's already happening. So, be careful
about who you share your data with, and where.
The third type of case I want to mention is
the Volkswagen emissions scandal, where they
built a system to trick US officials into
letting them ship cars by making their emissions
appear to be under a certain threshold.
And they got caught. But the thing is that
one of the engineers who worked on this product
went to jail, while the company got out
of it with just fines and nothing more. Which
is pretty scary, because now you know that
your actions as an engineer have an impact
and you cannot get away with them so easily.
As you can see, and I hope you do, we have
a problem. We have a really serious problem.
And we need to start analyzing what's happening
and what we can do to fix it. Because we messed
it up, and we need to fix it.
So, we're going to break the problem into
four big categories that will all make sense
at the end. I'm going to focus on the first
one first: culture. Culture is a big thing;
you are building it in your company at the
same time as you are building the product.
Whatever decisions you make about your product,
that's building your culture as well. Let's
talk about Silicon
Valley. Let's talk about this mythical place
where all the unicorns are born. Unicorns
don't exist. So, you can make the correlation
there. And all of this great stuff is happening
in this bubble. And they all want to disrupt
the market and create Uber for X, Facebook
for Y and I don't know what. And, most importantly,
they're not thinking about the consequences
their actions have on people's lives.
They're moving fast and breaking things. And
this is the worst kind of culture, one that
every company adopted from Facebook. Because
Facebook is what it is today because of
this mantra. They moved fast and broke
a lot of things. They are providing us with
misinformation and fake news, and serving us
ads with things that many times are not real. And now,
staying on Facebook, I want to talk about this
quote, or mantra: "The future is private."
Mark Zuckerberg presented that at their latest
developer conference. And he said it with such
passion and confidence that for a second I
thought he believed it. So, I want to talk
about this message and what they want to do
for the future of the company, and decompose
it to actually see what they want to say to
us, and why they want to merge their platforms.
What they want to do is merge Facebook, Instagram
and WhatsApp all into one, so it's the same
system and not three different companies.
Actually, that's the case now.
So, it's going to be harder for regulations
to break them up. And, again, how do they
plan to encrypt our messages? Because encryption
and merging the platforms were the two strong
messages from that announcement. And I want
to take a minute and decompose the second
message for a start: how they are planning
to encrypt your messages. Someone will send
you a message; it could be on Facebook or
on WhatsApp. Facebook scans the message for
malicious content. What they are actually
doing is scanning your message so they can
serve you advertisements afterwards.
So, I had this conversation with one of my
friends where she wanted to start learning
Swift, the Swift programming language. And
she sent me a message about that, so I replied
with recommendations. And after she left Instagram
and opened Facebook, she had a Swift
programming book advertisement in her newsfeed.
So, I don't think they scan the messages just
for malicious content.
And then Facebook, again, will encrypt your
message. So, they are still going to track
you and serve you with advertisements, but
they are going to encrypt the messages at
the end. I don't feel that's what encryption
means. The last thing about culture is that
many of these companies are only focused on
the positives, on what good they are going
to do in the world. So, they are leaving out
all the things that can go wrong, how people
are going to misuse their platforms and services
to harm others or do whatever they want to do.
Now, the second part of the
problem is deeply connected to culture. Anyone
have any idea what that may be?
Okay. It's leadership. Leadership is deeply
connected to culture. And leadership can influence
culture in a good or a bad way. Leaders are
quick to recognize what they are doing right
while being oblivious to the things causing
them to fail. And, yeah. If you think about
it, that's really true. And take into account
the news: for example, Mark Zuckerberg said
that Facebook didn't have any way to influence
an election; afterwards, after all the facts
and information came to light, he changed
his opinion on that.
But I want to take a minute and see some examples
of really bad leadership. Imagine that you
are the CTO of one of the biggest social networks
in the world and you post something like this
when your company has saved the passwords of
its users in plain text: "We are sharing this
information to help people make an informed
decision about their account security. We
didn't have to, but believed it was the right
thing to do." We didn't have to? I cannot even
say that out loud. So, no. They didn't just
"not have to". They are obligated to do it,
because they are in the position they are in
because of people's data. Nothing more.
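As an aside, storing passwords in plain text is never necessary. Here is a minimal sketch of what salted password hashing can look like, using only Python's standard library; the function names and the iteration count are illustrative choices of mine, not any company's actual implementation:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune it for your hardware


def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted hash. Only the salt and hash are stored, never the password."""
    salt = os.urandom(16)  # random per-user salt
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest


def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Re-derive the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)
```

The point is that even the service itself never needs to see a stored password again; it only re-derives and compares hashes.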
Needless to say, after 30 minutes that was
deleted. But I was fast enough to screenshot
it. And another case that I want to share
with you today is how Jack Dorsey makes one
misstep after another without thinking about
it. Like, it's crazy, the things he says.
How he tries to make Twitter a healthy
platform, which it's not. Let's be honest:
it's not a healthy platform, and you cannot
have healthy conversations without being
targeted by a lot of different people. Yet
the CEO, Jack Dorsey, says the company was probably
way too aggressive in banning right-wing activists.
I don't think that's a good thing to say.
And two months ago, the new VP of design at
Twitter said that he was disappointed that
many of his staff, the people who designed
the new web version of Twitter, were targeted
and harassed by people over their work. But,
wait: that's exactly what happens to the rest
of the people who are trying to have
a conversation on Twitter. So,
yeah, that's what's happening on Twitter in
a nutshell.
And my response to that is, maybe that's what
it takes for Twitter to start taking harassment
seriously. Because until now, they haven't done
enough to stop harassment on their platform.
So, maybe this behavior of people harassing
the Twitter staff will kick off a conversation
inside the company.
The thing about leadership that I mentioned
before is that all of these leaders and CEOs
and executives have their heads in the
sand. They don't really know what's
happening. And they don't want to take action
on important stuff. The third area of the
problem we're going to focus on is regulation.
Well, this is a big topic. And I can even
create a separate talk for regulation. But
here is how I feel about regulation: I'm searching
for it, and I cannot find it. And that's the case
because we move too slowly to even have a conversation
about regulation. And in a nutshell, regulators
don't know how tech companies work. One of
the most famous questions from Mark Zuckerberg's
hearing in Congress was, "If you don't charge
money for Facebook, how do you make money?"
And the answer was, "Senator, we run ads."
So, you can see that many of the people regulating
tech don't know a thing about how tech
works. Okay, you will say that the $5 billion
fine to Facebook from the FTC was a big thing,
that Facebook got punished and it's good. It's
not. I want to take the time to decompose
what that fine means for Facebook. And here's
a pretty good tweet to explain it to you,
even though I don't agree with Alex Stamos
on the first part of his tweet: that the real
threat to the tech giants is competition,
not regulation, and everybody is really missing
what happened today.
Yeah, the threat is regulation as well. That's
why every tech company is trying to run away
from regulation. But I want you all to focus
on the second part of the tweet: Facebook
paid the FTC $5 billion for a letter that says
you never again have to create mechanisms
that could facilitate competition. So, what
does that mean? You may go to Facebook and
say, "I want access to your graph," and Facebook
will say, "The FTC won't let us." Or you may say,
"I'm trying to build a product, and I want to use
Facebook data,"
and Facebook will say, "The FTC won't let us,
because of that letter." And the conversation
will go on and on. That's really what the
$5 billion fine meant for Facebook and for us.
But I want to focus on a really positive thing.
I know, I'm really down today. But I want
to focus on a positive thing about the fine.
So, now Facebook really needs to think about
all of these evil practices they are
using throughout the company. For example,
one of the ugliest tactics I ever
saw is verifying your Facebook account by
asking for your email address and your email
account's password. And I cannot believe that
this feature passed through managers, engineers,
designers, and product managers, and no one
said anything about it. Which is pretty crazy.
And a personal story of mine: when I tried
to enable two-factor authentication on my
account, I explicitly said that I didn't want
to use the SMS option and that I wanted to
get my codes through 1Password. They enabled
SMS verification as well, and as soon
as I realized it, I shut it off. But, yeah.
Weird tactics, to say the least.
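For context, app-based two-factor codes like the ones 1Password generates typically follow the TOTP standard (RFC 6238): the code is derived from a shared secret plus the current time, so no SMS is involved at all. Here is a minimal sketch in Python's standard library; the base32 secret in the comment is the RFC's published test key, not a real credential:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, for_time=None, digits=6, period=30):
    """Generate a time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    t = time.time() if for_time is None else for_time
    counter = int(t // period)  # number of 30-second steps since the Unix epoch
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation, as defined in RFC 4226
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)


# RFC 6238 test vector: with the base32 key
# "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ" at t=59s,
# the 8-digit SHA-1 code is 94287082.
```

Both your phone and the server run this same calculation, which is why the codes work offline and why SMS delivery is unnecessary.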
For the fourth area, because we already talked
about culture, leadership and regulation,
I want to bring it back to us and mention
the uneducated public. Because really,
the public doesn't know how the Internet works,
how tech companies work behind the scenes,
or how available their data is to these companies.
And they really don't understand how valuable,
again, their data is for companies like Twitter
and Facebook.
In addition to all of that, people don't know
how to keep themselves secure on the Internet.
And we end up with a "use the same password
for every account" kind of mentality.
I'm trying to make my mother stop doing that,
which is pretty hard, to say the least. And,
again, we come up with statements like, who
cares about my data? Well, a lot of people
care about your data. And a lot of companies
are, like, billion or trillion dollar companies
because of your data. So, your data is really
important.
And the ultimate result is to value convenience
more than privacy. And that goes back to the
smart home thing I mentioned before,
where we, again, value convenience over privacy.
And I want to take a minute and, again, tell
a story about how easy it is for hackers,
or anyone, to get into your apartment through
these devices, from cameras to thermostats,
and get access to your whole network. Because
between the device
and you, there is often nothing providing security.
But what happens when people know what's at
stake and they take action? After Cambridge
Analytica, 25% of US citizens turned on ad
blocking in their browsers and on their phones.
So, I believe
if people are informed about what's happening
and what's at stake, they take action.
Again, I came up with the question: what could
we do to change that narrative and start
fixing this problem? Because it's a huge problem
and we don't seem to take it seriously. So,
we can keep ignoring it, or we can work together
to solve this problem. Again, I want to focus
on four different areas for providing a solution
with the first one being leadership.
What's ethical leadership? Ethical leaders
need to take a stand and say what's right
and do what's right in front of others or
when they are alone. They need
to intervene and take action when something
unethical is happening. So, they need to lead
by example. Ethical leaders need to lead by
example, to define and align with their
own values, and to care for themselves so they
are able to care for others.
Again, let's focus on the first one: lead
by example. Leading by example is the best
way to ensure an ethical business. They need
to promote good conduct no matter what. Leaders
need to be willing to intervene informally,
to steer behavior in their organizations and
resolve emerging problems.
Second thing, define and align your values.
If you are a manager or a leader of any kind,
you don't want people
who do not share your values. You do not want
people who are lying, who are going behind
your back trying to trick you. So, the personal
moral credibility of leaders can be very important
in enhancing the effectiveness of ethics regulation.
So, hire people with similar values. It's
that simple.
Promote open communication. Be as transparent
as possible. Trust can get you really far.
And with trust you can work more openly and
honestly with your coworkers. Be aware of
bias. You can find bias everywhere, in the
real world and in the digital world, from
how we develop machine learning products
to how we treat employees at work. And have
no tolerance for ethical violations. And the
third one, care for yourself so you're able
to care for others. Take your vacation. Have
a steady schedule at work 9 to 5. After 5,
don't expect to receive any emails or make
that apparent to your boss or manager. So,
everyone there can really copy you in a good
way and really try to be like you in things
like that.
So, yeah. Take care of yourself and stay hydrated.
The next area of the solution I want to talk
about is an engineer's code of ethics. In
everything we do, we follow a code. But in
engineering, that's not what's happening.
We need to set a universal set of rules. I
don't want to call them rules, but they are
rules. So, the first rule: be responsible
for the work you put into the world. Value
impact over form. Accept full responsibility
for your own work. So, if you do something
unethical, like trying to steal or read people's
messages or data, anything, you should expect
to be punished.
Ethics in university. Again, this goes back
to how we want the new generation to learn,
and to learn how to treat engineering in a
more holistic way. What I mean is that when
I was at university, I took an ML class, a
machine learning class, and I had a pretty
interesting conversation with one of my professors
where in the introduction of the course he
didn't once mention the bad outcomes
that machine learning can have, or how machine
learning can impact people's lives in a
not-so-good way.
So, he mentioned how good machine learning
is and what we can do to improve people's
lives. But again, he missed the point. And
after having a conversation with him, he didn't
change his mind, of course. But, yeah. That
was my observation from that conversation.
Regulation. Again. It's a really important
topic. And the people who are going to regulate
tech need to get advice from outside
people who work in tech. There are now a lot
of people who want to help regulators regulate
tech, so they need to get advice from them.
They also need a change of mindset in how to
think about regulation in tech: they need
to not set anything in stone, to be highly
adaptable, and for the laws to be highly adaptable.
Because tech is moving so fast, and regulation
needs to move just as fast.
And the last area I want to focus on as a
solution is to educate the public. Educate
them on how to be more secure on the Internet,
and on what it means for tech companies to own
their data. Because tech companies do own
people's data now, whatever they say. And you
are the product if you are using anything
free on the Internet.
Data has become, without any of us noticing,
this sort of almost secret currency of the
modern economy. So, people really need to
take that into account and then take action
to say: no more. Privacy is a fundamental
human right. I didn't say that; Tim Cook said
that. And I think you'll all agree that's
a pretty bold statement. We have a duty to
the humans on the other end of the screen
to build a product that they can love and
trust. Thank you.
