 
I don't know how we follow that, really, but we'll try. Welcome, everybody, to CogX. I hope you're wrapped up well. Here we go: we're going to open the ethics stage. So everyone here
probably knows that data sets are biased
everyone here probably knows that
gendering voice assistants has
consequences everyone here probably
knows that facial recognition can be
discriminatory everyone here probably
knows that there aren't enough minority
groups or women working in AI. Yet a LivePerson survey revealed that 53% of respondents had never even thought about why voice assistants are projected as female, although 85% of them did know that female voices were the default, and recent Element AI research highlights that still only around 12 percent of AI researchers are women. So
the gender and other discriminatory
effects of AI are not perhaps as widely
known as those of us here might think
but even where they are widely known
things have yet to be fixed we're not
here today however to rehearse these
problems we're here instead to tell you
what we think people like us humanities
and social science academics can help to
do about addressing them, and "we" isn't just Clemmie and I. So back in February, on a much nicer day than this, at the University of Cambridge, the Leverhulme Centre for the Future of Intelligence held a workshop on AI and gender. This was
convened with the Ada Lovelace Institute
and supported by PwC the workshop was
what we call transdisciplinary and trans-sectoral: it gathered scholars from a
whole range of academic disciplines and
it brought researchers and practitioners
from industry and research centers
outside of academia as well as key
figures in UK AI governance and policy
and over the course of the day, seventeen ten-minute talks created a wide and detailed picture of the cutting edge
of current research and initiatives into
AI and gender this created lots of
agreement some disagreement a lot of
conversation and a very interesting and
quite phenomenal buzz so we wanted to
take advantage of that communal energy
at the end of the day when we invited
our participants to take part in a
collective intelligence exercise their
challenge was to come up with at least
three recommendations for new areas of
research concerning AI and gender the
result of that challenge eventually became the Leverhulme Centre for the Future of Intelligence report "AI and Gender: Four Proposals for Future Research". The report is currently in press, to be launched at CFI's annual Margaret Boden lecture on the 18th of June, but we're here today to give you guys a sneak preview. So the AI and Gender
report is a collective enterprise and it
outlines four of the weightiest
challenges to gender equality presented
by recent developments in AI and, in tandem, it outlines four academic research proposals which we think would effectively contribute to tackling these issues. The aim of the report is to be informative, but primarily it seeks to provoke academics, research funders, the public and industry to act on these issues. Our proposals are not intended to be prescriptive but to serve as a call to action to address AI and injustice. Another important point to
raise is that although the report
focuses on gender rather than race
ethnicity or sexuality we recognize the
inseparability of these issues so the
report advocates for research to strive
to be intersectional pluralistic
interdisciplinary and trans-sectoral,
which is why it's so appropriate for us
to be telling you about it at an event
like CogX. Throughout the report we acknowledge scholars, organizations and institutions which are currently effectively tackling particular issues, and we aim to keep all work in this field collaborative. The ambition is that the research proposals outlined in the report will be viewed as a tool which incorporates and complements existing
work while highlighting the areas that
still require investigation. In addition, although this work situates examples mainly in the UK context, we advocate that research should be as international as possible. So each of the proposals
begins by summarizing the issues which
we're currently facing and then it lays
out the kinds of research that's needed in
order to address these issues so we
indicate methods aims values and
challenges of the research so we want to
give you a kind of flavor of what these
research proposals consist of. So we've split it into four research themes. The first is what we've called "bridging gender theory and AI practice". Technological design often captures and reproduces controlling and restrictive conceptions of
gender and race which are then
repetitively reinforced. And there's a very interesting parallel between technology's insistence on repeating particular actions and the roots of gender in repetitive social performance, and these two things
reinforce the restrictive mechanisms of
the gender binary and of racial
hierarchies we look at three notable AI
systems or aspects of systems which
repetitively reproduce controlling and
restrictive conceptions of gender and
race humanoid robotics virtual personal
assistants and gendered epistemology
gendered knowledge. So in order to address these issues, we propose research which uses gender theory, and this includes trans, non-binary, queer, feminist and post-colonial theory, to explore the fundamental barriers to equality embedded in the design and purpose of AI technologies. And we note that although feminist theory has already been applied to technological practice, it has been criticized by trans writers for ignoring trans lives, so the use of gender theory needs to be broadened. This research, we suggest,
would also include and pursue multilateral conversations with international
stakeholders technologists and designers
it would seek to understand the
conceptions and definitions of gender
and race that they use and why and how
those conceptions are being embedded
into technological design and we think
this is crucial because there's no point
us just going to them and telling them
what we think we need to listen to their
views in order to figure out how theory
can speak to practice bridging the gap
between gender theory and AI practice
would require then synthesizing these
two strands the theoretical work and the
communication work to produce
research-based practical tools these
would be employed and incorporated into
the way in which these technologies are
designed and used in society these tools
could inform the technological process
at all stages as well as the more
political aspects of technological
creation importantly we think this
research should also explore where
systems should not be deployed or where
they would be inappropriate in the
context of the aim of social justice so
law and policy is the second research theme. Law and policy surrounding AI are currently at an embryonic stage of development. Just this month, the High-Level Expert Group on AI in the EU will put forward policy and investment recommendations about how to strengthen Europe's competitiveness in AI. And, picking up on this, the development of AI is often
associated with economic growth and
intensified political power and there is
concern that these motivations will play
an underlying role in shaping laws and
policies on AI at the expense of other
more socially equalizing motivations and
there has been an abundance of work on
ethical codes which should inform our
technological practice by holding human
values at the heart of development but
there's been little work on how this
could be translated into policy and
legislation. It goes without saying, though, that of course these structures will play an absolutely crucial role in shaping how AI integrates into our societies. So the AI and Gender report proposes that there's a need for research which analyzes existing and emerging legislation and policy relating to AI and gender. Specifically, we suggest this research could include analysis of policies surrounding data and privacy,
technological design, and labor. These areas of policy, we suggest, are currently being worked on in relation to AI generally speaking, but would benefit from additional gender-based research. So laws and policies, we suggest, could be analyzed through two mechanisms. The first is legal gender theory: this could be used to consider how policy and legislation can facilitate equitable AI. Secondly, interviews with technologists, experts and policy makers would function as a way to
gain mutual understanding between policy
makers and technologists regarding the
definitions of gender and how vulnerable
gender groups would be impacted by
certain structural changes this evidence
would inform the formulation of
research-based gender specific
recommendations regarding particular
policies additionally we call for a set
of guidelines for ongoing policy
development which would outline certain
standards to be upheld when designing
and implementing new policy and
legislation which both directly and
indirectly impacts issues surrounding AI
and gender equality. So, and this is our third strand, biased datasets: I opened with noting that datasets are often
unrepresentative of the public
demographic and there's a high level of
data deprivation when it comes to
capturing minority groups. Biased datasets amplify gender and racial inequality and project past and present biases into the future. This is what Caroline Criado Perez's recent book Invisible Women refers to as the gender data gap: whiteness and maleness not only dominate our datasets but cause skews in them. In addition, those datasets can also
be disproportionately targeted at
minority groups so our report proposes
that the collection handling and purpose
of large datasets needs to be further
explored and exposed with regard to how
it's perpetuating gender and racial bias
and discrimination. And we do note in the challenges section that there's an issue of access here which will need to be negotiated. Ethical guidelines relating to AI are often non-context-specific, premised on some kind of one-size-fits-all approach, but we suggest the need for research which results in context-specific, gender-specific guidelines for best practice
regarding data. These contexts would include, we suggest, crime and policing
technologies health technologies and
financial sector technologies these
three areas exhibit multiple instances
where datasets can result in significant
bias and where more work needs to be
done to enhance gender equality through data equality. Guidelines would cover data collection, data handling and subject-specific trade-offs. Prior to setting these guidelines, the report suggests that there's another, theoretical, aspect needed, which is to investigate and align definitions of fairness. So there's the definition of
fairness from the perspective of gender
equality but there's also technical
definitions of fairness, and research needs to be done specifically in relation to historical and current gender issues and tensions, using sources such as context-specific studies and gender theory (we're really getting the challenge here, aren't we, with the weather and now the helicopters). Gender theory and the fairness literature would be used in parallel with this research.
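To make the gap between these definitions concrete, here is a minimal Python sketch, with entirely invented toy data and hypothetical function names, of two common technical fairness metrics, demographic parity and equal opportunity; it illustrates the general idea only and is not taken from the report:

```python
# Illustrative only: toy data and function names are invented.
# Two common technical definitions of fairness can disagree about
# the very same set of predictions.

def demographic_parity_gap(y_pred, group):
    """Absolute difference in positive-prediction rates between two groups."""
    rates = {}
    for g in set(group):
        preds = [p for p, gg in zip(y_pred, group) if gg == g]
        rates[g] = sum(preds) / len(preds)
    a, b = sorted(rates)
    return abs(rates[a] - rates[b])

def equal_opportunity_gap(y_true, y_pred, group):
    """Absolute difference in true-positive rates (recall) between two groups."""
    tprs = {}
    for g in set(group):
        pos = [(t, p) for t, p, gg in zip(y_true, y_pred, group)
               if gg == g and t == 1]
        tprs[g] = sum(p for _, p in pos) / len(pos)
    a, b = sorted(tprs)
    return abs(tprs[a] - tprs[b])

# Toy screening example: 1 = shortlisted, groups "f" and "m".
y_true = [1, 1, 0, 0, 1, 1, 1, 0]   # invented ground truth
y_pred = [1, 0, 0, 0, 1, 1, 1, 1]   # invented model predictions
group  = ["f", "f", "f", "f", "m", "m", "m", "m"]

print(demographic_parity_gap(y_pred, group))        # 0.75
print(equal_opportunity_gap(y_true, y_pred, group)) # 0.5
```

A system could narrow one gap while leaving the other wide open, which is why aligning gender-theoretic and technical definitions of fairness matters before guidelines are set.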
there's a need for research which
analyzes the underlying societal issues
in relation to these data biases and
this is where the social science comes
in this would look to identify the root
causes of these issues for example why
are certain pockets of society not being
captured in current datasets or why are
particular industries collecting and
handling data in a discriminatory way? So our fourth and final research theme is diversity in the AI workforce. So
currently only 7% of students studying
computer science and 17% of those
working in technology in the UK are
female and the current pipeline doesn't
promise a better balance in the future
Last year PwC did a survey of 2,000 A-level students in the UK, and only 3% of this sample said they would pick technology as their first choice at university, which is no surprise considering only 16% had had it suggested to them as a career option.
Diversification of the AI workforce will
be vital in order to design and
implement technology which is equitable
those involved in designing future
technologies are dictating and framing
how society functions. Diversity is important because it brings deliberation to the table. And additionally, a lack of
diversity exhibited by an unvarying
workforce alienates those who do not fit this image. At the current rate,
only be aggravated and enlarged by an AI
labor market which fails to reflect a
diverse population. This report advocates that there is a need for research which, firstly, investigates the psychological factors surrounding diversity in STEM education and the AI labor market. This would explore biases from both sides, looking at what factors motivate women and minority groups to pursue these subjects and occupations, and what biases
impede diversity in the industry through
things like application processes and
subconscious bias in the process of
recruitment the report suggests that
this could be done through a combination
of both qualitative and quantitative
research so that the data can be
analyzed to discover any causal
inferences between the main statistical
bottlenecks and the psychological or
cultural reasons for these barriers so
the quantitative data collection we
suggest could take place in schools
universities and workplaces to gather a
wide range of data concerning different points in the pipeline. The qualitative data, we suggest, could be collected through focus groups in schools and universities, and this would allow students to
share their experiences of subjects and
teaching and institutional culture and
we also suggest the use of surveys and
discourse analysis in organizations in
order to look at as I said things like
recruitment processes and promotion
systems and workplace culture secondly
we suggest that there's a need for
research which explores mechanisms to
embed a culture of diversity
and when it comes to eliminating bias
there tends to be a certain reliance
upon balancing numbers of people in the
room, and we note in this report that although this is definitely important, research should also consider how to create a sustainable culture of diversity which can be embedded in educational institutions and in workplaces. Thirdly, and lastly in this section, we suggest there's a need to audit the current initiatives which are aiming
to address this imbalance in education
and careers in STEM and AI. Inspired by the work of Iris Bohnet, we argue for the use of randomized controlled trials to identify which of these initiatives are having the greatest impact, and this would also give an indication of how interventions could be altered to optimize their impact.
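As a purely hypothetical sketch of what the analysis of such a randomized controlled trial might look like (the data, outcome and numbers below are invented, and this is not drawn from the report or from Bohnet's work), one could compare an outcome between randomly assigned treatment and control groups and check it with a permutation test:

```python
import random

# Hypothetical RCT analysis sketch: all numbers are invented.
# Outcome: 1 = student reports intent to pursue a tech subject.

def mean(xs):
    return sum(xs) / len(xs)

def permutation_test(treatment, control, n_perm=10_000, seed=0):
    """Two-sided p-value for the observed difference in means,
    estimated by randomly relabelling participants."""
    rng = random.Random(seed)
    observed = mean(treatment) - mean(control)
    pooled = treatment + control
    extreme = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        diff = mean(pooled[:len(treatment)]) - mean(pooled[len(treatment):])
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme / n_perm

treatment = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]  # attended the (hypothetical) initiative
control   = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]  # did not attend

effect = mean(treatment) - mean(control)    # 0.4 on this toy data
p_value = permutation_test(treatment, control)
print(f"estimated effect: {effect:.2f}, p = {p_value:.3f}")
```

Because assignment is randomized, the difference in means estimates the intervention's causal effect, which is exactly the property that makes RCTs attractive for auditing initiatives.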
so as we embark upon rapid technological
development now is the moment to address
these issues. Our report consolidates the aims of that AI and gender workshop back in February: it scopes and
situates current research and
interventions it identifies where
further research and interventions are
required it refines the nature of the
relationship between AI and gender and
it acts as a call to action to tackle
injustice. Some of this, at least, can be done by academics, but we can't do it alone, and we can't do it without collaboration, or indeed without funding. We want to collaborate, and that's why being in spaces like CogX is so important moving forwards. To end, though,
we want to look backwards for a moment
In a BBC lecture in 1954, in which he called attention to the growing global threats of his day, such as nuclear weapons, the Cambridge philosopher
Bertrand Russell claimed all equally are
in peril and if the peril is understood
there is hope that we may collectively
avert it. "All equally are in peril": at least in relation to AI, that's not quite true. Research into AI and gender, into AI and race, is crucial in order to expose the ways in which we are not all equally imperilled. "We": that's a very deceptive pronoun. In 2015, the late Stephen Hawking used it when commenting on AI. He said the
it when commenting on AI he said the
real risk with AI is a malice but
competence a super-intelligent
AI will be extremely good at
accomplishing its goals and if those
goals aren't allowed aligned with ours
we are in trouble
who is this our who is this we and is it
really so clear what our goals might be
the real risk with AI to adapt Hawking's
comment is not malice nor even
competence, but the fact that the "we" whose data informs it, the "we" who design it, the "we" who implement or regulate it, the "our" of our shared goals, are all
implied to be universal but actually
denote a very small subset of the
world's population. Research into the different gendered effects of AI technologies, the reasons for them, and how to fix all this is urgently needed
if it's not just going to be some of us
who are going to be in peril in an AI
future
So we're going to hand over now to my colleague Dr Kanta Dihal, who is going to chair a panel discussion which will pick up some of these themes and hopefully expand and explore beyond them. Thank you.

So, I'm Kanta Dihal. I'm a postdoctoral researcher at the Leverhulme Centre for the Future of Intelligence, where I lead the research projects Global AI Narratives and Decolonizing AI, and as Sarah said, we
are going to be discussing some of those
issues just mentioned a bit more in
depth from the perspectives of different
sectors
I'm Gina Neff, I'm a professor at the Oxford Internet Institute and the Department of Sociology at Oxford, and my issue really is I'm starting to work on how technology and AI influence work, workers and workplaces. [inaudible]

Hi everyone, my name is Carly Kind. I am soon to be the director of the Ada Lovelace Institute, which is
the body dedicated to ensuring that AI
works for people and society. So we've just heard four research themes outlined by Clemmie and Sarah. Which of these stands out to you as the most important for your specific sector, and why?

I want
to just take up and amplify something
that they emphasize in the report which
is AI has the capacity to repeat
perpetuate and introduce new kinds of
gender biases, and I think that provides us, as Sarah charged, with an opportunity to really think about what
kinds of values goals and objectives are
getting built into technologies today
that will really influence social
conditions tomorrow.

We should actually contextualize where AI is going to operate, because we are placing it in organizational contexts that are far more complex than we think they are. So when we talk about workforce, we should not only think about the technical skills; we should be thinking of everyone who is helping shape those technologies, from business analysts to translators to the testers themselves.
And I think that, you know, looking at the pipeline and the findings of the report, and the fact that the pipeline takes time to populate, we have to be patient. We have to look at how we can boost that diversity by other means, not just by looking at the value chain and at early education. Is there any way we can address it with the existing workforce? Do we have to, you know, equip the existing workforce with enough skills for gender-related issues? I think it's important to have a very open mind.

When I first look at boosting the workforce in AI, as Maria says, I think
having women in the room at every stage
has to be at least a starting point and
acknowledging what Sarah and Clemmy said
it's not enough it's not sufficient but
it's an essential prerequisite. Two stories come to mind to demonstrate that. The first is in Caroline Criado Perez's book Invisible Women: she talks about a mathematician, Daina Taimina, who was
the first to look at hyperbolic planes
and realize that she could demonstrate
what a hyperbolic plane looks like
through crochet which was something that
she had grown up doing and she opened up
the eyes of scientists and
mathematicians to think about how to
design and illustrate a hyperbolic plane
in a new way, in essentially a way that only a woman could, because she had the experience in a, you know,
traditionally feminine craft and it made
me think about Ada Lovelace herself the
namesake of our institution, who looked at Charles Babbage's Analytical Engine and could see the analogy with the Jacquard loom, which was a weaving machine and essentially a universal machine that could run certain programs. And again, only a feminine eye, or a woman who had exposure to traditionally feminine disciplines, tools and trades such as weaving, could look at that and understand it in
a new way so I think those two examples
really demonstrate why having a woman in
the room, and a woman's vision and a woman's sight on everything that's being designed, and on law and policy, and all aspects of the discourse, is really essential
to opening up new ways of looking at
things as the report says and as Carly
just said having women in the room is
the start and it's not sufficient and
one of the things that I think is really
important about this report is it
reminds us that the stories we tell
about technology are really important
for shaping how we think about how the
technology can be used how it can be
challenged how it can be adopted and I
think this is so true even in these stories as well. If we think, for example, of the story about automation saying that mainly male factory workers will be put out of work, that's a
great story it's a story that has
animated a lot of the conversation about the future of work, and yet it's wrong. The recent report from the Office for National Statistics finds
that of the jobs at most risk for
automation 70% of them are held by women
And that's part of the kind of story that we need to start telling. We need to scope the problem around the gender dynamics of AI not just as "we need more women to participate in AI" (we need that), but we also need to have a much bigger conversation about the kinds of organizations we want, the kinds of companies we want, the kinds of governance we want, the kinds of society we want: cultures that account for our needs as workers, given that we work differently than men.
then we should extend our interest
beyond data and models into processes
into cultures and norms to understand
what do we need to change to actually
foster that culture that's inclusive and
diverse. Because otherwise, even if we prepare a workforce that's diverse, when they actually land a job in an organization they might not perform to the level we expect, because the culture doesn't help them.

Right, so on this note,
I mean for instance I can't crochet or
weave and I have very little knowledge
of how a Jacquard loom works; perhaps we should just teach all kids in school how to crochet in case they become mathematicians. But on that note, again, simply
putting a woman in the room will not
always solve all the problems we need
people in the room who are able to speak
to these problems who are preferably
experts on exactly these kinds of
problems that need to be addressed
because not every woman is a gender
expert. Now, as Sarah and Clementine pointed out, one of the issues is, and this is highlighted in the report as well, that gender theorists, so experts on gender, and AI practitioners don't talk to each other enough. Now why do you think that is the case, and what could be done to improve that at every level?

I
think we have to remember that people
building and designing AI technologies
actually have their hands full right now
I mean, they're working really hard on technical challenges,
and that's not to absolve people
involved in industry and research from
the really hard and thorny ethical
problems but we can't get to better
technologies if we have very siloed
conversations where some people are seen
as doing the soft side of AI and some
people are seen as doing the really hard side. What I find in my work is that when
you open up the space of what's
considered design and what's considered
technology you start to find and see all
sorts of amazing stories so case in
point: I have a brilliant now-former doctoral student, she got her PhD on Thursday night, Samantha Shorey, and she did a brilliant design workshop at the Computer History Museum in Silicon Valley, where she looked at the coding work that took place to make the core memory programs that navigated the Apollo missions to the moon. This core memory was woven core memory: it was literally woven wire, with warp and weft being ones and zeros. The engineers designed the program; the women, they called them the "little old ladies", literally wove this massive computer program, this massive core memory, to power the Apollo missions.
what they did in their design workshop
they literally brought some of the the
engineers into the room and along with
the public along with historians and
they handed them cards and weaving just
like you might get from a child's
weaving kit and they said okay here's
your little bit of code now at ten times
the scale ten times larger see if you
can weave a computer program like the
quote unquote little old ladies did and
guess what they failed but they what
they learned in the process was that
that technical skill was just as
important as the mathematical components
So we've developed a set of stories about what counts in our, you know, highly advanced, high-tech society: that some kinds of skills count and others don't. Building community? That doesn't really count, even in an economy where we're running things based on people and communities. Content moderation, content takedown? That doesn't count, even though Facebook can't survive without that work. And so I think one of the things we need to do is have a serious reflection on who's doing the work, where they're doing the work, and can we start to recode and recount some
of those types of jobs that women are
holding as part of what we need in our
technological society.

And I think we've been discussing what has changed. First of all, we've had this explosion of AI adoption, with a hundred million smart assistants being sold this year alone, so that means on one hand we have the consumerization of AI. And on the other hand we had #MeToo, right, and then suddenly we get more acknowledgment of the abuses that women have suffered, you know, starting with the city of dreams. And in this context we're actually going through a period of more focus on those programs of diversity and inclusion, which in time will produce that culture, going back to what I said
earlier. And I think on one hand it's acknowledging all the efforts being done, from board level to middle management, on fostering diversity, and actually bringing the new considerations on gender into those programs that already exist, almost like piggybacking. I love
this expression because I think it helps
us you know benefit from the energy and
the focus that's already there and
second, very important, is how do we translate the great research you guys are doing, and how can we bring the decision-makers to meet the researchers, so that we actually feed this very important knowledge into action. And, you know, groups like CFI and others are the best places to foster this dialogue, to make sure that we have that round and diverse audience, like we had for the workshop in February, and make sure that we are connecting with the people who are on the ground doing the hard work.
just to pick up on one small thing which
you talked about board representation I
think we need to think also about
incentives what are the incentives for
AI companies in particular in the
private sector to bring women's voices
into the room
Women-founded companies received 2% of venture capital funding. What incentives are there for those companies to increase representation of women on their boards, within the team, and amongst founders as well? You know, research shows that women-founded companies receive the same levels of second-round venture capital funding as non-women-founded companies, so there's equal virtue to the products that they're creating, but they don't get that foot in the door in the first place. And I think we need to
change the incentives and one way to do
that is following the money.

This reminds me of a company, which I maybe should not name, but which is competing with taxis rather controversially, who were looking for a female CEO, and the headline last week in one of the newspapers was that their hunt for a female CEO was down to a shortlist of four men, which was because all of the female candidates that they had approached had said no, because they did not feel incentivized enough to work for this particular company as CEO.

I
think it's important to start empowering
women to be part of the decision-making, and not just to address diversity as a token. We have to be honest: if we are real about tackling gender bias, we have to start with the board, we have to start empowering women to be at the table where the big decisions are being made, and also, to build on what you said, we need measures.

So, Maria, this discussion has been going
for a couple of years now we have been
all alerted to the fact that there is a leaky pipeline, that there is a lack of diversity; initiatives have been started, lists of principles have been made. What has changed over these past few years?
What has improved, if anything?

I think that, as I said before, the fact that we now have this focus on diversity is in itself excellent. We've seen a flurry of new initiatives, within AI obviously but far beyond it as well, to address diversity throughout the value chain. So we have a fantastic initiative, I couldn't be more proud to mention it every single time, which is the Tech She Can Charter. We have a hundred organizations that have signed up to it. It has three streams: one addresses policy,
and we've worked with the APPG on AI on STEM;
the second is updating the curriculum for schools; but the third, which is very important, is role models. We should not underestimate the role of role models, like all of us in the room and so many fantastic women, like Jess here (hi Jess!), building our profiles and inspiring young women to step in. And we have a young generation that's waiting to be fostered, to be given the voice to believe in their power. So I think it's important
to acknowledge that as well. But then we started looking, as I said, from board level: what KPIs do we need to have in place, and who is accountable for those KPIs? How do we measure that across the value chain, in recruitment, in culture, in empowering women to speak? Do you actually know that in regular meetings, women are empowered to speak only 20 percent of the time? We have to challenge
that we have to again go and say you
know give women a voice and those
initiatives have started to pay off: we've seen an increase in the number of female partners at PwC from 13% in 2013 to 20% now. I know it's 7 percentage points, but I think it's a phenomenal increase. And I think there are many other examples where we're looking at how we look throughout the value chain and make sure that every single part is addressed, so that we have sustainable diversity in the long term, not just small actions that produce a bit of impact now but are not consistently designed.

Great, that's encouraging. Maria
said earlier this morning that we have
to move from gender ethics, gender diversity and ethical behavior being simply something that's seen as extra in business to being business as usual, and that's when we'll know that we've made real change. And so while these initiatives are starts, they help companies, organizations and even society move to places where we can start to change culture and change attitudes. So one initiative that
amplifies some of the work that we see
in corporate initiatives is a program
I'm involved in called the Women's Forum
they have an initiative on AI and gender
and the hope is that we can hold
companies accountable asking them to as
a start account for their own data
processes and practices around gender
and AI so can they actually do internal
audits and then share the best practices
and results with a group of other
partners so the partners in this
initiative includes some of the biggest
technology companies Microsoft and
Google but also UNESCO whose recent
report on gender and AI I think along
with the one that Sarah and her team are
releasing in two weeks really helped
pave the way for how we can start to
scope the problem of technology either
repeating and perpetuating these kinds
of things we want to change or being the
lever that we used
make that change so perhaps this is a
question that speaks for itself but I
have noticed that there are still a lot
of people who need convincing that this
is actually important rather than just
gender diversity making things look good
So why is it important for everyone that there is gender parity, diversity, inclusion, and empowerment, and what specifically does that bring to improving AI and its impact as a technology?
Well, without making overblown claims, I think we probably all
agree that AI is going to fundamentally
change almost every element of society
and I think that there's a real risk
that technologies can become rejected by
societies unless there is legitimacy for
their adoption and I think building
public legitimacy in a social license
for AI both in the private sector and in
the public sector requires buy-in from
people across society and half of those
people are women so I would have thought
it's as simple as that if you need to
make a business case for it I think you
can also do that you can say that you
know a particular product is not going
to have buy-in by women it's not going
to reflect needs of women and therefore
it's not going to you know find a place
in the market and you know if you need
to make that business case I think it's
quite evidence as well but if we think
about AI in public institutions I think
you know the success of AI at the
adoption of AI is going to depend
entirely on whether it has public
legitimacy amongst the public fifty
percent of whom are women and we have
the figures I mean we run a research
that diversity is not a problem but the
solution and we're actually demonstrated
with numbers that it was the value
creation innovation anything so it's
just someone needs to acknowledge in
Brian run away with it right it's as
simple as it is I don't think we
necessarily
need convincing we need to know think
about why we're building AI, right? We all believe that AI should be built to help with the flourishing of humanity, and humanity is 50% men, 50% women; therefore, if we want everyone to flourish, we have to acknowledge the needs of women as well. So I think we are on the right track, but it's like nine pregnant women cannot give birth to a child in one month, right? I think we have to give it time and actually really be brave, not necessarily get scared when an algorithm is biased towards men, because we need to understand how those solutions work and be able to correct them in a positive way. Because I think there is a bit of this scare element to it: that if it's not gender balanced we should drop the solution, rather than understand what's wrong with it and how to fix it. We have to be a little bit
more positive we are here we're going
through a major change not just in terms
of the binary gender we have a different
view on many of the traditional values
of society, and that takes time. Now, I remember something I read yesterday that Einstein said of the 20th century: it's so sad that we can smash an atom but we can't smash prejudices. Guess what, the 21st century is
gonna be so different because look at
how much progress we have made in the
last five years alone and if we don't
continue to make that progress I fear
we're gonna have a society where we
don't trust women I think that's what
the stakes are: are we going to build a society where information and data built primarily around a very small, powerful group in society set the terms of the debate for everybody else, and everybody else is an anomaly outside of that norm? And so our role now, I think, for everyone in this room, is to join in a charge that ensures that, by design, we are building technologies that serve the needs of all people across the gender spectrum.
Could you elaborate on that,
trusting women? So is it that people don't believe women as authorities, is that the danger, or...?
Well, if we think about how AI is rolling out in a data-saturated world, we're seeing how we trust certain kinds of information shift. And I know later in this conference, on this stage, we're going to have conversations about deepfakes, we're going to have conversations about content moderation, about other kinds of ethical challenges. Are we building a world in which people who see serious challenges or ethical issues with data and information are disempowered by the technologies that are around them? I fear, when we see numbers like the ones that came out from the ONS showing that women disproportionately hold the jobs, the frontline service and retail sector jobs, that are at risk from automation, that we're seeing the loss of a lot of rich on-the-ground knowledge about how the world works, put at risk for something that's relatively shallow: digital knowledge built from systems that overwhelmingly privilege male, white, Northern kinds of knowledge
rather than what we see around the world
and a better reflection of humanity so
the report calls for future work to be intersectional, interdisciplinary, collaborative, international and, I guess most relevant for this panel, trans-sectoral. So how can we collaborate between different sectors to address the issues that remain? I'll
jump in with one quick suggestion, which is: I'm questioning the extent to which ethics should be in a different room, or on a different stage, than conversations on AI. I think it's great that it's here, it's a starting point, but we need to force these conversations onto the people that don't want to have them, and not onto self-selecting audiences that are already interested in them. And that's all critical: I think having an ethics stage is a great step, but I'm worried that the right people aren't in this room.
the work that my team's been doing
really shows that if we want to build
better data science and AI products we
have to bring social science and
humanities knowledge into the core of
that work, and that we don't get to better technology design without a better understanding of people. We have a lot of people in the world who understand people, including people themselves; I know that sounds recursive, but that, I think, is partly about broadening out the conversation on building that better tech that we have to have.
No, ethics for AI should become a given, so we don't have this stage next year because all AI is going to be ethical. But in order to get there, there's a bit of work to be done, and I'll continue saying
I'm keeping my hopes high that we can find that dialogue and translate the amazing work the researchers have been doing for the decision-makers, so that we make sure that whatever is actionable is addressed now, as those solutions are being adopted. And the adoption of AI is going to surprise us all. I mean, my data science team keeps telling me they are surprised how quickly those solutions can come to life. Again, a hundred million smart assistants sold, with two hundred million next year: I think that shows you that we don't have a lot of time, that AI is here. So we need to be acting on all the research being done, in one way or another, and you guys, CFI and others, are again great places to have those conversations. I encourage all of you to be part of those conversations and provide feedback and insight, because we need you; we need to build it together, right? We're not building an ivory tower. Thank you.
So we have come to the end of this panel, and the main takeaway is of course that next month, mid-July, the AI and gender report will be published. Do have a look: it's a vastly encompassing and encouraging report, shaping what the field of AI and gender should be doing in the coming years. I also love that the other takeaways from this panel are (a) that we should all learn to knit and crochet, and (b) that we should not all be here next year; next year we do indeed hope that all AI will be ethical. Meanwhile, please join me in thanking our panelists.
