[MUSIC PLAYING]
KAI: Up next, we
have Tricia Wang.
She is a technology ethnographer-- I love that term.
Makes you really think.
She's the co-founder
of Sudden Compass where
she's been helping enterprises
unlock opportunities
in their big data.
So she's really bringing
a human lens to data
and helping companies to get
the most out of their data.
She's worked with companies like
P&G, Kickstarter, Spotify, GE.
And she's also been an
IDEO expert in residence
in the past, so she's been
doing a lot of things.
And one of the reasons why I was so excited to invite Tricia here is that how you bring in and utilize data in a sprint is really, really critical to the outcomes.
And I have struggled with that
myself, as a Sprint Master,
figuring out different
ways to do that
and how to make the
most of our data,
and empowering and giving
Sprint Masters new ways
to think about that,
new ways to think
about collecting that data
and answering those questions.
So I'm very excited
for Tricia to be
bringing her perspective here.
And if you could all join
me in welcoming her--
thank you.
[APPLAUSE]
TRICIA WANG: Thank you, [INAUDIBLE], for that generous introduction.
It's really great to be
here in this community.
I think these kind of spaces
are actually really rare.
I don't know about you guys, but
I go to a lot of conferences,
and they're usually
big and impersonal.
But to have something this small
and curated is really lovely.
And I have not been to
something like this in a while,
so thank you to Kai and her
team for organizing this.
And you guys have brought
really amazing people here.
I want to be clear that I am not a Sprint Master, nor am I a master facilitator.
You guys do this
for your living,
and there's a real art
and science to this.
And many of you
I've learned from
and I've watched over the
years, so I want to first say
thank you.
We just happen, at my company,
to use a sprint-based approach,
but we don't sell sprints
as part of our work.
So I want to be clear
that I'm really here
to learn from you guys.
But what Kai has
asked me to do is
to really share our approach.
So I think it's a
bit unfair that this
is me talking for
most of the time,
so I hope that we
can talk afterwards
about what resonated with
you, what could you bring back
into your approach.
And if you have questions, or anything that strikes you, or pictures, feel free to reach out-- these are my Instagram and Twitter handles.
I'm reachable there, too.
So usually, my talks
are about the importance
of understanding
human beings when
building technology products
and showing organizations
the destructiveness of what
happens when they over-rely
on quantification.
But today, instead of talking
about how we, as a society,
have arrived at this
point of quantification,
and instead of talking about
why it is so important to not
engage in
over-quantification, I'm
going to take a different route.
I'm going to talk about
what we do at Sudden Compass
to solve this problem.
Kai, being as
convincing as she is,
said that this was truly a
safe space of practitioners
who would want to
understand actually
the guts of what we do.
Because I'm pretty
allergic to any time
consultants come
up and talk about,
this is what we
do in our process.
But she swore to me,
this is the place
where this would be welcomed.
So I'm going to share
a bit about what
we do to help enterprises
work with data
and to solve business problems
in a human-centric way.
And her ask was
really well-timed,
because our goal this
year at Sudden Compass
was to open-source
some of the practices
that we've been testing and
refining over the last five
years.
And now, we really want to
make this available for anyone
anywhere in the
world to leverage it.
So usually, while
we only reserve
these kind of things for
one-to-one discussions
with our partners or
potential partners,
we're really excited to
share this in a community.
And so after this, we'll be
working with Kai and her team
to put our stuff up on the
Google Design Sprint site
so that you guys can
make use of this.
So I want to start off my talk
today with this question of,
how do we empower humans in a
data-centric and data-driven
world?
And this is essentially
the one question
that we obsess over
at Sudden Compass.
My partners and I--
we've been working
to answer this question
with companies,
ranging from startups
to Fortune 50 companies.
But our story does
not start with data,
it starts with donuts.
I had been business
dating for a few years,
searching for the
perfect business partner.
And after being ghosted and
disappointed by several people,
one of my good friends
said, Tricia, don't give up.
There is a perfect business
partner out there for you.
And I think you should
talk to my best friend.
So Jason Li said, my best
friend is Matt LeMay,
and he loves talking
about data all the time.
And he also has seen some
of the problems you've seen.
I'm like, how is that possible?
So I arranged for the first chat to happen at Dough.
And has anyone been to Dough?
Yes, a few people.
So is it not the best
donut like in the world?
Right?
It is seriously the
best source of donuts.
It's in Brooklyn,
Bed-Stuy, New York City.
If you're ever there,
please call me up.
I live down the
street from there.
And I made a rule that the
only way I can go there
is if I take someone new
to introduce them to Dough.
Because the first year
it opened, I'd be like,
oh, I have to walk by Dough.
And so this is Matt's
face when he took
his first bite of the donut.
Clearly, it's like the,
oh, donut face, you know?
And he agreed.
And we've been talking about donuts and data ever since.
And if you haven't
had the pleasure
of meeting Matt in person,
you might recognize these two
excellent books that he's
written about some of our work.
And Matt's
background, as you may
have guessed from
the second book here,
is in product management.
And mine is what's
traditionally known as research.
And while the work we had been doing is so different-- it comes from different parts of companies-- the overall trends we were observing in the modern enterprise (and when I say modern enterprises, I'm also including startups) were shockingly similar.
We both saw organizations
talking really big games
about innovation and
technology, but falling back
into business as
usual, focusing mostly
on the reality of
just optimization.
So companies were
being like, we're
going to disrupt
business as usual.
But really, what they
ended up doing was just,
we're going to optimize
business as usual.
And as a researcher,
the way I was seeing it
was that I saw companies
heavily investing in big data
tools that were only used
to optimize their existing
business model, rather
than actually engaging
entirely new customers
in a new way.
And I saw budgets for anything
that was just qualitative
being slashed and
just being given
straight over to quantitative.
And in particular, at the
beginning of my career,
I quickly realized
that businesses usually
don't put researchers
in a strategic role
to drive growth, which is why I created this new title, "tech ethnographer," because I thought it was better to create confusion than to have them put me in the box of a researcher.
Because at least they have to ask, why?
What does that mean?
And if they don't ask me why, I'll know that they didn't listen to what I said at all, because no one knows what a tech ethnographer is.
I made it up.
And then as a PM, Matt was
seeing companies over-rely
on A/B testing and other
optimization methodologies.
He had been an API
evangelist at Bit.ly,
had led Songza through
their acquisition by Google.
And he's seen it all.
So he saw that rather
than using data
to really discover customer
problems and to solve them,
he just saw that companies
were using data to optimize
their existing business.
So real quick--
I'm curious.
Raise your hand if
your organization
loves to talk about innovation.
So at least half the room.
And raise your hand if you feel like the reality of your day-to-day work lives up to this rhetoric.
One person, so far.
What's your name?
We should actually tell
everyone who you are.
What's your name?
[LAUGHTER]
Wait.
Can I see your name?
What's your name?
AUDIENCE: John.
TRICIA WANG: John what?
We need your last name.
AUDIENCE: Elongey.
TRICIA WANG: Elongey.
Where are you from?
What organization are you with?
[LAUGHTER]
AUDIENCE: I make
video here, right?
TRICIA WANG: Oh, so
your organization
lives up to-- that's amazing.
So Google lives
up to the reality.
But not many.
AUDIENCE: [INAUDIBLE]
TRICIA WANG: Oh, I'm sorry.
I thought you said
you make videos here.
[LAUGHTER]
AUDIENCE: No.
I work at Cisco Systems.
TRICIA WANG: Oh,
you work at Cisco.
Great.
So that is amazing that
Cisco is doing this.
So we should all be talking to John Elongey afterwards.
We all saw his face.
Most of us don't work at organizations where the actual actions live up to the rhetoric.
And if you look
around, you'll see--
for the people who
didn't raise their hands,
which is like
99.99% of the room--
is that we're not alone.
This is something that a lot
of people and organizations
are struggling with.
And if innovation were seriously that easy of a problem to solve by throwing more money and more data at it, then we would have solved it.
We would have totally
solved innovation by now,
because throwing money at
data and technology is easy.
Just about every single
company in the world
is doing this through
some kind of project,
through some kind of
digital transformation work.
But more data and
more technology
does not mean more innovation.
Now, I want to be clear.
I know this is a super
controversial statement.
You have to understand, it's not easy-- think about the courage Kai opened the first day with-- to walk into a company and tell them, hey, all those things you're doing, throwing more data and technology at the problem, may not actually make you better or more innovative.
This is really controversial.
And if you look at some
of my other talks online,
this is the core
message I work on,
which is the topic
of my upcoming book.
And so this is something
that is very controversial.
I'm not going to go into why, but I want to leave you with one thing to explain why this is the case.
Our friend Maciej Ceglowski, who runs one of the awesome companies in the world, Pinboard, has pointed out one very clear example that shows how this is true.
And he talks about the
pharmaceutical industry.
He talks about how big pharma has been seeing a decrease in the ROI on their investments.
So even though pharma is putting more money into drug innovation, fewer drugs are being developed.
How can this be?
They were putting more
money, more technology
against this problem.
And the answer is in the
logic of Eroom's Law, which is
Moore's Law spelled backwards.
[LAUGHTER]
And just as a quick refresher--
we all know Moore's Law
where Moore's law is
saying, hey, look,
processor speeds double
every two years in proportion
to the money that we
invest in chip development.
So essentially,
Moore's Law is saying,
hey, the more you
spend on innovation,
the faster chips become.
So that's Moore's law in red.
What Eroom's Law says is that drug development efficiency halves roughly every nine years.
So the more you invest in drugs, the fewer drugs you get.
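The gap between those two trends is easy to underestimate. Here is a rough back-of-the-envelope sketch-- the 2-year doubling and 9-year halving periods are the commonly cited approximate figures, not exact constants:

```python
# Rough sketch of the two opposing exponential trends.
# Assumed approximate parameters: Moore's Law doubles chip capacity
# about every 2 years; Eroom's Law halves drugs-per-R&D-dollar
# about every 9 years.

def scale(years: float, doubling_period: float) -> float:
    """Growth factor after `years`, doubling every `doubling_period`
    years (a negative period models halving)."""
    return 2 ** (years / doubling_period)

chip_growth = scale(30, 2)    # ~32768x more capacity over 30 years
drug_decline = scale(30, -9)  # ~0.1x -- roughly a 90% efficiency drop

print(f"chips: {chip_growth:.0f}x, drug R&D efficiency: {drug_decline:.2f}x")
```

It's the same curve with the sign of the exponent flipped, which is what makes "Moore's Law spelled backwards" such an apt name.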
And the reasons for this
are multiple and complex.
And it's quite plausible
to be like, hey, Tricia,
yeah, that's just big pharma.
There's a lot of
regulation there,
so that's a little
different from what we
do where it's less regulation.
But I would say, look, we're
seeing this across the board.
This is the trend in
every single industry.
According to research from Anne Marie Knott, a professor of strategy at Washington University, returns from investments in R&D have declined 65% over the past three decades.
This is an alarming number, because it means that no matter how much we pour into innovation, into digital transformation, into sprint work, into being more agile and more lean, something is falling short.
How can we be so into innovation
and be getting less out of it?
And Anne Marie's short answer for why this is happening comes from her book, "How Innovation Really Works," where she says the answer is simple: companies have gotten worse at innovation.
Now, innovation looks
a lot like pharma, even
for non-pharma companies.
And the reason for that is
actually really, really simple.
With so much at stake, with so much customer unpredictability, and with the size of the enterprises-- and especially if you're a startup, with so much capital and risk taken on-- it's oftentimes just easier for companies to invest in optimizing things that they already know work than to try to discover new things altogether.
So pharmaceutical companies,
like many modern enterprises,
are stuck in the trap of
optimizing existing solutions
and calling it innovation.
The inability to break
out of optimization
is directly related
to another challenge
that we see at companies.
It's that all these companies
are claiming to be data-driven.
Everyone is really
excited to say,
we're now a data-driven company.
And what that often
means is we're
quantitatively data-driven.
It does not mean that
they're insight-driven.
There's a humongous
difference between being
data-driven and insight-driven.
I'm not going to get
into those details now.
Those are part of other talks.
But the point is that when companies talk about being data-driven, they're only referring to quantitative data.
And this means that
anything qualitatively new,
anything that can't show up
in their existing models,
anything that doesn't fit in
a spreadsheet, like smiles,
or tears, or stories, or trust--
we've been talking
a lot about that--
that doesn't fit
in a spreadsheet.
Anything that's emergent like
that, it can't be seen at all.
And they wonder why they
struggle at innovation.
And now, I've encountered this
bias towards the quantifiable
a lot in my career.
Here's one example I keep
coming back to that really
made me have the aha moment.
It was back in 2008,
and I was working
as a researcher for Nokia.
I'm not going to go
into the story now.
I'm going to give you
the one-minute version.
You can find the
full story up online,
as it's a case study I've
talked about extensively.
But the 30-second
story is that I
was sent to China when I
was working at Nokia in R&D
to gather qualitative data
to understand what customers
wanted.
And Nokia sent me
there, and I went there.
And here's some pictures
from my fieldwork of doing
ethnographic qualitative work.
I worked as a dumpling seller,
which is a street vendor.
I lived in slums.
I lived with
construction workers,
trying to understand
secondhand cell phone markets.
And let me tell you, if you were me, if you had been there, you would have seen it too.
It couldn't have been more obvious that Nokia was going to go out of business, because in a few years, everyone was going to want to buy smartphones.
And so I went to Nokia
and I told them that.
I said, look, you're going to go
out of business in a few years.
And I did it, of course,
much more diplomatically
and strategically.
But I essentially said,
here's all my studies.
Here's all my research.
And here's my
business plan for what
you can do to
start experimenting
with a new business model
of working on smartphones.
And they were like, you're crazy.
The iPhone had just come out, and they're like, the iPhone's a fad.
A lot of smart
people thought that.
And look, they said, we have
millions of customer data
points from marketing,
from analytics,
from our BI departments.
Not one single person is telling
us that our business model has
to change and that
our customers--
because we're number
one in the market--
are going to want
to buy smartphones.
And I was like, of
course, because I
have been collecting
qualitative data that's
beyond your existing
business model.
So of course, I'm
looking at stuff
that's invisible, that's
unknown, and that's emergent.
So it's OK.
Let's work together
and let's actually
start validating the
stuff I'm seeing.
But they were like,
no, no, no, and I left.
We all know how the
story of Nokia ends.
By the time that their
quantitative models had
been adjusted to account for
emergent qualitative insights,
it was too late.
So imagine what would have happened if Nokia had used qual and quant together to become insight-driven.
Imagine what would
have happened.
I think they would have
been the first to pick up
that there was a demand
for smartphones in China.
And maybe we would all be doing
the Nokia Sprints conference
today, instead of
Google Sprints.
I mean, who knows?
But we'll never know.
What I came to realize through my experience at Nokia was that this was not unique to me or to Nokia.
This problem of not being able to understand anything that doesn't exist in a mathematical model was so widespread that I gave it a name: the quantification bias.
And this is the topic of the book I'm writing.
This is a modern bias, specific to the data-driven environment of enterprises and businesses: the unconscious belief that the measurable is more valuable than the immeasurable.
And we often see it in our work.
It's when someone becomes so focused on the measurable that they can't pay attention to any other evidence, even when you put it right in front of their face.
I studied this a lot at one of my first jobs, at NASA.
I've talked about this in connection with the Challenger space shuttle disaster: the quantification bias was at play even at large organizations like NASA.
And I saw it again later in my career, at Nokia, and then at other companies.
And has anyone ever experienced
a quantification bias
in your work?
Yes.
It's like, when you know
you've seen something,
and then someone's like,
oh, could you put it
into a PowerPoint with a chart?
Could you put some numbers
and get us statistics on it?
And so this is
such a big problem
that I think we need to talk
about this much more openly,
because it's preventing
us from actually
doing really great
work with clients
and within our companies.
So it was odd to see the quantification bias.
And relatedly, it was strange that, at the same time, the companies we were working with, including Nokia-- this was about 10 years ago-- were talking about customer obsession.
And I was like, how
is it that companies
can be so data-driven
quantitatively and then
at the same time, be
like, we're innovative
and we're totally obsessed
with the customer?
How can that live
in the same world?
But actually, what it seemed like they really cared about was the tools, frameworks, and technologies that they were using to understand their customers.
And if your company has
ever spent months evaluating
a technology tool without once
taking into account the impact
the tool will have on
the customers, if any,
then you might have
fallen into this trap.
Or you might have witnessed your
clients falling into this trap.
And indeed, in
many organizations,
they undergo a digital
transformation.
And they actually
wind up farther away
from their customers
because of this,
because they're so obsessed
with getting the right tool.
So here's a circle on
your left that represents
the modern organization.
There's a hierarchy, there's
structure, there's departments,
and there's silos.
And in this kind of model, the
people at the bottom report up,
and the people at the
top make all decisions.
Where do you see the
customer in this?
AUDIENCE: Outside of the circle.
TRICIA WANG: Outside the circle,
and I heard at the bottom.
That is correct.
And what do you
think happens when
a company-centric organization
like this adds digital tools
and says, we're going
to be innovative,
we're going to do
digital transformation,
and we're going to
do sprints, and we're
going to now have dashboards
and everything be measured?
What happens to the customer?
Do they get farther from or closer to the organization?
AUDIENCE: Farther.
TRICIA WANG: Farther away.
This is what happens.
This is actually what the
modern organization looks like.
It's still business
as usual, but it's now
encased in this membrane
of big data technology,
with dashboards from all
these data science companies
that all the big four
consulting firms have
recommended every company buy.
And the company is still
doing the same thing.
The people at the bottom are
still making the same reports
to send up to the
people on top to make
the decisions, who have no idea
what the customer is doing.
And as a result, this is the hard reality we've arrived at, the paradox we see inside organizations: organizations can become less customer-centric when they invest in data and technology to better understand their customers.
Because these organizations
are using technology
as a magical tool to
replace understanding,
and not as a tool to
activate understanding.
So keep in mind that these
problems persist at companies
using a wide variety
of toolsets, supported
by functionally
limitless budgets
and armies of consultants
using methodologies
and frameworks like agile,
and design thinking, and lean.
And talking through our experiences from product and research, what Matt and I saw at all these large and small companies is that they're using every conceivable tool possible, every framework.
And is it these
frameworks' faults?
No.
It's not like there's something
wrong with lean, or agile,
or sprint-based work.
It became so clear that the problems stopping companies from innovating are not technology problems; they're human problems.
And as such, they
can't be solved
by technology alone,
or adopting some kind
of sprint-based approach
alone, or just bringing in
agile or a digital
transformation.
And this is what's brought
us to this critical question
that we're obsessed
with answering,
which is, how do we empower
humans in a data-driven world?
Because we can quantify, we can
collect, we can store and query
all the data there
is, but if we don't
know how to use it to turn
the data into insights,
then what use is it?
And that's the question
we set out to answer.
And I'm going to share with you some of those answers, how we've incorporated them into our work, and how we've built a practice of working with our partners, deep in the guts of their organizations, to actively undo the quantification bias.
There's three things I'll share.
So the first thing
is that we must
empower humans in
the data-driven world
to operationalize the
difference between optimization
and discovery.
This means that at every
level of the organization,
we need to have a clear sense
of what we're doing and why.
And we need to choose our
practices, and approaches,
and methods accordingly.
Operationalizing
this difference is
how we resolve the tension
around what we were seeing
in this gap between the
rhetoric of innovation
and the reality of optimization.
And I'll show you in the workshop following this talk how we operationalize this difference.
But for now, I'm going to
share just a quick story of how
we implemented this with
a streaming music service
company.
So during the discovery-led
qualitative sprint--
something we're going
to talk about soon--
we asked participants to show
us where they put the music
app on their phone.
And not surprisingly, people
had already organized their apps
into folders.
But then we heard a response
that was really interesting,
that changed the way we
looked at our existing data,
and the quantitative data,
and the way the engineers
had created the algorithm.
By the way, we use these methods on all sorts of data projects, whether we're building AI or blockchain, doing customer analytics, or working on ad tech or programmatic work in marketing.
In this case, it was to make some changes to the algorithm.
And in this case, we heard a response.
And someone said, well, it looks
like my music folder doesn't
have any music apps in it.
And we were like, why is that?
And that person said,
it's all podcast apps.
And soon after that, we started to see this as a pattern, and we started to actually change our approach.
This qualitative insight, which was very discovery-based, proved hugely impactful for this company's strategy, because they soon realized that their biggest competitor wasn't other music apps.
Rather, their competitor was all the things that people do with their precious time and attention.
And more specifically, it was podcasts.
And people on this team said
that they would have never
even bothered to ask
that question, to say,
hey, where do you
put your music app?
Because, they said, it would have seemed too open-ended, and at the same time too focused on something trivial.
They would have
said, we would rather
focus on specific
in-app behavior
as opposed to asking a big
discovery-level question of,
hey, where do you
put your music app?
Because they're like, we
would have thought we already
would have known the answer.
So they wouldn't have seen
this emergent behavior.
And this insight proved
really, really impactful.
Because once they knew that people do this, it actually changed their business.
And it was so
transformative that they
shifted their entire strategy
to now include podcasts
into the work they do.
So some of you may
know this company.
It's on your phone, likely.
We all use it.
Most of us use it.
And this is the kind of business-changing, transformative insight that could only have come out of doing discovery-based work, rather than falling into the relentless incremental optimization work masquerading as innovation.
And in addition, let me remind you, this company had spent a lot of effort launching their personas over the last year.
None of that work
surfaced this behavior,
because it wouldn't have
fit in that persona,
because a lot of the way people
interact is based on modes.
And modes of behaviors
can be transversal,
depending on context.
And so they had fallen
into thinking, why didn't
our personas tell us this?
I'm like, because
it's persona work.
Persona work is
a qualitative way
of putting people into
quantitative segmentations.
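One way to see her point about modes, as a hypothetical sketch (the user, contexts, and sessions here are invented for illustration, not the client's actual model): a persona assigns each person one fixed segment, while mode-based behavior varies with context, so the same user can contradict their own persona depending on where they are.

```python
# Hypothetical sketch: fixed persona segmentation vs. mode-based behavior.

# Persona work collapses each user to a single fixed segment.
personas = {"alex": "music lover"}

# But observed behavior is modal -- it shifts with context.
observed_sessions = [
    {"user": "alex", "context": "commute", "listened_to": "podcast"},
    {"user": "alex", "context": "gym", "listened_to": "music"},
    {"user": "alex", "context": "evening", "listened_to": "podcast"},
]

# A fixed "music lover" persona predicts music in every session;
# the modal view shows how often that prediction fails.
mismatches = [
    s for s in observed_sessions
    if (personas[s["user"]] == "music lover") != (s["listened_to"] == "music")
]
print(len(mismatches), "of", len(observed_sessions),
      "sessions contradict the persona")
```

The segment label isn't wrong, exactly; it just averages away the contextual modes where the emergent behavior lives.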
The second thing we must
do to empower humans
in a data-driven
world is to integrate
qualitative and
quantitative data.
And this is how
we resolve the gap
between the rhetoric
of being data-driven
and the reality of being
quantitatively data-driven.
Now, both qualitative
and quantitative data
can be hugely valuable
when you integrate them.
And that's really the
most important point.
It's not just that they're valuable on their own; they're more valuable when you bring them together.
Now, quantitative data is
often referred to as big data.
You may have also heard the term
thick data, which is also how
we refer to qualitative data.
I created the term to make it sound sexier than big data.
When I was sitting in rooms of data scientists and engineers who only wanted to see numbers, they would call my data sets small and puny.
And I was like, you know what?
If you have big data, then I have thick data.
And now they actually want the thick data.
And so this is a term that I've given to the world, so please use it.
There's a lot of
writing about it online.
I see job descriptions now
for people calling for,
we're hiring thick
data researchers,
which is a total success,
because people don't even
know where the term
originated from,
which means it's definitely
being used out there
in the world.
So for companies to
have a full picture,
they need the depth of
qualitative insights.
You get a small n, but
you get really high depth.
And you also really need to
have big data, quantitative data
where you get a big N, but you
get a low depth of meaning.
And what do I mean by that?
So here is a slide that's
taken out of our workshops.
Here's a side-by-side
that illustrates why
this integration is critical.
For big data, you're leveraging
machine intelligence.
But for thick
data, you're really
leveraging human intelligence.
What big data offers
you is content and scale
that's unbelievable.
But to get all that
content at that scale,
you have to
standardize, you have
to scrub, you have to
normalize, you have to cluster.
All that just rips out the context and depth, so you have to integrate it with thick data to bring back the context that is lost in making quantitative data useful and analyzable.
And with big data, you
get a lot of "whats."
When you get statistics,
you get, well, 30% of people
are doing this at this time.
The minute you ask why they did this, you eventually have to go back to the thick data.
There's no way to avoid it.
And the amazing thing
about quantitative data
is that you get a lot of knowns.
It's telling you things that you should already understand, but it's limited in that it cannot tell you things that are emergent.
And if anyone here is
obsessed with systems theory
and emergence
theory, then you know
that we live in
a dynamic system,
and there's always unknown
things happening all the time.
And so this is why we also
need to have thick data.
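As a toy illustration of why the two need each other (the event counts and interview quotes below are hypothetical, loosely echoing the music-folder story, not real client data): big data gives you the "what" at scale, and a small set of qualitative notes supplies the "why."

```python
from collections import Counter

# Big data: aggregated event counts (big N, low depth of meaning).
# These numbers are invented for illustration.
folder_opens = {"music apps": 10_000, "other apps in music folder": 13_000}

# Thick data: open-ended interview answers (small n, high depth).
interview_notes = [
    "my music folder is full of podcast apps",
    "I listen to podcasts on my commute, music at the gym",
    "music is background, podcasts get my full attention",
]

# The counts alone say something is outcompeting music apps, but not what.
# A pass over the qualitative notes surfaces the emergent theme.
theme_counts = Counter(
    word.strip(",")
    for note in interview_notes
    for word in note.split()
    if word.startswith("podcast")
)
print(theme_counts.most_common(1))  # the emergent competitor: podcasts
```

The point is not the trivial string matching; it's that no amount of additional event data would ever have produced the word "podcast" if the quantitative model never tracked it.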
But somehow, in the era
of being data-driven,
and digital transformation,
and innovation,
organizations have
decided, hey, we
don't need that thick data,
those qualitative people,
those researchers, those
people who understand humans.
We're just really going
to spend all of our money
hiring data scientists.
And we're going to
put them in a room,
and they're going to help
us grow our business.
And so they've gone on
these hiring sprees,
and these new
department-creating sprees.
And this is where most of the money and resources go, and where the decisions are being made: on this side of the room, the big data side.
And what this means is, I think, really scary.
And which side do you think
your organization is on?
Is it more on the
quantitative side?
On more the qualitative side?
Or are you guys
equally balanced?
I just ask you to think
about that for a moment,
of where you might be.
And so during our workshop, we're going to share a framework called integrated data thinking for how we integrate big and thick data, and I'll show you a preview of it in a second.
But here's a quick story
of how this actually
comes out and plays
out in real life.
A few years ago, we were
working with O'Reilly Media,
a technical publisher of all
those books with animals.
We all know them, right?
If you code, or if you worked on early databases, you've seen their books everywhere.
And as a tech leader, they
were really struggling
to fill the seats at
their conferences,
like Velocity, Strata, all
their big tech gatherings.
And their quantitative
data couldn't tell them
what was going on.
It just couldn't.
Their head of data, Roger Magoulas, who is actually credited with coining the term "big data," said, look, I've done all the queries, and for some reason I cannot figure out why our conference attendance is dropping.
Why are we not filling seats as usual?
And so we ran some qualitative sprints with them, teaching them how to do it themselves.
We modeled it for them, again being clear that we were operating at the discovery level, not the optimization level.
And what we found is that
within many corporations,
people's budgets are
explicitly tied to one word.
Do you know what that one word
is for conference budgets?
Training.
That's all.
That's all it is.
And we found this out within one day in a sprint: their conference workshops weren't filling because they were called workshops, not trainings, and people's L&D budgets were tied to the word training.
It was just so simple, but
their perception of reality
was different from
the people who
were working inside companies.
So we identified
some quick actions
that we could take to change
the wording on their site,
and ticket sales
went up immediately.
And on top of that, this
critical qualitative insight
made their quantitative
data even more valuable.
We went in and we helped them
redo their quantitative surveys
to help them better
understand the scale and size.
Because we were like,
actually, your reach
is even bigger if you're
actually tying your product
to trainings and not workshops.
And last but not least, the third thing to do to bridge this gap, to empower humans in a data-driven world, is that we must bring decision-makers and customers closer together.
This is beyond critical.
And it's one of the
most important ways
we can empower both the
humans we work with,
our colleagues, and
the humans we serve,
which is our customers.
This is how we resolve
the break in reality
between the rhetoric
of customer obsession
and the reality
of tool obsession.
When we return to this image
of the modern
organization, what we
noticed before is that the
customer is at the bottom.
But there's another
thing that's happening
that is so normalized
that it's just how things
work at most companies.
When we look at the customer
being farthest away,
one of the things that Matt, my
business partner, Matt LeMay,
realized, and he's written
about it in his book, "Agile
for Everybody," he
describes this reality
as a characteristic of
most modern organizations.
"The people whose decisions
impact customers the most
are the ones who interact
with customers the least."
WTF?
Like, how can this be possible?
This is so ingrained
that it's just
an implicit benefit of getting
promoted, of yes, the higher
up I move, the less time I'll
spend with the customers,
and the more I can demand for
people to make PowerPoints
for me.
And then I'll make decisions
through PowerPoints,
and I don't have to
go into the field.
Because you know what?
UX researchers and
those researchers,
they go into the field.
They have to go write
the customer notes up.
Then they're going to make that
into a PowerPoint to tell me
what decision I can make.
This is totally convoluted.
And this is totally
bizarre because this
is the source of disconnect
between what companies perceive
their customers want and
what customers actually want,
how they act, and how they live.
And ultimately, it
is their customers
who determine whether the
companies succeed or fail,
whether a startup has a
next round or a next series.
We believe this so strongly that decision-makers need to interact with customers directly and with customer data directly,
because we believe
customers and the people
who work in companies
live very different lives.
So one of the things
that we did in one of
our earliest case studies
when we were working with a
large CPG company based out
of the Midwest--
they said, look, we want
to expand to urban markets.
And for the executives
at the company,
they wanted to sell more laundry
detergents in urban markets.
And for the executives,
this is what
their laundry rooms look like.
And has anyone here ever done
laundry in New York City or San
Francisco?
Does it look anything like this?
AUDIENCE: Nope.
TRICIA WANG: Nope.
[CHUCKLES] Not at all.
And so it looks like this.
This is what the reality is.
For all their data
that they had,
they could not understand that
their customers were actually
doing laundry in places that
looked like this, until we
told them, pack your bags.
You're going to leave
your Midwestern town,
and you're going to go
to Brooklyn for a week.
These are executives who
managed the P&L. These
are GMs who we told, you
have to get up and leave.
And I cannot overstate how
uncomfortable they were.
I had to even tell them,
please don't wear your suits
to the laundromat.
But we're going to go to
the laundromat together.
[LAUGHTER]
You're going to bring
your dirty clothes.
You're not going to send it out.
And they happen to be men.
I'm like, no, your wife
is not going to do it.
You're going to do your laundry.
And they had to leave their
comfortable, cushy offices
and talk to people in a setting
where they weren't the boss.
And they actually could not
pretend they knew everything.
And what they found
out was amazing,
because they said, look, we
thought our entire marketing
plan was created
for laundromats,
thinking that people do
laundry in something that
looks like this.
And the concept was
around convenience.
And they're like, now we know why it didn't work: it didn't make sense for their urban customers, who had ended up finding their own ways to make these experiences valuable, in the way they do their laundry.
And what we heard from people
was some people were saying,
look, I come here to do
my laundry in Brooklyn.
This is the only time I can
get away from my partner.
Or this is the only
time I have to myself.
[LAUGHTER]
This is the only time I
get away from my kids.
This is the only time
I can read a book.
This is the only time
I call my mother.
And one of the most fascinating points came when we were like, why is everyone so dressed up? And what we realized is that it's actually also a physical Tinder spot.
This was also the best hookup
spot in Brooklyn, apparently,
because everyone was
always looking good.
And so we're like,
look, these are
things that you would
have never found out
through any of your
surveys, through any
of your Ipsos work, or
any quantitative data, any
of your third-party data
that you would have bought.
You would never know that people
have found-- not that they love
doing laundry, but
people have found a way
to make the experience
valuable in their way, right?
And so based on this
direct interaction,
they redirected their entire
marketing campaign within one
week--
something that would have
taken several months,
or, as they said, might have been impossible to change, because they would have said, well, we've already made the plan, so we can't change it now, because we've got our agency partners.
And by the way, we brought
the agency with them
so they could
experience it, too.
And so there was no
excuse in the book,
because everybody was there.
And it only took one
week, which is very hard,
but it saved them much
more time in the end.
In the end, they
made $300 million
in revenue in just
the first year
based on this campaign switch.
So to summarize, these are the three answers we've come up with in our five years of work, some of the most important things to do to empower humans in a data-driven world and to resolve all those gaps between rhetoric and reality. First, operationalize the difference between discovery and optimization.
And then second is integrate
qualitative and quant.
And the third is bring
decision-makers closer
to the customer.
And you may have noticed
one of the key words
here is "operationalize."
And we believe that this is
so important that we put this
into our practice,
because it's not just
enough to say these things.
You actually have to find a
way to get people to do this.
One of the major challenges in
operationalizing these ideas
is that most modern
organizations
continue to work in silos.
So broadly speaking, you've
got your marketing folks,
and then you've got
your product folks,
and then you've got
your business folks,
and then whatever other
departments you have.
It could be like sales,
or finance, or design.
And then marketing usually
has very pronounced silos.
You have the silos
within silos, right?
Like, marketing has performance
marketing, brand, and agency.
And then imagine the same thing happening within each silo.
And it was really clear
to us that each silo
had their own frameworks,
their own language,
their own consultants that
they wanted to bring in.
And it was clear from the outset that, however we operationalized these ideas we've discussed, it needed to bring people together, to bridge these silos, and not make them worse.
And so we've spoken a
little bit about the sprints
that we run in our work, and
we call them Unlock Sprints.
And they represent our attempt
to operationalize our three
ideas on how to empower
humans in a data-driven world
into an easy, accessible,
and repeatable practice
that anyone, literally
anyone, can do.
So we took the best
practices out there.
I used to do some work with IDEO. I took the best practices from design thinking.
I've done a lot of Agile.
I took the best of Agile.
Matt has done tons
of Agile and Lean.
And we just took the
best of everything
that we learned from everyone,
many of you here today.
And we said we're
going to just create
such a simple, easy,
four-step process.
We're not going to call it anything fancy because there are already great things out there like Agile, Lean, and design thinking.
We're just going to say these
are the four steps to do,
and you can customize it the way
you need for your organization.
And the idea here
is pretty simple.
It's getting a cross-functional
group together,
including decision-makers, to
ask a mission-critical business
question, to acquire the
diversity of data needed
to answer that question, to
analyze that data together,
and to act.
And that last step
is so important
to make a decision
on what to act on.
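As a rough illustration only, the four steps (Ask, Acquire, Analyze, Act) could be sketched as a simple data structure. Everything here, including the class name and the example strings, is invented for this sketch and is not part of the actual Unlock Sprint materials:

```python
from dataclasses import dataclass, field

@dataclass
class UnlockSprint:
    """Toy model of the four steps: Ask, Acquire, Analyze, Act."""
    question: str                                      # Ask
    data_sources: list = field(default_factory=list)  # Acquire
    findings: list = field(default_factory=list)      # Analyze
    decision: str = ""                                 # Act

    def acquire(self, source: str) -> None:
        self.data_sources.append(source)

    def analyze(self, finding: str) -> None:
        self.findings.append(finding)

    def act(self, decision: str) -> None:
        self.decision = decision

# Walk through the O'Reilly example from earlier in the talk.
sprint = UnlockSprint(question="Why is conference attendance dropping?")
sprint.acquire("interviews with attendees who control L&D budgets")  # thick data
sprint.acquire("historical ticket-sales queries")                    # big data
sprint.analyze("budgets are tied to the word 'training', not 'workshop'")
sprint.act("rename workshops to trainings on the conference site")
```

The point of the structure is that the question comes first and the acquired data mixes both kinds; the decision at the end is what makes it a sprint rather than a research exercise.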
And over the last five years,
we've iterated on this practice
a number of times.
And we wanted to share one
particular important part of it
with you today, right
now, before I end my talk,
because I know not all of
you can be part of the lab.
So I wanted to share
one step that I think is critical, and that everyone can take.
And I would love to know how you would incorporate it into your work.
And this is the thing that we'll be sharing, open sourcing it and putting it on the Google Design Sprints site.
And so if you notice here,
there is a critical step
that happens between one and
two, between Ask and Acquire.
And there's something that
has to happen for people
to acquire the right data.
And so we've created a
framework to get people
to think about quantitative
and qualitative data,
along with discovery
and optimization
in a different way.
And so we created a framework
called Integrated Data
Thinking.
And it unifies a
way to work with
both quantitative and
qualitative data and discovery
and optimization.
And when I developed this
framework five years ago,
I remember calling my mentor, Roger Magoulas, who coined the term "big data."
And I was like, I think
I found the unifying
theory for how to
get organizations
to work with data.
And I think it's so easy
that anyone can learn,
even if you're not a data
scientist or a trained
ethnographer, that
anyone can just do this.
And this is the exact framework
that I had been missing, even
as a grad student where
I studied both statistics
and ethnography.
So we're going to dive into
this in detail in the lab.
But real quick, I
just want to show you
what it is so that you can get
a sense of how to potentially
even use it.
The x-axis operationalizes the distinction between optimization and discovery. So you have discovery on one end and optimization on the other: optimization is about finding known things, and discovery is about finding unknown, new things.
And then the y-axis
represents the integration.
At the top, you have thick data, which is qualitative, and at the bottom, big data, which is quantitative.
And what this does
is that it allows
us to map the questions we're
asking in each quadrant,
thereby exposing the type of
data that must be gathered.
So notice that I'm going
to be questions-led,
and not methods-led.
For example, if we were
working with a company that
builds boats, we could have
a question that lands us
in the discovery qual quadrant.
Like, what does
mobility mean to people?
This is like a why, discovery,
unknown question, right?
So of course, you're going to
be up in the top left corner
where you'll be employing
tools like ethnography.
We could have a
question that landed us
in the more qualitative
optimization quadrant,
like, what would the ideal
boat for most people include?
And then we could move down to a question in the more quantitative discovery quadrant, like, well, what
are some common interests
shared by people who buy boats?
There, if we had an
amazing data set already,
we could do some emergent
clustering analysis.
Because big data or quantitative
data, there's a range.
There's more
discovery-based techniques,
and there's more
optimization-based techniques,
right?
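As a hedged sketch only, an emergent clustering pass like the one just mentioned might look like the following. The dataset, the feature names, and the cluster count are all invented for illustration; a real engagement would start from an existing customer dataset:

```python
# Minimal k-means (Lloyd's algorithm) on invented boat-buyer interest scores.
import numpy as np

rng = np.random.default_rng(0)
# rows = customers, columns = invented interest scores (fishing, sailing, luxury)
X = rng.random((200, 3))

k = 3
centers = X[rng.choice(len(X), size=k, replace=False)]
for _ in range(20):  # a handful of assign/update iterations
    # assign each customer to the nearest center
    labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
    # move each center to the mean of its assigned customers
    centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])

for j in range(k):
    print(f"cluster {j}: {int(np.sum(labels == j))} customers")
```

The emergent part is that nobody names the segments in advance; the clusters fall out of the data, and the qualitative team can then go figure out what each one actually means.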
Or we could have a
question that lands us more
in the quant
optimization quadrant.
Like, at what price point does
a boat feel like an accessible
purchase?
But you could also move into qualitative optimization if you wanted to run a focus group.
So there's different ways
you can play with this.
The point is that we could also run an intensive survey and find out exactly where the most statistically significant point of distinction is.
What's important here
is that the question
is guiding all future decisions
about the methodology and data.
So when it comes time
to choose a methodology
and run the remaining
steps of the sprint,
it's based on the
question we were
asking in the first
place, not the method.
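To make the questions-led idea concrete, here is a minimal sketch of the 2x2 as a lookup from quadrant to example question and candidate methods. The boat questions come from the walkthrough above; the method lists are illustrative assumptions, not prescriptions:

```python
# The four quadrants: (data kind, goal) -> example question + candidate methods.
QUADRANTS = {
    ("thick", "discovery"): {
        "question": "What does mobility mean to people?",
        "methods": ["ethnography", "open-ended interviews"],
    },
    ("thick", "optimization"): {
        "question": "What would the ideal boat for most people include?",
        "methods": ["focus groups", "concept testing"],
    },
    ("big", "discovery"): {
        "question": "What common interests are shared by people who buy boats?",
        "methods": ["emergent clustering on an existing dataset"],
    },
    ("big", "optimization"): {
        "question": "At what price point does a boat feel accessible?",
        "methods": ["pricing surveys", "A/B tests"],
    },
}

def plan(data_kind: str, goal: str) -> dict:
    """Look up the example question and candidate methods for one quadrant."""
    return QUADRANTS[(data_kind, goal)]

print(plan("big", "discovery")["question"])
```

Note the direction of the lookup: the question determines the quadrant, and the quadrant exposes the data and methods, never the other way around.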
So this allows our
partners to rapidly move
between quantitative and
qualitative methodologies
and optimization and discovery
as their questions change.
So ideally, for any one
product to be built,
we are working with anywhere from 30 to 100 to 200 sprints.
And for large teams, they can
be running parallel sprints.
You can have one
sprint team that's
both qual and quant together.
You can have data scientists
and ethnographers on one team,
or you might want
to split your teams
to say we're going to have
just the engineers on one team,
and we're going to have the
UX researchers on another.
It totally depends
on the project.
Or you can have all
of it on one project.
So you can have multiple
parallel sprints.
We use this Integrated Data Thinking framework along with Unlock Sprints to
work on a variety of projects
that require the integration
of big and thick data.
And these projects
require engineers
to work with ethnographers on improving, let's say, an algorithm, or on building an entirely new app.
Or these are the
kind of practices
that force people to actually
work together on one team
and to understand the value
that the other kinds of domain
expertises bring.
And what's really
amazing is that what
we see is that people who are
usually uncomfortable-- like we
see ethnographers,
or UX researchers,
or market researchers
being like, oh, sometimes,
I'm really uncomfortable
because I'm not
trained as a data scientist.
They actually become more
comfortable with working
with them.
And vice versa.
We see data scientists and
engineers who are like, wow,
I never knew that my work
could be so connected
to the human element and
that I could actually
join some of these sprints
and be really clear about how
a qualitative insight affects
how I weight an algorithm.
And actually, an
ethnographer's going
to help me figure out how I
weight something, a variable.
That's amazing.
So we see total transformations
in how products are being built
and how even variables
are being weighted.
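As a purely hypothetical sketch of what that reweighting might look like, consider a simple linear scoring model. The features, the weights, and the insight itself are all invented for illustration:

```python
# Hypothetical: a qualitative insight changes how variables are weighted
# in a simple linear scoring model. All names and numbers are invented.
def score(features: dict, weights: dict) -> float:
    """Weighted sum of feature values."""
    return sum(weights[name] * value for name, value in features.items())

weights = {"visits_per_week": 0.5, "time_on_site": 0.3, "purchases": 0.2}
customer = {"visits_per_week": 4.0, "time_on_site": 2.5, "purchases": 1.0}

before = score(customer, weights)

# Suppose an ethnographer reports that raw visit counts are misleading
# (people visit for reasons unrelated to intent to buy), so the team
# down-weights visits and up-weights purchases.
weights["visits_per_week"] = 0.2
weights["purchases"] = 0.5

after = score(customer, weights)
print(before, after)
```

The mechanics are trivial; the point of the example is that the numbers moved because of something a person learned in the field, not because of a grid search.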
Now, when we work
with our partners,
we often find that they
have some other sprint-based
approach that they've
already been using,
which is because we have an
amazing community like you
guys already spreading the
word and teaching companies
how to do this.
And so that's why, over the last
five years, what we're doing
is we white label
all of our work.
That's why it's not a fancy
name, it's just Unlock Sprint.
And we tell companies,
call it whatever you want.
Insert it into whatever existing work you have.
Smash it up.
And some companies have created
eight steps out of this, right?
It's entirely different.
And many of our
partners just end up
completely changing it
and just figuring out
what's best for them.
And we're totally cool with that
because, at the end of the day,
we believe it's not the
tool that matters the most,
it's the people.
And whatever language,
whatever names
you want to call it,
what's more important
is how you get
people in the room
to communicate around data.
So coming back to
the original question
that we started
with five years ago,
we're still answering
this question, which
is, how do we empower humans
in a data-driven world?
And while we continue to iterate on principles and practices, the most important answer to this question is that we have to empower humans together.
And I'm sure many of you in attendance, here in this room, know this better than anyone else--
that the magic of sprints is not
the sprints, it's the people.
It's the Sprint
Master doing the work,
it's the attendees in the room.
And that's why I'm
really happy to be here
in this community with you guys.
Thank you, so much.
[APPLAUSE]
[MUSIC PLAYING]
