- [Naber] Could you
talk a little bit about
your mindset as you're
using data-driven marketing methods?
And then take us through
maybe a couple of key points around
what businesses should be doing
and how businesses should
be thinking about this?
- [Gabriel] Sure, so the
first thing I would say is,
this applies, in my experience,
and it's at least something that I,
you know, encourage or
push the teams to focus on.
This applies not only to
data-driven marketing,
I think it applies to business in general.
And that has to do with, you know,
the scientific method for sure.
But I would describe it better perhaps
as just being scientific
about identifying growth
opportunities, right.
And that doesn't mean that you need
to trade off creativity, that doesn't mean
that you just gotta
always follow a formula,
but it means that if you
do it in a methodical way,
you'll be able to
discern what's meaningful
versus not meaningful faster.
And once you're able to do that,
you can iterate on it
and that's what drives growth, right.
So it just starts with
an observation, right.
Or a data point, and again,
this is where I said,
it doesn't have to be
data-driven marketing,
it could just be a point of view, right.
And that's fine.
But once you have that you need
an overall objective or hypothesis
that you can go and test.
If you don't have that,
then it's gonna be very
difficult to define success.
It's also very important to understand,
within that objective,
what's in scope and what's
out of scope, right.
I've seen many, many times in business
when there's a description
of what's in scope,
but there's no specific clarification
of what's out of scope.
And that just takes people down very, very,
you know, interesting paths,
where they spend most of the time trying
to explain why they're not doing something
as opposed to just focusing
on what was supposed to be
in scope in the first place.
- [Naber] Yeah, good one, good one.
- [Gabriel] A good example
of that would be, you know,
if you're developing, I'm
trying to be generic here,
I can give you specifics.
But if you're developing
a new value proposition,
you know, for your product,
and your success metric is engagement,
then make sure that you
perfectly articulate
that driving incremental, you know,
customer growth is out of scope, right?
Because when you look
at your success metrics,
a quite normal question could be,
well, but are we driving
incremental growth? And it's like,
well, no, because that was not
the aim of this experiment, right?
- [Naber] Yep.
- [Gabriel] And have
very clear key results.
And if you think of the things
I'm touching on right now,
it's kind of following almost
the scientific method, right?
If you have key results
and you know exactly
what the expected output is,
then you can define
what your success criteria is.
You know, what are the
primary and secondary measures
of success or metrics of success?
And you know, that also allows you
to identify and know
what potential risks
or dependencies could
become blockers
to executing on that marketing strategy,
in this case.
And of course, you know,
having an estimated
business value always helps,
right, when it's not only about the metric,
but exactly how much value
that's gonna deliver to the business.
So that's in general terms.
Now, when it comes to, you
know, more specifically,
you know, the application
of that into technology
or everything that we're
doing with marketing
or data science, Expedia went
through this transformation
a few years ago, where there was just
a focus on making data-driven observations
and following the scientific method.
And the way the data
allows us to operate today
has transformed the way
in which we innovate
in every single thing we do,
product, you know marketing,
even as I said in some
forums, even HR today follows
the scientific method in the way
they operate at Expedia.
And you're pretty much taking
real-time information,
think of an ecommerce environment,
and letting your customer
tell you what they like
and don't like, in real time.
- [Naber] Interesting,
can you give examples?
- [Gabriel] Well, the simplest example
would be an AB test, right.
If you're AB testing something,
what people don't realize
is that you're actually
getting real-time feedback,
almost a vote, on whether it resonates
with an audience or not.
How you take that feedback
and how you, you know,
iterate on that process
is what then drives the success.
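That feedback loop can be sketched minimally; the conversion counts and the 5% threshold here are invented for illustration, using a standard two-proportion z-test on A/B results:

```python
from math import sqrt, erf

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    meaningfully different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # two-sided p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical daily feedback: variant B converts 5.5% vs. A's 5.0%
z, p = ab_test_z(conv_a=500, n_a=10_000, conv_b=550, n_b=10_000)
print(f"z={z:.2f}, p={p:.3f}")  # iterate on the variant if p clears your bar
```

How you act on that number day over day is the iteration he describes.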
But that's the simplest example,
right. There are obviously
many other, more complex ways
in which you can apply
the scientific method,
you can apply data science
to the way that you're
automating processes
and learning from those little loops
and learning from your customers
to improve your product,
improve your marketing.
But there are a few things that I would say
are key, you know, to focus on,
you know, from a marketing standpoint.
If you think of how complex
marketplaces have become,
you know, the multidimensional nature
of the cross-channel marketing
world we live in today.
There are multiple, you know, storefronts
and touchpoints, right.
I always say the same thing,
which as marketers is
kind of unacceptable:
to ignore the infinite amount of data
that is coming our way, and everyone talks
about big data, I know.
But if you really take that data
and ensure that the next
customer interaction
is being informed and
dictated by the data
that you've been gathering, naturally,
you're enhancing whatever
you're putting in front
of your customers, either a product
or a marketing message
or a value proposition
or whatever it is and in the past,
that linear sort of, you know,
path to marketing was pretty simple.
But today it doesn't work like that.
Today we have different devices,
today we have those multiple touchpoints.
Today we have complexity
from around, you know,
multi-channel attribution
and cross-device attribution.
And so it's a lot more complex.
And the amount of customer data out there
means that the customer
journey is quite fractured.
So unless we find, you know,
specific ways in which we can start
to test with the data
that we're gathering,
it's very difficult to
deliver on measurable results
and iterate on those, which goes back
to what I was saying before, right,
just following that
simple, almost simplistic approach,
the scientific method.
So what that means in practice, again
when it comes to marketing, when it comes
to data-driven marketing and measurement,
if you're truly thinking
about the customer,
then in order to drive sustainable growth,
you need to understand the value,
that customer or set of customers,
brings in the long run, right
and I love this quote, which
is from a guy from McKinsey,
who I think is an emeritus partner,
as they call them. He writes
a lot about capitalism, right,
and the future of capitalism.
And he says that if the vast majority
of most firms' value
depends on the results,
you know, three years from now,
which is how it works,
but yet management is preoccupied
with what's reportable
three months from now,
then capitalism has a big problem.
And that happens all the time.
So if I go back to what I
was talking on measurement
and data-driven marketing,
we as marketers gotta focus
on the outcome, not on the
metric that we're capturing.
Right, so some metrics like,
you know, cost per acquisition
et cetera are great directional indicators.
But they don't determine
the value of our investment.
Right, so you need to look
at what are the levers
that you have in order to
drive business outcomes
like customer lifetime value
and the moment that you do that,
you will be challenged on
your attribution models,
you will be challenged
on what the shape of
your P&L looks like on a monthly,
on a quarterly basis,
even on a yearly basis.
And at Expedia, we're
starting to make decisions
in those terms, which is, well,
you have one attribution model
that will tell you the
return on your ad spend
and you have a shadow attribution model
that tells you that the mix
might look slightly different
if you start looking at
lifetime value metrics
and not only on a total-month basis,
but even on an 18 or 24 month basis.
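A toy illustration of that dual-view idea; the channel names, spend, and value figures below are all made up, comparing the same spend under a last-click ROAS view and an LTV-weighted shadow view:

```python
# Hypothetical channel data: spend, last-click revenue, and
# predicted 24-month lifetime value of the customers acquired.
channels = {
    "search":  {"spend": 100.0, "revenue": 130.0, "ltv_24m": 180.0},
    "display": {"spend":  50.0, "revenue":  45.0, "ltv_24m": 120.0},
}

def roas_view(ch):
    """Primary model: return on ad spend from last-click revenue."""
    return ch["revenue"] / ch["spend"]

def shadow_view(ch):
    """Shadow model: the same spend judged on 24-month lifetime value."""
    return ch["ltv_24m"] / ch["spend"]

for name, ch in channels.items():
    print(f"{name}: ROAS={roas_view(ch):.2f}, LTV/spend={shadow_view(ch):.2f}")
# In this fabricated mix, display looks unprofitable on ROAS (0.90)
# but attractive on 24-month LTV (2.40) -- the mix "looks slightly
# different" under the shadow model, as described above.
```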
And if you're really about delivering
business growth in a sustainable
and future-proof manner,
you cannot be looking
at the short term, you
gotta be scientific,
utilizing your data and
applying those metrics
that look way beyond that, you know,
monthly, quarterly basis.
And that keeps the customer at heart,
because the moment that you do that,
you're driving customer centricity,
because you're really
delivering what the customer
is interested in, not only today,
but 12 months from now,
24 months from now, and
that drives loyalty,
that drives business growth.
- [Naber] Yep, oh, sorry,
did you have something else to add?
- [Gabriel] You're asking
about data-driven marketing
and the one thing that is
important to explain is,
I mean, why data is so critical, right.
You know, the more informed you are
and your organization is,
the more valuable your
hypotheses become, right.
And I was referring to the
scientific method before.
So, data democratization
and the access to it
is fundamental to drive
a test-and-learn culture,
which is what drives that excellence
and iteration, at least in our business.
And having the infrastructure
and data warehouses
to capture the data is an
important foundation, for sure.
But having the right tools
and the analytics for different people
to make sense of it
is obviously needed.
Now what really makes the difference
is acknowledging that unless
every single person
within the organization
has access to the data
and there is a culture that drives
an expectation of everyone
having to experiment with it,
the potential is not
being maximized, right.
So the sooner the data supply
chain can be streamlined
and the data democratized
for the consumer,
the end consumer that could
be your marketing manager,
that could be your product manager,
the sooner that data
becomes a strategic asset.
So, the test-and-learn
culture at Expedia wouldn't
really exist unless every single person
in the organization had
access to the data tools,
to the performance dashboard,
to the framework, but I
think most importantly,
they were encouraged to fail all the time
and make data-driven decisions,
knowing that they are
gonna fail, you know,
about 70% of our tests fail.
And unless you go through that journey
and the business is ready
to invest in failing,
then you won't be able
to really, you know,
make the most out of the application
of the scientific method into, you know,
the experimentation through
product and marketing.
You know, the core there is
to just do it fast, right.
You go experiment, fail, do it fast, learn
and move on to the next thing.
- [Naber] Excellent and I
wanna go back to something
you said before about long term growth.
Can you give us a bit of a sense
for how that process works for you?
So, how you define what that
long term growth looks like?
What indicators you look for,
for long term growth and
then reverse engineering,
you know, what leading indicators
you look for in order
to get to that place?
Can you walk us through that
step-by-step process?
- [Gabriel] Sure, let me walk
you through the process.
And I'll try to sort of,
you know, share an example,
but let me start with the
process. First of all,
as I said, you start
with an observation
and coming up with a hypothesis, right.
And once you have your hypotheses,
and they're strong enough hypotheses
that you can go and test, then you
start the testing process.
And in terms of metrics,
to your question
of what that looks
like in the long term,
it depends on what your
objective is, right.
So, for example, the shape of your P&L
would dramatically change
if you drive the 90 day repeat
rate of your customers,
and the lifetime value
of your platform mix varies.
Just a tangible example.
So customers on the Expedia
app are worth more,
repeat more, are our most loyal customers,
compared to customers transacting on web.
Process-wise, you gotta define
that your objective would be
to get a customer to transact on the app,
the hypothesis being that if
they transact on the app,
they have a higher lifetime value.
And you have sub-hypotheses:
that is because they repeat
more, they're more engaged,
and there are higher rates,
as you go and book a flight,
the likelihood, the probability of booking
a hotel is higher.
So that's the process, right.
You go to your metric,
which is 90 day repeat rate,
which determines the lifetime value,
you have your hypothesis in place,
and you test only that.
Now what's happening in the
analytics world behind it,
you know, I'm not gonna get
into the details of that.
But that's just, you know,
the way that you would set
up your attribution model
to look at the click stream:
if a customer clicks on a marketing link,
most likely on Google,
it will cost you money, so where does
that customer click before they transact?
How many of those clicks can you save
if that customer goes
directly to your platform?
How does that impact your
cost per acquisition?
How does that bring
up, you know, what
we call marketing contribution?
And if, because of that journey,
they end up on the app, sorry, on
the preferred platform,
how does the lifetime value increase?
And you test that hypothesis.
What are some of the challenges of this?
Well, statistical significance,
reaching statistical
significance when you're,
you know, applying some of these tests,
particularly when you're trying
to do it in a small region,
if you're trying to do
it globally, that's fine.
If you try and do it
in the US, that's fine.
If you try and do it in
Taiwan, it's not so easy.
So, you know, there are techniques
to address those challenges
where you can bundle together,
you know, similar cohorts that could,
you know, comparatively,
allow you to reach
that significance faster or
make certain assumptions.
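The small-market problem can be made concrete with a standard sample-size estimate; the baseline rate, detectable lift, and daily visitor counts below are invented for illustration:

```python
from math import sqrt

def min_sample_per_arm(p, mde, z_alpha=1.96, z_beta=0.84):
    """Rough sample size per test arm to detect an absolute lift
    of `mde` over baseline conversion rate `p`
    (5% two-sided significance, 80% power)."""
    var = 2 * p * (1 - p)
    return (z_alpha + z_beta) ** 2 * var / mde ** 2

# Hypothetical traffic: one small market alone vs. a bundle of
# similar cohorts tested together.
need = min_sample_per_arm(p=0.05, mde=0.005)   # roughly 30k users per arm
small_daily, bundle_daily = 1_000, 8_000       # made-up visitor counts
print(f"need {need:.0f}/arm -> {need / small_daily:.0f} days alone, "
      f"{need / bundle_daily:.0f} days bundled")
```

Bundling comparable cohorts raises the daily sample, so the same test reaches significance in a fraction of the time, at the cost of the homogeneity assumptions he mentions.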
So, that's the process
that it would take, and I tried to describe
the process with a tangible example,
trying to simplify it, right.
But it's a tangible example where you come up
with a hypothesis, and you have the metric
that you know is the driver
of that lifetime value.
And then what follows after doing that,
or in parallel to that, is you need
a predictive model, this is
where it gets complex,
that will allow you
to tell that, in fact,
that's gonna be the
value, the lifetime value,
of that customer in 12 or 24 months.
- [Naber] Right, there's a lot
of art and science to that.
- [Gabriel] Yes.
And hopefully, if you're
still around 12 months later,
you then see how accurate
your predictive model was.
And once you have that 12 month cohort
with a predictive model,
sort of stress testing it,
you adjust the predictive model
and then life becomes a lot smoother.
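The loop described above, predict, wait for the cohort to mature, stress-test against actuals, then adjust, could be sketched roughly like this; the model form, its coefficients, and the cohort numbers are all invented:

```python
def predict_ltv(value_90d, repeats_90d, uplift_per_repeat=0.5, base_mult=2.0):
    """Naive LTV sketch: scale 90-day value by a base multiplier,
    plus an uplift per repeat purchase (made-up coefficients)."""
    return value_90d * (base_mult + uplift_per_repeat * repeats_90d)

# Once the 12-month cohort matures, stress-test prediction vs. actuals.
cohort = [  # (value_90d, repeats_90d, actual_12m_value) -- fabricated
    (100.0, 1, 260.0),
    (80.0, 0, 150.0),
    (120.0, 2, 370.0),
]
errors = [abs(predict_ltv(v, r) - actual) / actual for v, r, actual in cohort]
mape = sum(errors) / len(errors)
print(f"MAPE: {mape:.1%}")  # if this drifts too high, adjust the coefficients
```

In practice the model would be far richer than two coefficients, but the shape of the loop, predict, measure the matured cohort, recalibrate, is the same.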
