[MUSIC].
>> Welcome to another wonderful
Power BI customer session
at the Microsoft Business
Application Summit.
My name is Lauren Faber,
and I'm a member of the Power
BI Customer Advisory Team.
I'm really excited to have
Kasper from Arla Foods
here with us today.
Kasper will be talking
about enabling Arla
with big data and
self-service analytics.
Kasper, if you could go ahead and
introduce yourself and your role,
a little bit more about Arla Foods,
and tell us about what we'll
be talking about today.
>> Yes, of course, I'll do that.
I'll jump straight into
the intro to Arla.
Just a little bit about the company.
It has about 9,900 owners,
meaning that it's a cooperative.
The farmers who supply the milk
that we produce our products from
are also the owners of Arla.
Of course, it's important to us that
we make the most out of the
milk that we get into Arla.
In terms of milk, it's about 13.9
billion kilos of milk that
we get in on a yearly basis,
and that makes us the fourth
largest dairy in the world with
19,000 colleagues across the globe.
Our products are produced across
four major brands known all
over the world pretty much.
The big ones are here: the Arla
brand, of course, Castello,
mostly known for cheeses,
Puck in the Middle East, and Lurpak.
In terms of revenues, it's about
10.4 billion euros in revenue.
The products that I mentioned before
fall across three major categories:
milk, yogurts, powders, and cooking;
our biggest category, cheeses;
and of course, butter and spreads.
Those are the major categories
that we produce products in,
and most of the products are known
pretty much all over the world,
branded differently.
Arla has a vision, and
the vision is to create
the future of dairy,
to bring health and inspiration
to the world naturally.
You'll also see that
in this presentation,
this bit about creating the
future of dairy is important.
Of course, we also have
an ambition to grow.
We expect all our owners
to deliver more milk,
and therefore we also need
to grow in terms of how
many products we produce as
more milk into Arla also results
in more products coming out.
As for myself,
I'm a manager in Arla IT,
sitting in the solutions department,
where we develop new
applications internally in Arla.
I've been with Arla for two years,
and my responsibility is everything
that has to do with data and analytics,
and creating one data
foundation for Arla.
The department I'm
responsible for is known
as the analytics powerhouse in Arla.
We're about 70 people,
and we do everything that has to do
with the development of analytics
to support our data agenda.
We've divided it into the areas
that you can see on
the left side of the slide.
There's self-service analytics,
which is where Power BI comes in.
There's our standard reporting,
the financial reporting,
where we're predominantly using SAP BW.
And then there's what we call
advanced analytics,
which is where we're building
more advanced applications.
Again, most of this is
on top of Microsoft Azure,
with Power BI as the front-end tool.
At the core of all of this is our
data foundation and our vision,
which is basically enabling
analytics everywhere in Arla.
Moving on to the challenge
that we've had and
that we're on our way to solving
using Azure and Power BI.
It's probably a challenge that
many other companies have.
We want to enable Arla with big
data and self-service analytics,
and we also want to scale this.
We introduced Power BI
across the organization,
which really put a big tick in the box
for enabling self-service.
But at the same time,
we then introduced
a big challenge in terms of scaling
this at an enterprise level.
A lot of our applications
were built in silos.
We have approximately 300 Power BI
developers in the organization,
and a lot of the reporting that
was built using Power BI
was not built entirely on
one data foundation,
as I said before,
but with a very siloed approach.
This is one of the main challenges
that we needed to overcome.
The other challenge is that we
have a lot of data in our SAP BW,
our business warehouse,
much of it coming
from our ERP system.
But with scale also comes a
need to bring in other types of
data from our dairies
or from external sources.
That's also not something
that we could solve with
the platform that we had when
we came into this challenge.
Another thing is, of course,
the complexity of these applications.
They very quickly explode
as we're letting more
developers loose on the data.
There's a challenge
here for us in terms of
maintaining and supporting
it all from an IT point of view.
From the analytics
powerhouse point of view,
it meant that we moved away
from our conventional
way of looking at data.
Basically, for on-premise data
storage we only had SAP BW,
and as I mentioned before,
that wasn't enough for
what we wanted to do.
We looked at introducing
Microsoft Azure and creating
this data foundation,
which is now very
important for our agenda.
The move we are seeing here
is away from conventional
on-premise data storage onto
more open Cloud-based platforms.
The challenge then
again becomes that we
need different types of skills and we
need new processes in the
organization to manage all of this.
>> Yeah, and you mentioned
the data foundation.
I know that's a huge
part of your solution.
Could you expand a
little bit more on that?
>> I can. If we take it
from a super high level,
this is what it looks like.
It looks very simple.
We're ingesting data at
the bottom from across the
organization and also externally.
All data that's coming in is
basically going through
the data foundation.
That means that we only load
data once into the foundation,
and it also means that for anything
that we build on top of it,
we don't put additional
load on the source systems.
On top of the data foundation is
then where we built our reporting,
both our conventional
Excel-based reporting
and of course also Power BI.
It's also where we've built
our models for what we call
exploratory data science, and
where we're consuming data with
the applications that we're
developing internally.
So let me dive a little
bit deeper into it
with an example that
illustrates some of the challenges
that I mentioned before.
It's actually a real example;
it's been anonymized here.
We wanted to develop
an application and this was before
we had the data foundation.
We started out with a source
system, and we wanted to develop
an application where we could predict
the intake of milk into Arla.
It's super important for Arla
to know how much milk is coming in
so that production can be planned
better, because it varies.
The cows don't always produce
the same amount of milk.
Very simple, high-level
architectural diagram with
one source system and one application
consuming data from
this source system.
Then another part of the
organization sees
the benefit of the output
that the application is
generating, and they also
would like to access
the data and the sources,
so another application
pops up and all of a sudden we
have two sources for
the new application.
Still relatively simple,
but then a third area of the business
comes along and they want
to consume data from all of
them, and so on and so forth.
This very quickly becomes very,
very unmanageable because
of the complexity.
We don't know where the
source of truth is,
and it's got a massive
impact on our source system.
Of course, it's
practically impossible to
monitor and support this entire
solution, this system.
This is just one of the examples
that we had, and it led us
to take another approach, which is
basically our data foundation.
So if we then look
at what it should look like,
and what it looks like today,
again referring back to
the slide that we saw before
on the data foundation:
we take the same applications
that still exist in Arla,
and most of them we've now
refactored into
this type of setup, where
we have one source,
the same source as before,
ingesting data into the data foundation.
Now of course, there's more than
just one source powering
all these applications, but
just for simplicity here,
it's one source into the data foundation.
We load once, and every application
can consume the same data,
but through the data foundation.
What we're also doing is
ingesting the output from
each application back into
the data foundation, so it
can be shared across
the other applications.
Of course, this is a
simplified view of the world,
but that's the idea behind
our data foundation.
So we don't get these dependencies
between the applications.
We have the single truth
and we're only extracting
data once to minimize the
load on the source system.
Of course, in the powerhouse,
we're happy because
we're the ones who have
to maintain and support
all of this.
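Just to make that pattern concrete, here is a minimal sketch of what the load-once, write-back loop could look like on a Spark-based lake; the paths, table names, and aggregation are hypothetical, purely for illustration, not the actual Arla pipeline.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# An application reads from the shared data foundation,
# never directly from the source system (so the source is only extracted once).
intake = spark.read.format("delta").load("/foundation/curated/milk_intake")

# Application-specific logic; a trivial aggregation stands in for the real work.
weekly = intake.groupBy("country", "week").sum("volume_kg")

# The application's output is written back into the foundation,
# so other applications can consume it without point-to-point dependencies.
weekly.write.format("delta").mode("overwrite").save("/foundation/curated/milk_intake_weekly")
```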
So if we then dive even deeper into
the data foundation and
take a look at what's there:
we still have our SAP BW on HANA,
which is still a key part
of our entire BI stack.
It's where we have all our
financial data from our ERP system.
But what we have in addition is
a data lake in Microsoft Azure,
where we bring data in and
move it through different
layers in the lake:
all the way from raw, we cleanse it,
curate it, and make it
ready for consumption,
for example in Power BI.
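As a rough illustration of what moving data through those layers can look like in Databricks, here is a minimal PySpark sketch; the layer paths, columns, and cleansing rules are assumptions for the example, not the actual Arla pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Raw layer: data landed as-is from the source system.
raw = spark.read.json("/datalake/raw/logistics/shipments/")

# Cleansed layer: fix types, drop obviously bad records, remove duplicates.
cleansed = (
    raw.withColumn("shipped_at", F.to_timestamp("shipped_at"))
       .withColumn("weight_kg", F.col("weight_kg").cast("double"))
       .dropna(subset=["shipment_id", "shipped_at"])
       .dropDuplicates(["shipment_id"])
)
cleansed.write.format("delta").mode("overwrite").save("/datalake/cleansed/logistics/shipments/")

# Curated layer: business-friendly shape, ready for consumption in Power BI.
curated = (
    cleansed.groupBy("warehouse", F.to_date("shipped_at").alias("ship_date"))
            .agg(F.sum("weight_kg").alias("total_weight_kg"),
                 F.count("shipment_id").alias("shipments"))
)
curated.write.format("delta").mode("overwrite").save("/datalake/curated/logistics/daily_shipments/")
```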
So what we end up with is
a box that looks something
like this: again,
SAP BW combined with
Microsoft Azure.
That combination does pose a bit of a
challenge for us in some ways,
but the opportunities
by far outweigh
the challenges that we have
with this setup.
Then, zooming in a little bit more on
the data foundation and looking
at how it actually works:
how does the data flow
in the data foundation?
This is what it looks like.
So before the data foundation,
we basically only had the top layer:
our SAP applications,
where we were using
data services to ingest
data into SAP BW,
and then eventually Power BI
as our front-end tool.
To begin with, we were
only using Excel.
The challenge that we had
then was that in order to
actually see any data in Power BI,
our developers who are not sitting in IT
were extracting data into
Excel and then loading it
from Excel into Power BI,
which again made that approach
very unmanageable for us in IT.
So that's also something
that we're solving
using this setup here
with the data foundation.
What we've also expanded is
where we're getting data from.
As you can see, it's not just
internal SAP applications;
we're also expanding it to
other types of applications.
We have a lot of warehousing
systems, for example,
and a lot of systems
being used in logistics
that are not based on SAP.
All of that data
we're bringing in using
Azure Data Factory
into a data lake, and
then we can transform
that data and use
it within the data foundation
for final consumption,
which in most cases is Power BI.
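One common way to make such a curated dataset consumable from Power BI is to register it as a table that can be queried by name through the Power BI connector for Azure Databricks; here is a minimal sketch with hypothetical database, table, and path names, not Arla's actual objects.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Register the curated data as a named table (names are illustrative only).
spark.sql("CREATE DATABASE IF NOT EXISTS curated")
spark.sql("""
    CREATE TABLE IF NOT EXISTS curated.daily_shipments
    USING DELTA
    LOCATION '/datalake/curated/logistics/daily_shipments/'
""")

# Power BI can then connect with the Azure Databricks connector
# (import or DirectQuery) and read curated.daily_shipments directly.
```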
We can also take in production data
from most of our
production sites now,
and even from the farm,
if there are sensors on the
farm that we'd like to utilize.
This is becoming increasingly
important with our
Carbon Neutral agenda.
That's also perfectly
possible with this setup.
We haven't moved in
that direction yet,
but everything is prepared for it.
What we're doing is using
Databricks for the
transformation of the data.
If we're doing some of these
more advanced applications,
we're using Python for the models,
but it's always Power BI.
That's the front-end tool.
So that's how the setup
looks and that's basically
how we're solving our problems
from a technical point of view.
Everything that we build now,
we're building using this setup
here and we're also
refactoring some of
our old solutions into this stack
that you see on the screen now.
The challenge that you
had is pretty common for a lot
of companies that have
many different data
sources and inputs of data,
and the solution that
you've come up with,
this data foundation, is
super smart and innovative,
and it makes a ton of sense.
Do you happen to have a demo today
or some examples of some
of the reports that you've
been able to build?
>> Yes, I do. I have some
examples that I can show you.
So just to give you
a little bit of
insight into what
we're using this data foundation
for, without going
into detail on all of them:
it's not just simple reporting.
We are actually using
it, as I said before,
to get back in control of some of
the critical reporting
that's been developed by
some of our users and Power
BI developers outside of IT.
We're using it to provide
global logistics transparency.
This is where we're combining
data from multiple warehousing
systems and multiple logistics systems.
Again, we're using Power BI to
visualize all of this data.
The tool that you saw before for
predicting global milk intake,
we're expanding that
using this platform.
The entire platform is also
a major contributor to
reducing our carbon emissions,
as we have an ambition
of reducing them by
30 percent by 2030 and being
completely carbon neutral by 2050.
A lot of that is done
using analytics,
and this is again where we're
using the platform.
Then the last point here
is very relevant now.
What we were also able to
do is very quickly launch
new applications that we
can support and we can maintain
from an IT point of view.
So I'd like to very briefly
show you what we've done.
This is something where
we were impacted quite
severely by COVID-19.
It impacted demand,
and if there's an impact on
demand, it means that our
long-term planning of what
we're producing at the dairies
is no longer correct.
We also had an impact on capacity.
So that meant that both supply
and demand were out of sync.
We needed something
very very quickly,
very very short-term to be able to
re-plan on the right sites and
on the right production lines.
So we brought in data from
SAP BW and most other places,
but also manual input
from some of the sites.
We used the data foundation for that.
Then we used Power BI to
create a visual interface
that people could use both
on-site but also from a
planning point of view.
So this one that we're introducing here,
it's been heavily anonymized, so it's
a little bit difficult
to see what's there.
But this was something we were
able to do in a matter of days.
If we then look at some
of the bigger solutions
that we're building,
if I go back to this
milk intake solution
that we talked about before,
it's a very good example
of how we're using
Data Foundation to
create new applications.
As [inaudible] mentioned before,
it is really important for us to
know how much milk is coming in.
We've always done this,
trying to estimate and forecast it.
The difference is that before,
it was done very much
based on experience.
It was a group of
people who had a lot of
experience with what was
happening in the market,
and a lot of experience
with what happens
with the productivity of
the cows during the
different seasons, etc.
They came up with an estimate.
What we are moving towards now,
and what you also saw at the very
beginning of the presentation,
is a definite move towards being
more data-driven within Arla.
So rather than only
relying on experience,
we are now also basing our
forecast on a machine learning
model as a way of
predicting the milk that's
coming in based on a number of
different external factors.
This is where we're
actually using all of
the components in
our Data Foundation.
We're bringing in data,
both internal and external, to Arla.
We're using our data lake
to store the training
data for the model.
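As a loose illustration of what such a forecast model might look like, here is a minimal scikit-learn sketch that predicts weekly milk intake from a few external drivers; the feature names, file path, and choice of model are assumptions for the example, not the actual Arla model.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Training data prepared in the data lake's curated layer (hypothetical path).
df = pd.read_parquet("/datalake/curated/milk_intake/training.parquet")

# Example internal/external drivers; the real feature set would differ.
features = ["week_of_year", "avg_temperature", "herd_size", "feed_price_index"]
X, y = df[features], df["milk_intake_kg"]

# Keep the time order when splitting, since this is a forecasting problem.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, shuffle=False)

model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

predictions = model.predict(X_test)
print("Mean absolute error (kg):", mean_absolute_error(y_test, predictions))
```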
We're using Power BI
for the visualization,
and what's important for us here is
that the actual interface
is not developed by IT.
It's someone in the milk planning
department who's developed it,
and it's consuming the same data
that we're using to
train our models on;
he's building the interface
on that same data.
From our point of view in IT,
we're using Shiny,
which is the front-end
framework for R.
Again, it's the same data,
but we have different
needs in IT than they
have outside of IT with
this type of data.
This setup allows us to use
multiple tools and to interact
with the data using
different types of tools.
What it looks like is fairly simple,
but before we go into that,
just to go back to
what we said before:
this is actually something
where we changed the business,
we changed the technologies,
and we changed the way that we're
using our technologies.
We've had to change the
skills that we have.
Finally, and probably
most importantly,
we've actually changed the
forecast processes from being
experience-driven to being
very much data-driven.
What it looks like is
something like this.
This is the front page, very simple.
You have an overview and it's
possible to go in and
look at the total intake,
the forecasts, and a
number of different KPIs.
When we first started with
this in IT, it was basically the
forecast that was important.
The rest is something that
people outside of IT,
the core team,
have actually done using
only Power BI, simply with
the knowledge that they have
about the needs of the business.
Again, on these slides the numbers
have been taken out; normally there's
more information on them.
But it is possible then to dive into
the data to drill down
on the countries.
This is showing actual
intake, for example.
What's also important for Arla is
to know the percentage of fat
and protein in the milk.
This will give that overview;
this is actual data that
we're showing here.
If we move on, it's possible
to compare countries.
But most importantly,
with this solution,
we're seeing both the result
from the machine learning model,
the forecasted result,
and the actuals,
which are also illustrated here.
That's important for the business,
and it's what they
use when they're planning production.
That was the example that I just
wanted to share with you here.
>> Yeah, that's awesome.
Thank you so much for
sharing that, Kasper.
It's cool to see how your solution
really has made it so much
easier to make these reports
and allow Arla Foods to be focused
on that data-driven culture.
I see here you have some
key drivers for adoption.
Could you go into a little
bit about those and what
has been able to help
you be successful?
>> I can, yeah. The top one here is
a realization that for us at least,
it's been equally important to
have the Data Foundation as it has
been to have Power BI in resolving
some of these challenges.
Self-service wasn't resolved
just by introducing Power BI.
It can only really be resolved
once we're also in control
of the Data Foundation,
which is where Azure then comes in.
That's been hugely important for us.
It's also been very important to
have a very strong
technology team who can
think in architectural terms
and put together a good architecture.
We have a very strong
architecture team,
as well as a very strong development
team that we hired
specifically for
this solution here.
Then change management is,
of course, always important.
But when you're doing
something this big,
we're basically changing the way
that we work with data
across all of Arla.
Then it becomes super
important to have
a dedicated change
management function.
This is also what we're
ramping up now,
both in terms of making sure
that people use
the tools correctly,
but also that they know
what the opportunities
are with the Data Foundation
and with Power BI.
It's important to show the
business value right
from the beginning.
We didn't build the Data
Foundation just to build it.
Every time we built something
in the Data Foundation,
we also built an application on
top of it so that we
could show value.
Governance is very important.
It's important to have a balance
between agility and control.
From an IT point of
view, it's, of course,
very easy if we can
control everything.
But the value isn't
as great, of course.
It's finding this balance
that's important.
Then of course,
an executive team that trusts
that we can change and that we can
deliver relatively complex
technology projects
has also been very important for us.
I think that's what I
wanted to share with
you here. Thanks, Lauren.
>> Perfect. Thank
you so much, Kasper.
I really appreciate your time
and I know the work that you
put into this presentation.
Thank you so much for your
participation and for being here.
>> Thank you.
