Hello, my name is James Mulligan and I am a
product manager within QuantumBlack.
This is the first in a series of
episodes where we will be discussing how
we have approached the codification of
our ways of working, how we have driven
their adoption at scale across the
organisation, and the benefits we are
now reaping from consistent ways of
working on all our projects.
In this first episode we will be talking
about protocols, which are the way we
capture best practices for executing
advanced analytics. When we kicked off
our internal initiative to start
codifying our ways of working and best
practices, we looked externally for
signals on how we could think about
doing that. We took a huge amount of
inspiration from our time in Formula One
where there's no better example of
continual improvement than a pit stop team.
In the 1950s a pit stop took well over
a minute to complete, which doesn't sound
very long, but the current record is
a phenomenal sub-two seconds.
For us this was an amazing demonstration of
the harmonious relationship between
people, process, and technology and we
wanted to ensure that inspired our
codification also. So, what learnings can
be gleaned from a pitstop team that can
be applied to the way we think about
delivering advanced analytics? Well both
involve multidisciplinary teams and
everyone needs to understand their role
to work effectively together, if a
process can be codified it can be
repeated if it can be repeated it can be
practiced again and again to unlock
competitive advantage. The technology
leveraged by that team must support that
process and also be continually improved
over time to drive performance
improvement. For us, a protocol also
describes how these three capability
pillars interact. A protocol ensures the
process is defined by the people, by the
team; that the technology enables that
process; and that the people choose to
use that technology. This is how we describe
a protocol at QuantumBlack. So what does
a protocol look like on advanced analytics?
Everything starts with a
question. For example: as a Data Engineer,
how do I understand and document an
organisation's data landscape? This is a
question we're asked on pretty much every
project; we have to go in, understand
what's there, figure out what we can use,
and then play that back to stakeholders.
So our Data Engineers
went away and codified what best
practice looked like for this question.
They defined how we should interview
data owners and subject matter experts,
and the questions we should ask, and
consolidated all of this into a workshop
guide that could be used by Data
Engineers across all of our offices.
Then they defined how to codify the
answers to those questions into a data
dictionary: a single source of truth
explaining the meta-information across
the data sources we might use on an
analytics engagement. On the right here
you can see a tool we've created called
Studio, which enables the team to codify
these data dictionaries but also enables
data owners and subject matter experts
to come in and validate that
information.
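To make the idea of a data dictionary concrete, here is a minimal sketch of what a single entry might capture. The field names are illustrative assumptions only; the talk does not describe QuantumBlack's actual dictionary format or the fields Studio records.

```python
from dataclasses import dataclass

# Hypothetical entry shape for a data dictionary: one row of metadata per
# column of a source table, gathered in workshops with data owners and
# subject matter experts. All field names here are assumptions.
@dataclass
class DataDictionaryEntry:
    source: str              # system the table lives in, e.g. a CRM
    table: str               # table within that source
    column: str              # column being documented
    description: str         # plain-English meaning, from the workshop
    owner: str               # data owner responsible for this entry
    validated: bool = False  # set once a subject matter expert signs off

entry = DataDictionaryEntry(
    source="CRM",
    table="customers",
    column="churn_flag",
    description="1 if the customer cancelled within the last 90 days",
    owner="Jane Doe",
)
print(entry.validated)  # False until an expert validates the entry
```

The point of the single-source-of-truth structure is that validation status travels with each entry, so a team can see at a glance which parts of the landscape have been signed off.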
Finally, our Data Engineers codified how
to play that information back to an
organisation. Here we can see a
visualisation of a data landscape, and
what we're communicating to stakeholders
is the information and data that we need
to be able to perform our analyses. This
makes it clear to an organisation what
we're going to use, what we're not going
to use, and where there might be issues
with quality that are preventing us from
validating specific hypotheses about the
problem.
Returning to the three key capability
pillars that we discussed earlier, from
this example we can see that our data
engineering team defined the process for
understanding data landscapes. We can see
that we built technology, Studio, to
enable that process, and because that
technology provides a tangible benefit
to teams in terms of visualisation and
communication, they choose to use it on
their projects. People, process,
technology: a protocol for understanding
and documenting data landscapes. But how do
we know which protocols we should invest
time and energy in codifying? We use
three criteria to determine where we
should invest our time and energy.
The first is frequency: in our example
from data engineering, we know that on
pretty much every project we'll have to
understand and document a data landscape,
so it's worth investing energy in
codifying a good answer to that question.
The second is experience: do we as
QuantumBlack have the historical
experience to give a best-practice
answer to that question? That doesn't
necessarily mean we shy away from
questions we can't yet answer, especially
at the cutting edge of data science and
advanced analytics, but it does inform
whether we can invest time and energy in
doing a good job. And third, we take a signal from
the learning ceremonies run by the teams
from their retrospectives, their project
wash-ups. Are they experiencing pain in a
specific area on a project? Can we codify
best practices to alleviate that pain?
All three of these criteria determine
whether we should spend time and money
investing in codifying best practice
protocols. We've done this across hundreds
of questions now within advanced
analytics projects. This journey will
ultimately never end: if we strive
towards continual improvement there will
always be questions to answer, and there
will always be better answers to those
questions. To date, we've codified over
150,000 words of best practice for
delivering advanced analytics projects,
all of which is accessible both to our
teams internally and to the
organisations we serve, from day one of a
project. In the next episode, we'll be
talking about our 5i framework, which is
our way of articulating the end-to-end
lifecycle of an engagement. I'll also be
talking about how our protocols and best
practices fit into that framework. Thank
you very much.
