For many years now
it seems like we've
been engaged in a debate
that questions and challenges
the relative importance that
we place on science in society,
and that which we place on arts.
And the idea behind this
session came about really,
from a feeling that
there was a reflection
of that arts versus
science debate happening
within the discipline of
public and social innovation.
So on the one hand,
we see the rise
of science-based
approaches, that
is systematic experimentation,
such as randomised control
trials, which seek to
understand how the world works,
rigorously testing hypotheses
in controlled environments.
And the spectrum of that
work is quite wide,
but it has some shared
common characteristics:
being systematic,
being logical,
paying a great deal of
attention to data, and a quest
for empirical evidence.
And then on the other hand, you
have more design-based methods.
And those are focused
on creativity,
origination, coproduction,
applying design principles
and techniques, such as
creative ideation processes,
prototyping and visualisation.
And those methods
are not asking,
how does the world work?
But what's needed
to solve a problem?
What doesn't exist
yet that should exist
and how could we
make that thing?
Now as innovation labs
multiply across the world,
so too does the adoption
of these methods,
and therefore it feels
quite important really,
to apply a critical gaze.
And when you look at these
views and methods together,
it raises a number of
quite intriguing questions.
So where do good ideas
really come from?
What counts as evidence and
what doesn't count as evidence?
What counts as scientific?
How do we judge the
relative successes
of these different
methods when we're
looking at sustained,
transformative innovation
rather than just incremental change?
And are we really looking at
a suite of tools from which we
can collectively
pick and choose,
depending on the
context and the goal?
Or do they really represent
some very different
underlying philosophies
about the world and therefore
differences about how
to support innovation?
Regardless of
approach, are we actually
still left with the
perennial challenges of scale
and diffusion or
does one method set
the conditions more favourably
for scale than the other?
And if working in this field,
indeed in any field,
tells us that there's nothing
as permanent as change,
are these approaches
that are in vogue today,
going to be the approaches
that are in vogue tomorrow?
And if not, just what is it
that stands on their shoulders
and to what extent can
we predict and shape
those methods ourselves
through experimentation
and through dialogue?
So, to explore that
and more, I hope,
I'm joined by David Halpern
and Christian Bason.
Both David and Christian's
work will be well known to you,
I'm sure, as both have really
advanced our understanding,
knowledge, and
methods for supporting
public and social innovation,
and yet, seemingly
come from two quite
different places.
So David, founder and chief
executive of the Behavioural
Insights team that you heard
about a little while ago,
former chief analyst
in the Prime Minister's
Strategy Unit, founding
director of the Institute
for Government.
Before entering government,
David held tenure
at Cambridge and posts
at Harvard,
and he's the author
of a number of books,
including Social Capital
and the Wealth of Nations.
Christian Bason, chief executive
of the Danish Design Centre,
also known through books
and other extensive writings
on the subject of design,
innovation, and management,
in the private
and public sector.
And many of you will know him,
I'm sure from his former role,
leading the Danish
Innovation Agency
MindLab where for
years, he worked
on developing and
conveying design
methods both within Denmark
and internationally.
So, as leaders in your
fields, I wanted to thank you
for coming here today.
I want to ask each
of you in turn
to present some thoughts to us,
and we will take it from there.
So perhaps Christian,
you could join us first.
Thank you.
[APPLAUSE]
Last December I was
doing a workshop
for newly appointed directors
in the European Union.
And the title of the workshop,
or the training session,
was I think,
Innovation and Design.
And when I came in
the door, the
group of about
25 people looked at me
and started laughing a
little bit, and one of them
said, so, where's the chair?
Where's the beautiful
Scandinavian Danish design
chair that you were supposed
to bring for the session?
And I said, OK, I guess
this is the starting point.
Let's just start by talking
about what design actually is.
And so, I guess
that's what we're
going to be exploring a
little bit this morning.
So first of all,
thanks for having me.
And it's really amazing
to see a lot of people
that I've also worked
with in various ways.
It's like looking at a global
and emerging community
of labs.
So it's really great to
be here just to say that.
And also to say that, one
thing that Helen didn't mention
is that my background
is in political science.
And that I worked for 10 years
in a consultancy in Denmark,
building a practise
around evaluation,
randomised control trials,
performance management,
and I also helped start up
the first National Evaluation
Society in Denmark.
So, why is it that I've
then left public policy
or performance
management and evaluation
and started to embrace design?
What's up with that?
And also why have I now taken
up the role as chief executive
of the Danish Design
Centre which has a mission
to strengthen the use of design
in business and in governments
and to professionalise,
help professionalise,
the design profession
in Denmark specifically.
And I would say that
why I've done that,
and why I'm so interested in
design is because I've seen
personally and also
organizationally what it can do
to transform public
organisations and private
organisations and how it can
change people's lives in ways
that I have not seen a
randomised control trial do.
Or a beautiful 250-page
evaluation, and I've
done a lot of 250-page
evaluations in my time.
So just very briefly
on what is design,
because it was mentioned
a few times already.
And as you can imagine, I would
say it's a little bit more
than a chair, a beautiful chair.
But design of course,
is about giving form.
It's really the knowledge space
about how do we give form?
How do we create, you could
also say, desirable futures?
And just to say one of the most
famous designers of chairs,
Charles Eames, some
of you might know him.
Charles and Ray
Eames, a couple that
have designed a lot of
beautiful, beautiful chairs.
Some of you may even
own one, an Eames chair.
He was once asked,
what is design?
And that's when you
get really curious.
What would a designer say
to define his own profession
or her own profession?
And Charles Eames
said the following,
design is the process
of arranging elements
in such a way as to achieve
a particular purpose.
Arranging elements in
such a way as to achieve
a particular purpose, that
was his definition of design.
So if you're a policy maker
or leader in government
or in a social organisation,
and you think about your role
in changing and
transforming organisations,
isn't that really about taking
elements such as finance,
services, processes, regulation,
and rearranging those elements
in ways that achieve a better
purpose, a better intent,
better outcomes for
people, for society?
So that is, I think, a very,
very broad starting point
for design.
But what it really does say
is that it's future oriented.
I would also say that
design in many ways
is about systematic creativity.
And I tweeted earlier
that a blogger wrote
about different
dimensions of design,
but you can certainly say
that design unfolds
between arts and science.
So to give an example,
in Denmark the other day,
I was at the commencement
services for the Royal Danish
Academy of Design and
Architecture and Conservation.
That's a very artistically
oriented design education.
And just from the way
that the students dress,
you can see that these are
not really business people.
And they may not
even thrive too well
in the business environment.
I don't know.
But they're very artistic,
very sensitive, very concerned
with how human beings
experience surroundings.
They're concerned with
materials, with textile,
with colour, and are extremely
skilled in working with
materials to create particular
expressions and particular
futures.
But on the other hand, in
Denmark we also have the Danish
Technical University, the
university of science, where you
have a design education
called design and innovation.
And where basically we
produce engineers who
are good at engineering design.
And between those
two spaces, you also
have master's degrees
in service design.
And you have some
professionals who
are working on how do
you transform services,
both physical and digital.
So think about the processes
of arranging elements:
those processes can
deal with everything,
from graphics to products to
services and really to systems.
And so what you're doing, if
you are in any way engaged
with creating futures, or
creating some kind of change,
you are actually
engaging in design.
The question is, are
you any good at it?
And so what is the value
of design practise?
And let me just say
a few things also
to sort of set things up
a bit for David here
in our little science versus art
battle that's happening here.
So first of all, I would
say that design helps us
with imagination,
with discovery.
So as Helen said, where
do ideas come from?
If you set up a randomised
control trial between one
sort of solution
and another kind of solution,
the ideas still have to come
from somewhere.
They don't come out of the blue.
And designers and
other creatives
are really good at coming up
with ideas in the first place.
And also, often the way you
are inspired to get new ideas
is by conducting, for
example, design research,
ethnography, coming up close,
doing qualitative observation
studies, participant
observation,
to get ideas about what might be
solutions in a certain context.
So that's first point.
I have four points about what
design can help with here.
The second point is to
give the form and the shape
to the kinds of
solutions we actually
want to put into society.
So when David's team is drawing
up a new letter or new text
message or a new
digital service,
that's a design process.
And how do you then shape, give
form and colour and structure
to those artefacts,
those physical things,
you put into the world?
That's of course also
why you need design.
Thirdly, the holy grail of
what all of you are working on
is that word, scaling, isn't it?
Diffusion, growth,
scale, how do we
take small
experimental solutions
to national scale
or global scale?
And I don't believe that
scale can happen just
through evidence.
Telling people,
this worked here.
You should probably
do the same here.
In Denmark, on average
it takes 17 years
for a proven medical
procedure in health care,
17 years before it's scaled
across the Danish health care
system, even though
it's evidence based.
And that's in a profession
where you kill people if you
don't do the right procedure.
It's a profession where doctors
and nurses are kind of engaged
in trying to help
people, and still it
takes them 17 years to take on
the [? best of best ?] methods.
So what does it
take to go to scale?
Well, what it takes
to go to scale
is that you yourself
are part of the design
process as a professional, as
a social worker, as a nurse,
as a teacher.
There has to be an element
of invention and creativity
involved.
It has to make sense
to you in your context.
So you need, yes,
you need evidence,
but you also need a creative
process of understanding how
that evidence can fit into
your context in your setting,
in your organisation.
Finally, a final word on the
fourth thing that design can do
is to introduce
something different
from the rational and logical
and analytical in society
and for citizens.
What about magic?
What about enchantment?
What about experience?
Public services are not
just about what works.
They're about, how is it experienced
to be a child in a school?
How is it experienced
to be a patient?
How is it experienced to travel
through the health care system?
And that's empathy and
human understanding
of what happens when we
encounter public services.
It's not fully rational.
It's also very, very
subjective and that's
what design can help us with.
Final 30 seconds on two
critiques of design,
just to say that as a
political scientist, of course,
I'm not a designer.
And there are things designers
are not that good at,
and that you should be critical of.
One is that designers tend
to try to invent something
new every time.
But there is evidence
out there, there
are things that work out
there, so of course, designers
should also be aware of
what's already been done
and what seems to work.
And the second critique I
would say is that as much
as designers are skilled
at empathising with users
and citizens, they're not
always so good at empathising with
organisations or
with bureaucracy.
And that goes for business
and public sector as well.
So designers may need to
develop their empathy skills
a little bit, understanding
and empathising with systems.
Thank you so much.
[APPLAUSE]
Thank you Christian.
David, would you like to take--
I was very tempted to
wear a white coat today.
I should have really done that.
But let me at least-- what
I thought I'd try and do
is three things,
what I was going to try
to get across today.
One is I want to talk about
failures, for a bit, which
we don't talk enough about.
Second, I was going to talk a
bit about essentially where--
actually I do have quite
a strong view, which
is that with this kind
of fashion for labs,
and it's cool in here with these
kind of chemical structures,
and you can see a
little beaker there,
a glass beaker
against [? Gav ?] lab,
we've actually quite
often lost the essence
of what a lab is, which
has got built into it
experimentation,
really important.
But the final
thing, in fact I was
going to start with
is, I thought I'd talk
about some organic chemistry.
I thought it would be good
for you to learn some today.
But I should ask,
how many of you
do actually have a sort of
harder science background?
Very good.
Oh, lots of you.
Very good.
So you can correct me
if I get this wrong.
I'm a bit rusty on my
ole-- I started life
as a natural scientist
because of Cambridge.
So I'd thought
start with a story
from organic chemistry, which
is about benzene, for those
of you who don't know.
So benzene was originally--
benzene has been known
about for roughly 500-odd years.
But it was in fact Faraday, in 1825,
who first, here in
the UK, I should say,
actually managed to isolate
it in its pure form.
But we didn't really
know what it was.
He didn't know what
the formula was.
And it was more than 40 years
before a paper was published
suggesting what he thought
the structure of benzene was.
[INAUDIBLE] OK, cool.
Don't know how many of you
remember this from your school
chemistry, but here we are.
This is the formula you'll
have learned at your school.
Right.
So it's got six
carbons, six hydrogens,
it's very simple
you might think.
But it turns out it's
really hard to figure out
what the formula is.
What does it actually look like?
So Kekule published
his paper in 1865,
suggesting that in
fact it was a ring.
And he subsequently
told the story about,
how did he come
up with this idea?
And so, just I'll
tell you actually
what the structure
is, just so you know.
So you-- little learning point.
Here we go.
It's like this.
Now there, you've got your--
your carbons are all around
in the ring.
And they've all got a
little hydrogen on them.
Anyway it's kind of obvious
afterwards, it turns out.
But it wasn't too
obvious at the time.
And Kekule had the
idea from a dream.
He had a dream in
which he saw a snake.
And the snake bit
its own tail. That was 1862.
And it gave him the idea.
He'd been working on it for
seven years, that actually it
might be a circular structure
and that the carbon was
wrapped around on itself.
And it actually
took a long time.
In fact it wasn't until
1929 that it was finally
confirmed this was the
structure of benzene.
But the point about it, why
I'm telling you this story
is that it was actually a very
creative, crazy thing, the idea
that he had this
dream of the snake.
And it turned out
that he was right.
It took a long time to
figure out that he was right.
And that's why a lot
of us know the story,
and if you know anything
about organic chemistry,
you might know this
particular story.
But he could've been wrong.
And actually he really
could have been wrong.
And quite often,
people were wrong,
which is why I want to
talk about failures.
So lots of things
that people think
are a good idea, whether
they come from a dream
or wherever it is-- where
do ideas come from?--
just very often
turn out to be wrong.
And so it's really important
for us to remind ourselves.
Particularly since we are
all subject to optimism bias,
particularly about our own ideas,
which always seem really good
compared to everybody else's.
So lots of ideas
turn out to be wrong.
For just an example
close at hand,
a year ago I was reading in the
"Evening Standard" that a
London borough had
got this fantastic idea.
You'll recognise it as
soon as I describe it.
There was a big picture of a
kid in cuffs, handcuffs,
and the idea was, they were
going to be taking school kids,
sticking them
in the back of a police van,
locking them up, and taking
them around a prison.
And it was going to show
them that you don't want
to go down a life of crime.
And in fact, there was a
comment from the Minister
as well, saying this
is a great idea.
Great idea.
It was such a great idea that,
of course, a lot of people
have had this idea across
the world, lots and lots
of times-- such as
"Scared Straight," which
can be delivered for a few
dollars a time in the US.
It's too bad that actually,
in fact, almost exactly a year
before this article in
the "Evening Standard,"
there had been a
systematic review,
showing that "Scared
Straight" makes
kids more likely to offend,
by quite a big margin, by 60%.
So this is true in
lots and lots of areas.
Lots of good ideas don't work.
Really important for
us to understand this.
It's true in many areas.
It's true in medicine.
Sometimes the phrase is
used by medics, who say,
the parachute test.
We shouldn't test
parachutes systematically.
Let's chuck 10 people out
of a plane with a parachute
and 10 without, and
let's see what happens.
And they use this sometimes
to say, of course,
you shouldn't do a trial on this
particular medical technique.
It's obvious that it works.
It just turns out that
actually quite often, things
which people are
convinced would work,
would be effective, when we
finally do do a trial,
turn out to have been,
in fact, killing people.
And even at BIT, let
me just share with you,
we've had lots of
things that didn't work.
We've had lots of
things that did work
and it's been a
great success, but we
have had examples of
things that didn't work.
In a recent example that
will be published shortly,
there was a great
idea, actually partly
from lab-based
work, funnily enough,
that if we show people pictures
of their home in infrared,
where you can see all
the heat leaking out,
they'll be much more likely
to go and insulate their home.
So we ran a trial,
sending people letters,
and each had a picture of a home.
And we matched the home
as closely as we could.
And you could see all
the heat leaking out.
It made them much
less likely, it
turned out, to then go
on and get insulation.
So what's the point about this?
It's that we
need to marry the
creative process, which
of course, has got a place,
with systematic testing.
We need to have that kind of
humility about what we do.
And we are trying to do
that on an increasing scale,
even not just about
Behavioural Insights.
So in the UK, we have had the
Education Endowment Foundation
run more than 90 large-scale
studies in the last few years,
87 of which are randomised
control trials, involving
more than half a million kids
and 4,000 schools in the UK.
And lots of things
they looked at
have turned out not
to be very effective.
So it's really important
that we do that in everything
that we do.
And I wanted to kind of
show you, if you like,
it's a dirty secret.
I've got one last diagram.
Because actually Christian and
I are great friends and so on.
And really we believe that these
things can be pulled together.
But I'll just show you a
mnemonic, which we sometimes
use ourselves.
So you all know
the design diamond.
You know something like
the double diamond,
that whole creative flow.
It's not quite as pretty as
benzene, but it's you know.
And we overlay on it
a simple mnemonic.
Right?
T-E-S-T. Test.
So what's it stand for?
First of all, clarity.
What is your target?
What are you trying to answer?
What is the question?
Often we're not clear
enough about that.
And we also try and ask
early on, what's your data?
What's your measure?
What data's there?
If you ask early on,
it's really powerful.
We then go into explore,
yes, be creative.
Think about the solutions.
Think about what the
evidence is telling you.
Fine, that's the upswing
of the design diamond.
And then we drive into
what's the solutions?
Prototyping, not so different.
As you start to narrow
down.
And then finally, we
conclude with the T is trial.
Run a trial.
Figure out whether
it's actually true.
Did it work?
So I mean, that's kind of my
main [? axiom. ?] Obviously,
you can go along pretty [? ultra. ?]
I feel very strongly
we do need to marry
this kind of systematic approach
into innovative approaches.
And I'll just leave you with
one final physicist,
in order to complete the
slight hamming it up
on the science side-- it's Feynman,
Richard Feynman, a fantastically
creative scientist
who did quantum
electrodynamics, for those people
interested in what he actually
did.
He also wrote a lot
of accessible books
about lots of subjects.
And his main point that
he would urge people,
not only in the scientific
community, but more
generally, essentially,
was to embrace humility.
Embrace humility--
that is the essence
of the scientific method:
that constant self-doubt about,
is it true what you're saying?
And if we can do that, if we can
marry that into our innovation
approaches, I think
we get something
which is much more true to
the original lab conception,
and actually is much more likely
to be effective over time.
Thank you.
Great.
[APPLAUSE]
Thanks, David.
I think, yeah,
the idea of having
a plurality of approaches
is kind of core
to what this session is
about, and although we're
sort of pitting
you against each other a little bit,
I think there's clearly
a deep appreciation
of each other's work.
I just wanted to
ask you a question,
whether you could come back to
that, because there are some
criticisms of design not
paying attention to the rigour
of evaluation and testing.
How do you approach it?
Yeah, so as I
mentioned one issue
is, when you start
on a design brief,
many designers would
just start by doing.
And that concreteness and
that specificity of saying,
let's just do something.
Let's try something out,
is of course, a strength,
but it's also the
weakness that you
don't start with
major desk research
on what is the available
evidence of best practises
and what works.
So I think there's something
that should go on maybe in parallel:
at the same time as you begin
to work on a brief, to say,
well, what do we
actually know already?
And then of course, this
exploration of context.
Because just because we know
that this programme didn't work
in California or did
work in California,
who says it's going
to work in Copenhagen?
So I think that's-- and you can
say that humility of a designer
to say, well, somebody may have
actually explored this before.
I mean, no problems are
entirely new somehow.
And I think the other one which
is clear in David's slightly
boxy design model, is that
once you arrive at a point
where you have some
options-- and design
is all about creating options
for decision-making, right?
Then clearly there
is a point where
you would like to, more
systematically test and try
out, maybe even at scale, to get
statistically founded evidence
of what seems to work.
I would say that one important
thing is that designers
experiment all the time.
The designerly way
of arranging elements
is not the analytical way.
It is the experimental way,
which means it's prototyping,
it's testing out, it's trying
out what makes sense to users.
It's throwing sketches
and throwing models out
and trying new ones.
So I would say that inherently
in the design process, driven
by professional designers, it
is an extremely experimental
process, and also a lot more
messy than this beautiful
linear model here.
But just to say,
I think there is
a curiosity and experimentation,
again, around whether there's
the possibility of a
marriage, in a sense,
between a designer's
way of working
and a more evidence-based
way of working.
And just before we go out
and have some questions,
because I'm sure there
are lots in the audience,
I had sort of a question
about how you think
about people in all of this.
So it might be too
simplistic, but it
feels like your science-based
methods are more about research
on people, so at a distance,
objective analysis of what's
going on.
And with the design
methods, it feels
like it's more like
research with people,
if we can call it that, sort
of more cooperative inquiry,
action research-based.
Now, that might be sort of
too simplistic a kind
of divergence, but I do wonder
about how then that affects
scaling of innovation.
Because it feels like
you've got two quite--
I don't think it's
a matter of style.
I think it's something
quite fundamental
that sits underneath all that.
David.
Well, the Behavioural
Insight work
is definitely-- of
course, by the way,
yes, we do a lot of, effectively,
a kind of method-acting
approach to policy.
Like if you're going to
look at the energy markets,
let's actually try and
change supplier or whatever
and see what it
looks like and feels
like before you talk
to the PM about it.
So we do do a lot of that.
We do spend a lot
of time with people,
trying to sound out a context.
But you're right, there is a
subtle difference, which is not
true for all
lab-based work, but we
spend a long time, of course,
in the behavioural and
psychological literature.
And it makes us a little
bit more sceptical
about what people say.
So the accounts that they
give about why they do things,
we tend to be
slightly distrustful,
which is one of the reasons, of
course, why it drives us back
to a strong form of empiricism.
Because if you ask someone,
if we wrote you a letter which
said, most people pay
their tax on time,
do you think it would
affect your behaviour?
They might well say no.
They certainly
will say, you know,
do you think the
plate size you eat off
affects how much you eat?
Many will say, of course not.
Of course, it hugely
affects how much you eat.
So you only establish that
through empirical observation.
We in turn, though,
do think you
have to marry it with
some sense of asking
for public permission when you
start to use these approaches.
But there is a certain wariness
we have, though I think it also,
I hope, brings with it a deep
sympathy and understanding
that people are phenomenally
complicated and interesting
subjects of study
in their own right.
So, one thing that no one has
mentioned yet this morning,
but that's missing,
is leadership.
It's the role of the
decision-makers in government
that ultimately
make the decisions,
whether it's child
services, whether it's
energy policy, whatever it is.
And in my experience,
and this is
also some of the
doctoral research
I've been doing over
the last three years,
it is the discovery at a
very, very personal level, even
an emotional level, as a
leader, as a decision-maker
in government, of
what is right to do.
What should we be doing?
The insight into policy
programmes that are miserably
failing to address problems,
miserably failing to help
families and people and so on,
and the understanding at a very,
very personal, fine-grained
level of how we need to change.
That's been missing from
the conversation so far.
And the only way
I've seen that happen
is through insights.
It's insights into what
are we doing today?
How might we change it?
And it is really about
supporting and facilitating
a process, engaging
people in discovery,
where leaders and
managers and their teams
discover for themselves
what they should be doing.
And this is not a plug
for my new book, which
will be called "Leading Public
Design," 2016, Policy Press,
but it is to say that there's
a missing dimension which
is also a people dimension.
We have to remember, there
are people making decisions
every single day
in government that
have huge consequences
including PMs.
Maybe not them so much, but the
others have big consequences.
And we have to remember that
what we're dealing with here
is influencing decision-making.
Remember everything we talk
about today is ultimately
about influencing
decision-making
about the allocation of
resources in our societies.
And we have to take a look at
who are those decision-makers
and what are we doing
to influence them?
And it takes more than
statistics and more than
convincing numbers to
convince people to change
their behaviour.
Great.
Thank you, Christian.
We've got a couple
of minutes left
and I want to ask if there
are any questions that we
want to pose?
Yes.
Do you want to say your
organisation and your name.
So, thank you very much.
I'm Samir Doshi from USAID.
And I had a question kind of
trying to link up the humility
factor with the discussion
around evidence and evaluation.
Normally, as you've discussed,
the evidence kind of goes
to external accountability, but
I'm wondering about internal
accountability and how you would
practise aspects of reflexivity
in your ongoing processes.
So it's a question,
whether an RCT
is appropriate in
the first place,
rather than just going
to a narrative that's
strongly established.
How and when do we pick?
It's really a question of how
do we understand the world,
isn't it?
And what is-- a question
of what is true?
And what I've experienced
personally and in my research,
is that leaders who
see for themselves
three or four examples,
cases, rich examples
of how citizens experience
a process that they
are responsible for, as it
fails, that's enough for them.
That's truth enough for them.
That's evidence enough for
them to begin to change.
They immediately start
in the experience
of what's going on, to think
about new ideas, new solutions.
So I would say that
that epistemology--
a way of
understanding the world,
of gaining knowledge
about the world--
is very, very powerful
and transforming.
And whether the
other epistemology,
the way of working with
statistics and significance
tests and so on, is sufficient
to change the world,
I'm not so sure.
I would say, if you
want reflective,
inquiring organisations
and leaders,
you definitely want the
qualitative piece in there
as well.
One final question for David.
Chloe [? Serovich ?]
from Teach First.
Excuse my voice.
Hopefully you can hear me.
A question for David.
So, you talked about the
importance of failure
and recognising that things
don't always go the way
you'd intended.
And I guess it links
to accountability.
So how do you instil a
culture of kind of risk taking
and acceptance of failure
in government and kind
of decision-making levels so
that you're able to have that
kind of testing approach to
design as needed to work out
what works and what doesn't?
Yeah, so we spend quite a
lot of time on this, actually
with my What Works hat on.
The key issue often
with the departments is,
if you start talking
about what works,
they say, come and
meet our analysts.
Well, actually that's
not really the concern.
The primary focus is
often the policymakers.
It implies a different
way of doing policies.
Sometimes we call it
radical incrementalism,
which is the nerve to be able
to say to a minister, actually,
we don't know.
Honestly, we don't really know.
Christian's got
some great ideas,
there's five or
six of them here,
but we don't really know if
any of them will work.
But the good news is,
we can try them out.
We don't have to go
for a single solution.
We can try multiple options
and we can test and see.
So there was lots of
discussion yesterday
in the budget around
the welfare reforms.
If you talk to the
welfare minister,
he'll say privately,
there are hundreds,
literally hundreds
of decisions that
are built into
legislation on any given area.
If you do a bit of
maths for a second,
you say, look, two to the power
of 300 decisions,
you're going to get more than
the atoms in the universe type
thing.
So you have no idea whether
you're making the right choice.
But the good thing is,
built into the legislation
has been explicit provision
to experiment and test variations.
And you do get resistance and
people say, is that ethical?
Oh, my god, you're
going to do experiments.
It's like, well, how else will
we ever know if something-- so
is it ethical to not find out?
I just have one last comment,
which particularly, we've
got a very international
group here.
I think there's one thing--
and Geoff and I have
been talking a bit about this.
One of the bits of
apparatus that's
built into the kind of traditional
science areas is that there's
lots of institutional
practises and platforms
that were developed.
And one of things is to find
out what did people already
know on this before I
stumbled into this area?
And we're not very good,
in terms of our platforms,
at doing that in relation
to much policy.
So as we build the What Works
Centres in UK and in the US
and elsewhere, there I
think, is a real need
to create a much more
cross-national platform, where
you can easily find out what
did someone already know.
Did someone already
do this review?
It would let us collaborate
and coordinate,
at least on the ground-clearing
evidentiary review work.
I know Jim's here from
the US and Bloomberg.
I think they're playing
a key role in the US.
So I'd love that to be one
of the things that comes out
of this community: making it
easier for us to figure out
what did and didn't
work before would
be an enormously valuable
asset for us all to build.
Hugely valuable for everyone in
this room, I think and beyond.
And so, we've
scratched the surface,
but our time is
up unfortunately.
I want to thank David and
Christian for showing us
that these are
not particularly
opposing methods, and that actually
there's a huge amount of complementarity
and respect between them.
Thank you very much.
[APPLAUSE]
