The following content is
provided under a Creative
Commons license.
Your support will help
MIT OpenCourseWare
continue to offer high quality
educational resources for free.
To make a donation or
view additional materials
from hundreds of MIT courses,
visit MIT OpenCourseWare
at ocw.MIT.edu.
RAJESH KASTURIRANGAN: My
name is Rajesh Kasturirangan.
I'm one of the
co-founders of ClimateX,
which is one of the co-sponsors
of this event along with Fossil
Free MIT and many, many other
organizations on the MIT
campus.
We also have non-MIT people
here, some of whom are MIT alums.
So Jeff Warren is
one of our speakers.
Britta Voss, who is on Skype
over there is another speaker.
And then we have Nathan
Phillips and Audrey Schulman.
So we have a really,
really fantastic lineup.
But let me just explain
what we are doing and why.
So ClimateX-- the idea is to
create an open climate learning
platform for the whole world,
but starting with the MIT
community and then
broadening that
to the greater Boston area.
So IAP, as some of you know,
is the Independent Activities
Period at MIT.
And that's the time when we
do all kinds of fun things.
And there are many, many
climate-related courses
being offered, and we
decided why not bring them
under one umbrella and
call it The Climate IAP.
So if you go to
sites.google.com/cliap you will
see all of the courses
that are being organized.
And so that's across
the spectrum, everything
from climate science, to
policy, to energy negotiations
in places like Mexico.
So what we're doing
here is to say
how can citizens
directly take action
which is grounded in science?
And we have actually some
fantastic speakers here today
who have contributed to
that in many different ways.
Our first speaker, who will
be introduced by Britta,
will be Jeff Warren.
And Jeff Warren is one of the
founders of Public Lab, which
does, you could say,
community science
around environmental questions,
both building the hardware that
allows you to sense
environmental variables
and the discussion and
analysis that comes
from collecting that data.
We have Nathan Phillips
and Audrey Schulman,
who have done some
great work together
on gas leaks in the
greater Boston area.
And those gas leaks will be the
focus of not just this session,
but also the next three.
So we have three more
sessions after this one.
There's one on the 23rd,
which is a data hackathon.
So we're going to take a data
dump from Audrey and Nathan,
and we're going to do really
fantastic things with it.
And then, if you're really
interested in seeing where
these gas leaks are, we're going
to go on a tour on the 30th
across the
Cambridge-Somerville area
and do some gas
sensing on our own.
And once that's
done, we're going
to come back on
the 1st of February
and say, how do we take this
and make that work for us
in the public interest, right?
And so generally,
I think the flow
that we are trying to prototype
here in these four sessions
is that citizens can work with
scientists and policymakers
and others to directly
take charge of the climate
challenges that affect
them wherever they are.
And that by doing so, we can
contribute to climate action,
but climate action that's
grounded in knowledge
and not just pure advocacy.
So I think that's a really
fantastic new opportunity that
did not exist even
a few years ago.
So I'm really, really
happy that we have
a wonderful cast of speakers.
I'm going to turn it
over to Britta Voss
to introduce today's session.
BRITTA VOSS: Great.
All right.
Thanks, Rajesh.
Can everybody hear me?
RAJESH KASTURIRANGAN: Yes.
BRITTA VOSS: OK.
So my name is Britta Voss.
And I'm an MIT alum from 2014.
So I just wanted to
start off with a sort
of a brief overview
of our motivation
and the idea behind
Community Science.
And so we called this
From Community Science
to Community Action.
And that really gets at
sort of a larger motivation
for this series of seminars--
of taking science and putting
it to use for people.
And it gets at
the mission of MIT
as an institution of using
science for [INAUDIBLE].
I just want to start off
really, really broad here
and ask the question of
what is science even for?
Why do we have science?
And my interpretation
of this is that humans
are naturally
curious, and we want
to understand the
world around us.
And we also have needs.
We need food and shelter
and transportation.
And so science has this
double purpose for humanity.
It feeds our curiosity, but it
also helps us solve problems.
And it gives us a
process and a framework
for addressing both
of those issues.
And we all know
that science is very
important in modern society.
It's pretty much everywhere
you go, from your smartphones
to social networking
to the systems
and the infrastructure to
make modern life possible.
And although we're all
very aware of that,
very few people have
a direct relationship
with science, either by doing
it themselves in their day-to-day
lives, or even through
other people in their lives.
And so another motivation
for this seminar series
is that we're looking
for an angle that
will help us make science
more relevant to people,
make people care
more about science,
because it has a very
important role in our society.
And despite how
important science is,
we all know that
science is often
misused and mischaracterized
at shockingly high levels
of our leadership.
When you see very prominent
people making comments
about scientifically
false issues,
and even to the point of someone
like a US senator bringing
a snowball to the
floor of the Senate
to prove that climate
change is false.
So there's obviously
a lot that needs
to be done to make
society more aware of
and informed about science.
And so how can community
science address that?
Community science
is a way of making
science speak for communities.
So climate change is probably
the best way, or the best
example, of an area of science
where community involvement
is critical.
Because here, for
example, you can
see the effects of climate
change on agriculture
affect people's livelihoods.
They affect the economy.
And people who depend on
agriculture, which is not just
all of us through the food
we eat, but also people
who make their livelihood
from farming,
need to understand how
climate change is affecting
agricultural productivity
from droughts and wildfires
and all sorts of issues.
And so science can
help them with that
if it is directly
addressing their needs.
And then just a
few more examples--
climate change is
affecting water temperature
in rivers which has effects
on migratory fish populations
that serve as important
cultural and economic basis
for Native American populations.
Nuisance flooding in
cities like Boston,
and especially in South Florida,
is becoming a big problem
for quality of life and economic
vitality in a lot of areas.
Communities in the
Arctic are literally
falling into the sea
in some places because
of thawing permafrost.
And then, of course,
in northern Alberta,
you have tar sands mining
that's wiping out the forests,
and it's dumping lots of
toxins into local rivers.
And these toxins are going
downstream and making
First Nations communities sick.
And so by producing independent
science for those groups,
you can help them fight
back against these sorts
of environmental threats.
Communities living downwind
of coal-fired power plants
similarly are at
serious risk of things
like mercury contamination,
particulate aerosols,
and other negative
health effects.
And so if these communities
have tools of science,
they might be able to
file lawsuits or come
to their policy makers
and say that they need
a certain regulation or policy.
And another good
example, of course,
is fracking, where the local
communities, especially
if there's wastewater
injection going on,
might be at risk for groundwater
contamination, induced
seismic activity, and lots
of other environmental risks.
So the motivations
for community science
would be empowering
communities by giving them
ownership of their
data and responding
to their specific needs.
And with respect
to the scientists,
it increases public
awareness and interest
in science, which is
important to making sure
that science is still a
part of decision making
and that science is
supported long term.
And also importantly,
especially in the context
of climate sciences,
making sure that scientists
have access to local knowledge
that they might not otherwise
be aware of.
But of course, there's
also challenges.
So just like with
traditional science,
community science needs to
ensure the data is high quality
and that it can be used for the
purposes of the community that
needs it.
For instance, that it will
hold up in a court of law.
These projects need to
have long term support.
So if you're looking at a
long term monitoring program,
it can be hard to make sure
that that's financially
viable over the long term.
And then, the community's
needs might change over time,
if the environmental
threats change,
or if the make up of
the community changes.
And so research projects need
to be able to respond to that.
And I'm sure Jeff can
talk more about this--
just a few ideas
about what can make
community science successful.
A few key factors are making
data and methods open source
so they're freely available
and open for discussion.
And similarly,
open communication
between the community
members and the scientists
themselves to make sure that
everyone is working together
and not for their own purposes.
And probably the
most important thing
is making sure
that the community
is involved from
the planning stages
and not just
brought in later on.
And that's basically
the key difference
between community science
and what's usually
known as citizen science.
And finally, just in terms
of the appeal of community
science, it's important
to encourage creativity,
both from scientists
and community members.
And I think one of the really
important aspects of community
science is that it can
be a synthesis of tools
and ideas from different fields
that might not happen naturally
in traditional
scientific enterprise.
So just, finally, some
examples-- there's
probably many more out there.
But Public Lab, as
Rajesh mentioned,
you're going to hear from
Jeff about it pretty soon.
And then there's also
the EarthWorks Community
Empowerment Project, which gives
these forward-looking infrared
cameras to communities
that want to monitor
air pollution from local
operations such as fracking.
And then one I just learned
about is Safecast, which
was created in response to the Fukushima
nuclear meltdown and
gets communities
scientific tools for monitoring
radiation contamination
in their communities.
So with that, I will
turn it over to Jeff
to tell you about some more
specific tools for community
science.
[APPLAUSE]
JEFF WARREN: You know, as Britta
said-- thank you, Britta--
I'm one of the
founders of Public Lab.
There were seven founders.
And I'll get into a
little bit about where
Public Lab came from.
But I really wanted
to talk about
what makes Public Lab
different, and what
that has to do with
some of the topics
that we're going to dive
into in this course.
I titled the talk
Renegotiating Expertise.
Because I think there's
kind of this moment
we're in now where
people are beginning
to be more aware of the
mechanisms of expertise
and where they are working and
where they need improvement.
And so I had some thoughts
on this, and these are--
yeah, they're preliminary.
So I'm eager for the
discussion portion
of the talk to just sort of dive
into some of these questions.
And also, they're not
necessarily right.
But I'm going to
put them out there.
And I'm eager to
hear your thoughts.
So Public Lab does what
we call community science.
And this involves supporting
community knowledge production,
which means creating
bridges and shared spaces
between formal expertise
and community needs.
So in the picture
above, you can see
a group that is on the Gowanus
Canal in Brooklyn, which
is a Superfund site.
It's heavily contaminated
with polycyclic aromatic hydrocarbons
and raw sewage.
I think somewhere in the
order of 300 million gallons
of raw sewage go into
the canal every year.
And that's actually
part of how the New York
sanitary system works.
I don't think there are any
current plans to change that.
That's it, functioning properly.
But this picture is actually
the day after Hurricane Sandy.
And folks in the
Brooklyn sort of chapter,
or sort of local
group of Public Lab--
Public Lab's an open
community, so anyone can join--
went out in canoes, as they had
done many times before in part
of their monitoring
of the cleanup,
and took a bunch of remarkable
images of lots of stuff
having been washed
into the canal as well
as also some of the
infrastructure that's
been put in place,
like these booms,
to prevent pollution
from entering the canal.
This is next to what
is now a Whole Foods.
So this boom was actually
added in response
to previous monitoring by that
group of the construction site.
And I think folks
sometimes misunderstand
what Public Lab is.
Like a friend once
told me that it's great
that we're helping the
public to understand science.
And I think that is part of it.
But that's really not the core
function or the core purpose
of Public Lab.
I think Public Lab is
different because we
focus a lot on the
question of who over what.
We're not necessarily
teaching people about science
exclusively.
We're trying to negotiate a new
relationship between science
practice and the public, perhaps
a more equitable or mutually
beneficial relationship.
And that involves a lot
of obviously big issues.
But I think just
through our work,
and in trying to support
communities facing pollution,
the question of how
our expertise works,
how it functions,
comes up a great deal.
And that's something we've been
sort of receptive to coming
from communities
we've worked with
and tried to understand deeply.
These questions like
who builds knowledge?
Who is it for?
Who asks the questions?
And who understands the answers?
These are pretty deep questions.
And I doubt we'd be able to
unwrap them all in this session
today.
But they're pretty fundamental
to some of the issues
that we're going to
talk about later.
I think what's key
is that we're really
trying not only to seek to make
science findings accessible--
I think that is
important-- but also,
its methods, its tools, its
structure of participation,
and the depth of
participation that people
have in how science functions.
This means both making
more accessible on-ramps
to make it--
I won't necessarily say easier--
but I think accessible is
a slightly different shade
than easier.
But it also means
challenging what's
possible in science practice
by leveraging things
like peer production,
open source
as Britta mentioned, and things
like the maker community, which
I think is changing our
understanding of what
technology development can
do and how it can function.
So just for some
concretes, you may
have heard of Public Lab's
balloon mapping project.
This is our oldest project.
And we developed this technique
with a number of communities
in the Gulf Coast to
monitor the BP oil spill,
to take aerial photographs
in very high resolution
of spill-affected sites before,
during, and after the spill.
And basically, you just
attach a camera to a balloon.
I mean, it's easy
to say it like that.
But there's a lot of
little things about it--
in how you connect things up
with string and rubber bands,
in how you archive the
data and interpret it.
And it really is this whole
embodied research project
in a community that
is primarily made up
of nonscientists or
nonprofessional scientists,
we'll say.
This is a good example.
There's a group, two
people in a canoe,
again on the Gowanus
Canal, this time
in the middle of the winter.
In the box-- and
it's hard to see
with the color on the
projector, but this
is a large plume of raw
sewage that's on the Canal.
As I mentioned, this
happens all the time.
So people who live
there are really
familiar with when and
where it happens, how often,
and what volumes.
And they've structured
their research project
based on their understanding,
their deep knowledge
of this particular site.
And one thing that
they discovered-- oh,
it's not in this picture--
later slide-- teaser.
So we also focus on
making your own tools.
Now I wouldn't say this is a
prerequisite or an absolutely
necessary portion of our work.
But it has been a really
important part of it.
We've managed to engage
pretty large numbers of people
in constructing tools
and experimental setups.
For example, paper
craft spectrometers--
optical range spectrometers
built around a webcam,
and doing comparative work using
different sample preparations,
and in some cases, ultraviolet
light to induce fluorescence.
And so this is just a graph of
how many people have actually
built and uploaded data
using a spectrometer
that they built themselves.
This graph is, I think,
the past 52 weeks.
But overall, almost 10,000
people, which I think
is an interesting
project for us.
So all in all, people
come to PublicLab.org,
they post their work to
share it with others,
but also to ask for help.
These people might
be scientists.
Many of them are.
But they're just as likely to
be educators, to be hobbyists.
And the group we're most
interested in serving
are those community
groups who experience
environmental
problems firsthand.
So I guess it's a big question.
Why do it yourself?
Why go beyond simply
dissemination of science
knowledge to the public?
And I think there's a bunch
of different reasons to this.
But this is sort of the crux.
In some ways, it's
because experts, I think,
often have a pretty narrow
conception of where the public
could become involved.
For example, public
dissemination of science
is part of most federal grants.
There's some portion
of it where you have
to communicate your findings.
This is an area that
people, I think,
are making good progress on.
But involvement in the
design of experiments,
in the formulation of
research questions,
in the interpretation and
application of those findings
to real world scenarios--
those are often
considered outside
the scope, sometimes
even of science practitioners,
but certainly outside the scope
of a partnership with
a community group
facing a challenge or a problem.
Of course, I think with
the do it yourself kits
and so forth, the cost barrier
is definitely a factor for us.
It's hard to get
more people involved
in performing science, doing
science, and understanding
science--
any of those-- unless there's
cheaper instrumentation.
This is not true for all
fields, but it's certainly
true for some.
But I think really to answer
this question more thoroughly,
I think we need to take
a few steps back and try
to better understand how
shared knowledge is produced--
the key word there being shared
knowledge, not just knowledge
that's held by
scientists, but knowledge
that is commonly held,
which I hope is the goal--
and how expertise works.
So a depressing slide, I know.
But this is The New York
Times' Upshot sort
of meta-poll of polls.
So they're listing all of the
projections of the outcome
of the November 8th election.
Obviously, the data
didn't fit the outcome.
But I do think it's
an interesting case.
In part because it
has a lot to do with--
in my eyes, it has
a lot to do with how
expertise is represented today
and how it's communicated.
How are our projections
or predictions made?
This isn't representative of
that many forms of science,
but I think it's a
relevant data point.
And specifically, why
and when people trust
these kinds of projections--
and I'm not necessarily
calling these wrong.
I think there's
something really--
I'll get into this
in a moment, sorry.
So data and its
interpretation increasingly
drives decision
making in our society.
And this is something
that happens a little bit
outside of the scope
of what we typically
understand as science
practice, but it
is an important ramification.
And I just want to suggest this.
I think you can see how this
might become a problem, not
in itself, but where it
displaces, where it happens
at the cost of a more discursive
mode of debate in a democracy.
And I really am not saying
that we should use democracy
to do science.
What I'm saying is that there is
a relationship between the two
that we need to
better understand.
And I think this could present
challenges not only because
of possible biases--
I mean, there's clear
problems with science
being paid for in
certain spheres as well
as ideological issues
and their relationship
with science in Congress
as was mentioned earlier.
But I think also it has to
do with some of the areas
that Public Lab is
focusing on, which
may be the least objective
parts of science--
the selection of problems
and questions to pursue,
and of course, the application
of science's findings.
These are sometimes
outside the scope-- yes, please.
AUDIENCE: I don't want
to derail us but--
JEFF WARREN: No, please.
AUDIENCE: Did you say that
data and its interpretation
increasingly drives decision
making in our society?
I think there's a common
belief that, sadly, opposite
is now true.
JEFF WARREN: Oh, timescales--
I mean, the last
200 or 300 years.
[LAUGHTER]
JEFF WARREN: Sorry.
Very-- yeah.
AUDIENCE: But there's--
I mean, one of the
reasons I'm here
is because I have great
concern that we've
lost this notion of truth and
falsehood in public discourse.
JEFF WARREN: Absolutely.
I desperately want
to talk about that.
I'm being a little round
about, so I apologize.
Yeah.
So I mean, as you
said, it's concerning
when people lose trust.
This is a graph of the 48
hours surrounding the election,
and the projections of
the election's outcome.
And it's a really depressing
graph to look at for me.
I found it really interesting.
This is The New
York Times' Upshot.
But I found it very
interesting the language
that fivethirtyeight.com
used, and a lot
of other data-driven
analysts are increasingly
using, to tune how they
communicate certainty.
And this is something
where, in the days following
the election, you heard some
analysts talking about, well,
we said it was 70-something
percent or whatever.
And that's not-- that's
actually not very certain.
You know, there's something
hidden in that or something
that needs to be unwrapped about
the communication of certainty.
And I think it's
a real challenge.
I don't know that people
have answers to this,
but it's something
I'm interested in.
I know they sometimes would
say things, like, more probable
than making a field goal.
That didn't help me, because
I don't know anything
about football.
But they're trying
to communicate
what the graphs mean.
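[The point that a 70-something percent forecast is far from certain can be made concrete with a quick simulation. This is an illustrative sketch added for the transcript, not something shown in the talk; the 0.7 probability is just a stand-in for the forecasts discussed above.]

```python
import random

# If the favored outcome has a 0.7 chance, how often does the
# "upset" still happen? Simulate many independent elections.
random.seed(0)
trials = 10_000
upsets = sum(random.random() > 0.7 for _ in range(trials))
print(f"Upset rate: {upsets / trials:.1%}")  # roughly 30% of the time
```

Phrased as a frequency, "70 percent" means the upset happens about 3 times in 10, which is one way analysts try to communicate that such a forecast is not very certain.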
You know, it's easy to just
look and see all blue dots.
But it's a very different
thing to understand
what the ramifications are
for how reality plays out.
And then, of course, yeah--
this is the big thing.
That sort of scenario plays out
on a lot of other narratives,
right?
Adjacent displays and
communications of data--
many of you may have seen
this Bloomberg thing.
It's very interactive,
extremely data dense.
Like, there's so many
studies and so many data
points that have been summarized
and metasummarized to create
something which communicates,
I think, very effectively
about warming trends.
If you haven't used it,
go and play with it.
It's really, really interesting.
And so, you sort of have to
ask why isn't it persuasive
to everybody, you know?
Because it's pretty good.
And I think it's easy
to demonize experts
for not being good communicators
when things go wrong.
I think a lot of
complex knowledge
is communicated in pretty rich
and pretty interactive ways.
It's not just learn
this by rote, you know?
AUDIENCE: Is that the name of
the tool, compare and contrast?
JEFF WARREN: It's Bloomberg.com
What's Warming the World?
And I think it's pretty great.
So I think, yeah,
with such a wealth
of data and such persuasive
communication of that data,
with all the tools
we have today,
what is-- or is-- something
broken about expertise?
And I think that, in
some cases, people
are very much afraid that
there is something broken,
maybe not about all expertise,
but about some portions.
You have a thought?
AUDIENCE: --comment again.
I don't think
expertise is broken.
But I think there's a
feeling among experts
that no one has the patience
or wherewithal to listen.
JEFF WARREN: Yeah.
AUDIENCE: And when you
add that to the conflation
and obfuscation
of fact by people
who really are pure
advocates, and kind of have--
whatever the interest
may be, whether it's
to show up to their party,
whether it's to curry favor
for any position--
JEFF WARREN: Funding.
[LAUGHS] Yeah.
AUDIENCE: --that seems
to have overwhelmed
the voice of reason and
fact-- it's just my opinion.
JEFF WARREN: I agree with that.
I think the way that I'm
using the term expertise here
is potentially trying to
understand it in a wider scope.
Which is to say
expertise could be
defined as a body of
knowledge which is contained
or known or collected.
But what I mean by broken--
when I'm using
the term here, I'm
talking about it as a set
of relationships as well.
AUDIENCE: [INAUDIBLE].
JEFF WARREN: Expertise-- yeah.
And relationships with experts--
and who are experts,
how are they identified,
how do we trust what they say?
How do we, if we are
experts, communicate
in a trustful manner to people.
There's a whole set of issues.
AUDIENCE: The scientific
method was supposed
to be the solution to that.
But I think
everyone's just too--
JEFF WARREN: Well, let's
not give up on it yet.
AUDIENCE: But we can't
push a button [INAUDIBLE].
JEFF WARREN: Yeah.
It's true.
AUDIENCE: --wayside.
Forgive me.
I'll [INAUDIBLE].
JEFF WARREN: No, no, no, please.
And thank you, no.
I'm glad you're engaging.
Because it's something I
think about a great deal
and have thought about,
especially recently.
So Harry Collins is not
popular in all fields.
But he does do a very close
and careful examination
of different kinds of expertise.
And I think it's a
very interesting thing
to think about
what distinguishes
different kinds of expertise.
And one in particular that he
talks about is meta expertise.
And it's the ability to
distinguish expertises,
the ability to compare and to
choose an expert among several
who are purporting
to be experts.
And he says, you
know, and I think
this is a persuasive
point of his,
that it's a particularly
difficult one,
but it's one which many people
are called upon to have.
It's one that is often based
on long term reputation.
It's based on, in some
cases, relationships,
personal relationships,
and it can sometimes
be affected by a different kind
of expertise, which he calls--
I think he calls it downward
discrimination expertise, which
is essentially the
ignoring of one expert
because you perceive
a different expert
to be of a greater authority.
So I don't know about every
observation he's made.
But I do appreciate the
taxonomy he's created
and the attempt
to understand what
are the mechanisms
that allow expertise
to occur in our society.
And I think the question for
Public Lab and for some of us
is what do we do about
the widening gap?
Because although
there is a tendency
to think that the ability
to question expertise
is driving a wedge,
that the democratization
of knowledge production is
an assault on expertise,
I actually think maybe there are
a few other dimensions to that.
And although I'm
not going to say
that's not true in
some ways, I think
that there are other ways we
can think about it as well.
So what Public
Lab tries to do is
to focus on problem definition.
So this is the earliest
stage in the sort of sequence
that might encompass
scientific inquiry.
And staying close to
real world problems--
Britta mentioned
communicating with people
as early as possible, building
products in collaboration
with groups that face problems,
engaging them in the problem
selection, in the formulation of
questions, and in some cases,
in research design.
There are specific
expertises and capacities
to formulate an experiment.
But those may be, in
some cases, the places
where it's most likely that
you would learn something
from a group that has
deep understanding
of a particular problem,
first hand knowledge.
So I'm really interested
in that potential, and in,
really, collaborating as
much in asking questions
as in answering them.
But what are the
sources of mistrust?
I think there are
many, but I'm going
to try to dig into
a few of them.
I think one of them is limited
ability to evaluate or test.
So this affects,
perhaps, climate science
more than almost any
other type of science,
although I guess the
LHC is another example.
But how can people
evaluate empirically
what climate science is saying?
It's not very possible.
You can observationally
do it in some cases.
But understanding that in
a context is difficult.
And I mention this
one mainly because it
underlies a lot of what
we do at Public Lab.
Public Lab's not primarily
interested or not primarily
engaged in climate research.
We're primarily engaged
in pollution research.
But we take it as
a powerful thing
to be able to empirically
verify something.
And that's why we're
focused on low cost
tools and democratization
of the technologies.
But this is linked in
climate to the following--
when processes are too big to
see the feedback loop personally.
When you go and
you do something,
it's one of the
longest feedback loops
that we are confronted
with in research.
But yeah, oh, sorry.
I already mentioned this.
But basically, we do
focus on testability
at Public Lab on the question--
can you also build this?
Do you also get the same result?
And this is a picture of one
of our spectrometer prototypes.
Someone literally, like, tweeted
a picture and a link to plans.
And someone else built one
and tweeted that they had,
as close as possible,
reproduced this.
Harry Collins talks a lot
about the infinite regress.
What's the-- anyway, whatever--
I'll get back to it later.
AUDIENCE: Sorry.
Is that name Harry Collins?
JEFF WARREN: Harry Collins?
Yeah.
Yeah.
I'll talk a little more
about him later, too.
I should probably start
moving a little faster.
A couple of others--
environmental issues
affect someone else.
I think this is one where
it's not just about--
I think there's many
sides to that one.
It's a tough one.
But I think
increasingly people are
understanding
environmental problems
as ones which affect people.
That's a major step forward.
I think the environmental
movement had
been very closely associated
with conservation,
and I think
conservation is great.
But I do think it is
important for people
to recognize that there
are justice issues at stake
with communities that
are facing pollution
and don't have a way to
respond to it, or sometimes,
even, to understand it.
But increasingly, pollution
is affecting everybody,
and the climate is
affecting everybody.
And I think this is an
opportunity for common cause.
The other one is one that
affects poor communities
perhaps more than others.
And that's that they
have very limited ability
to respond, and in many
cases, to question.
And therefore, they
already have the experience
of having been lied to and hurt
by industries, and sometimes
by the scientists that
those industries employ.
I know this is a difficult
one for all of us.
But I think that if
you talk to communities
who face pollution
firsthand, this
is a very common experience.
And it's unfortunate.
Harry Collins actually
mentions that he
feels that the fact that we are
upset when we see that there
has been an exchange
of money which
has influenced the findings
of a research project--
we are upset because we
know that that's wrong.
Because there's something
essential and fundamental
about science which is being
broken when that happens--
so complex, but interesting.
So OK, so what can I do as a
scientist or a technologist?
These aren't the same
thing, but the question
might be relevant to both.
Tough-- we're going to
try to get into this.
I have some ideas,
four broad ideas.
This is an article which
I found very interesting.
It recaps a lot of ideas which
Public Lab has championed
over the last six years.
But it also shows
how difficult it
is to have an
articulate conversation
about these things,
because it is very complex.
The subtitle is maybe
more important--
experts need to
listen to the public.
I went into the
comments, all right?
I know that's not always
a productive place to find
things.
But for once, I
actually thought it
was really, really educational.
Yeah.
So I'll just read it.
"No, scientists need to do
science, not run a PR campaign
and become marketing experts.
They aren't trained to do that.
And it's silly to
expect them to.
What the rest of us
need to do is invest in
the school system"--
well, that's interesting--
what the rest of us--
so this gentleman does not
identify as a scientist--
--"is invest in the school
system that we've basically let
rot in many places so that
our citizenry has knowledge
of the scientific method
beyond the third grade level.
If they understand
what science is
and what it has
accomplished, then they'll
appreciate its value.
It's the job of
the public schools
to teach this, not
career scientists."
There's almost too much in that
statement for me to peel apart.
But we'll try to get to some
of these questions as we go.
And I'm not putting it up here
because I think this person is
completely wrong.
I'm putting it up here because
it's a series of statements
that have some value.
I think that it is
overlooking other things,
but the next two
are even better.
"This boils down to
wanting scientists
to basically add
some responsibilities
to the number of things
they have to do already,
yet it doesn't
seem to dangle much
in the way of tangible
money for that extra work."
True-- TLDR-- less
science, more photo ops.
I think that wasn't
a helpful comment.
But I think it's reductive in
a way that is helpful for us
as we're looking
at this problem.
So educate yourself.
That's what the first
commenter is saying.
But actually, I want to say
it to everybody, including
scientists and technologists.
I think it's really important.
Because we tend to think, and
we're taught science, often,
in the public schools
somewhat historically.
Where did it come from?
How long has it been around?
Why does it work this way?
And how did it develop
into what it is today?
I mean science studies--
MIT has a great department
of science, technology,
and society.
You know, basically,
I think it's important
not to be naive about this.
Understand how the field
works empirically as well
as theoretically.
As in, you know,
how do we aspire
for it to work versus
empirically, how can we measure
it to be working or not or in
what ways, who it's benefited
and how it developed over time.
This is one thing that
I really respect folks
like Harry Collins for
trying to understand, apart
from the different ways
that people have actually
come up with to understand it.
I mean, Harry Collins
is just one perspective.
Part of this, I
think, is vocabulary.
And just about this
particular topic
that Public Lab
is engaged in, you
might have seen three
different phrases.
You'd come across
these three phrases
to describe closely
related ideas.
Public Lab uses
community science.
It's a term that we've
helped to define.
In part, we've used it
because there's actually
two definitions of citizen
science, which are competing
and quite confusing.
There's Alan Irwin's 1995
definition of citizen science.
Rick Bonney describes
it as a methodology
for engaging a large
group of people outside
of science practice in
performing data collection.
For example, doing bird
counts, submitting data,
being an extension
of science's ability
to interrogate the world.
And this is a very
powerful thing
that I think that
Public Lab uses as well.
But actually, I think
Public Lab is perhaps
more inspired by the older
definition of citizen science
by Irwin.
And Irwin described the
work of HIV activists
in the '90s and earlier
who included AIDS patients,
and who were involved in
drug trials in early AIDS
treatments.
And they organized.
They protested.
They did die-ins at the
National Institutes of Health.
And ultimately,
they gained what--
and again, I'm over-referencing
a term by Collins--
interactional
expertise, which is
that they could read
and debate papers
and peer-reviewed research.
They could challenge the
structure of drug trials,
and they successfully
did so, persuading
those who ran the trials
to modify how they worked.
And in some cases, they did so
in an extremely disruptive way
to the researchers.
Which is to say they
sometimes exchanged
the drugs they
were given in order
to intentionally mix
placebos with nonplacebos
because they found it
to be unethical to do
double blind research on
people who are suffering.
So it's a complicated
story, many sides,
many, many important
aspects of this.
But what happened was not
that scientists, per se,
decided to include
people in their research,
but that they were
persuaded to do so.
And they eventually did so,
some of them, voluntarily.
And collaborated with
activists, in some cases,
in order to recruit
for new trials.
So there were constructive
collaborations that led out
of this sequence of events.
And it's a fascinating history.
It's a fascinating set
of new organizations
or new relationships between
people who did not originally
have almost any
kind of expertise
besides the immediate expertise
of being a victim or a patient
and people who had
expertise of the kind
that we are more familiar with.
So OK, fascinating,
and difficult
to distinguish the two now
that the terminology has
been overwritten.
So Harry Collins--
also Sandra Harding,
another controversial
figure, but one
who I really appreciate.
She wrote Whose Science
and Whose Knowledge?
And she talks about
the relationship
of feminist epistemology
with scientific research.
And she just has so much to say.
It's amazing.
But one thing that
I really appreciated
was her focus on the
selection of problematics,
the choosing of scientific
questions as an area which,
well, as she was writing in
the '80s, was understudied,
she felt. So she has a
lot to say about that.
Harry Collins has a book--
Are We All Scientific
Experts Now?
Spoiler alert, no.
[LAUGHS] Definitively,
he says no.
And I'm persuaded by
a lot of what he says,
but not by all of it.
Collins also did a
really interesting sort
of retrospective of
this set of studies
from after the
Chernobyl disaster.
He wrote a piece in the
early 2000s looking over
that work called The
Science of the Lambs.
He's part of a group of
scholars who are very punny.
But he looked at the
studies of radiation's
effects on sheep in Cumbria
and other parts of the UK.
He worked with
Trevor Pinch as well.
Basically, it's, like,
it's complicated.
But he looked at how researchers
trying to map out and quantify
radiation did and did not
succeed in working with farmers
and building bridges between
the farmers' knowledge of water
flow, of exposure,
of site conditions,
and the farming practices,
and their own expertise
to rich conclusions.
Tough one, but a really
interesting read,
and a fairly short one?
So Sandra Harding,
as I mentioned,
who asked the questions which
science attempts to answer--
I think it's a
really important one.
I don't know.
I mean, I know, but
got to dig into that.
So OK, some tough ones here.
The possibility that
scientists' practice today does
have blind spots,
and specifically
when it comes to other forms
of knowledge production.
Not to say it's not
interested, but there
are new forms of
knowledge-- well,
new-ish forms of knowledge
production emerging.
And I really want to be clear.
I'm absolutely not
saying we should try
to recognize climate denial.
No.
That's not the kind of blind
spot I'm talking about.
[LAUGHING] I think
that's really part
of a parallel discussion
about the influence of money
in politics and science.
And it's one I'm not even going
to try to broach necessarily
in this session.
I'm talking about
the lived experience
of those who suffer from
environmental problems.
And to some degree, this sort
of humble recognition of our own
limits and unknowns.
And especially on questions,
critical questions
of environmental harm.
So number two-- in terms of--
I wanted to tell the
story of this picture.
That's a sunken
boat, so ignore it.
This is-- I wrote
over it with letters, oops.
But there's a darker
thing here, right?
And that's actually melted ice
as water came out of a pipe
on the side of this canal.
And that wasn't on the
original engineering surveys.
And it wasn't in the
EPA's data on this site.
But it's an active inflow.
There's water and whatever
else coming out of it,
off of a construction site.
And it's just a good example.
There's a group who lives there.
They go by that site every day.
And they can do kinds
of observations.
This is data
collection in a way.
But they did it not because they
were contributing to science
in a sort of a noble way,
but because they're engaged
in the problem, you know?
And they're critically
monitoring this site.
They're watchdogging this site.
And they're trying to hold
the abutters, the construction
sites, and the potential
polluters-- they're
trying to hold their
feet to the flame.
They're not objective.
But they were able to
submit data, including
this photograph and
others, that updated
the understanding of the site
and influenced the cleanup.
OK, interesting, and
specifically, it's
easy to take for
granted when you're
speaking with your colleagues
where your expertise comes
from, what kind of certainty
you're communicating.
And this is something
that, you know,
when people read the so-called
Climategate e-mails, insider
talk is structured
in a certain way.
It's hard.
It's not designed to
communicate to all audiences.
But when you are
communicating with people,
especially outside of the group
that you work with immediately,
how do people know where
your expertise comes from?
I mean, you know,
I think titles,
degrees, credentials help here.
But they're not the whole story.
And there's this really
interesting sidebar.
The Quechua language
group in Peru
has this fascinating quality.
Which is that it has a
suffix which indicates
the source of your knowledge.
So you can say the same thing
and indicate grammatically
whether you heard it from
someone else, whether you
experienced it firsthand,
and several other forms
of empirical context.
And I often wish that I
did a better job at that.
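The evidential-suffix idea maps naturally onto tagging data with its provenance. Here is a minimal, hypothetical sketch in Python (the class and category names are mine, not from the talk) of attaching a "source of knowledge" marker to each claim, the way Quechua does grammatically:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical evidential categories, loosely modeled on the
# distinctions Quechua marks with suffixes.
class Evidence(Enum):
    DIRECT = "witnessed firsthand"
    REPORTED = "heard from someone else"
    INFERRED = "inferred from other facts"

@dataclass
class Claim:
    statement: str
    evidence: Evidence

    def describe(self) -> str:
        # Render the claim with its knowledge source made explicit.
        return f"{self.statement} ({self.evidence.value})"

claim = Claim("Water flows from the pipe at night", Evidence.DIRECT)
print(claim.describe())
```

A community monitoring project could use a tag like this to keep firsthand observations distinct from secondhand reports when submitting data.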
Full disclosure, in
terms of communicating
how you get your
expertise, I'm not
a scholar of science studies.
Although I'm a fan of
it, as you can tell.
I also have no formal
science training.
I just have some thoughts.
I think interactional expertise
is what Collins talks about,
the ability to speak the
language of science, which
is on the way to being able to
perform at science, to actually
do science.
It's hard to develop.
He notes that AIDS
activists were able to do so
with a lot of hard work.
He also ran this
interesting experiment,
hard to know what to make of it.
But where he did a quiz
along with a number
of gravitational
wave researchers,
and then showed the
answers, his answers--
he studied the community
for over a decade.
But he doesn't do
gravitational wave science.
And actually, I think,
seven out of nine of a panel
were unable to
distinguish his answers
from those of practicing
gravitational wave scientists.
And that wasn't to say
that he thinks it's easy.
He did this for decades.
He worked with these
folks for decades
to acquire that level of
interactional expertise.
But what he's trying
to say is that there
is a fine line of distinction
between being able to communicate
and critique and interact with
people in a field of expertise
versus being able to design
and perform experiments.
And I don't think it's a
matter of dumbing things
down when we talk about
inviting other people into work.
I think, as the commenter
said, that scientists
aren't necessarily the best
at communicating knowledge.
But that doesn't mean
that they're off the hook
necessarily or that the
burden is on, exclusively,
everyone else.
I think that there has
to be some teamwork here.
I'm going to move
forward, because I think
we're running out of time here.
[TAPS PODIUM]
I did want to say--
let's see, OK.
I know that we often talk
about mass communication
and so forth.
But when outsider
groups are more
able to challenge
expertise, we're
living in an interesting time.
I think there are positive and
negative ramifications of this.
I think that the limitations
of science practice,
that capacity, budgets,
some of the things
these commenters very
clearly articulated,
the fact that science isn't
suited for every problem we
have on Earth, you know?
It's not the end all, and it
cannot contain all knowledge.
But I do think that there are
alliances that may be formed.
I mentioned the maker
community, the hacker community.
But also, environmental
justice groups
who have worked for
decades to do science,
but to do it to answer
questions about threats
to their own health, to find
relationships between knowledge
production and justice,
social justice,
and who have been doing their
own monitoring and
watchdogging, often with
very good relationships
with the researchers who
choose to work with them.
Yeah, again, I
think you can look
at groups who use aerial
photography as Public Lab does
or Google Street View to
investigate pollution issues.
There are more empirical
means at our disposal today.
And I wanted to mention, sort
of wrapping things up here,
that Public Lab's participating
in the Environmental Data
Governance Initiative.
So Public Lab began
as an effort to create
an independent record of the
BP spill, a separate data set.
But with the
transition happening,
the presidential
transition, EDGI
is an effort to
download and archive
EPA data before the
transition potentially
cuts off access
or destroys data,
as actually has happened
in previous presidential
transitions.
It's sort of a
breakneck effort that's
been put together
over the past 10 weeks
to literally, like, scrape
and download everything
that the government
has online currently.
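The core of an archiving effort like that is simple: for each fetched page or file, store the bytes alongside a timestamped, checksummed record so later copies can be compared against the original. This is only an illustrative sketch, not EDGI's actual pipeline, and the URL is made up:

```python
import hashlib
import json
import pathlib
import time

def archive_snapshot(url: str, content: bytes, out_dir: str = "archive") -> dict:
    """Save fetched content with a checksum and timestamp for later verification."""
    digest = hashlib.sha256(content).hexdigest()
    record = {
        "url": url,
        "sha256": digest,
        "fetched_at": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    d = pathlib.Path(out_dir)
    d.mkdir(exist_ok=True)
    # Content and metadata are stored side by side, keyed by checksum,
    # so an altered or removed dataset can be detected against this record.
    (d / f"{digest}.bin").write_bytes(content)
    (d / f"{digest}.json").write_text(json.dumps(record))
    return record

# Hypothetical example: archiving a small CSV fetched from a government site.
rec = archive_snapshot("https://example.gov/data.csv", b"site,pb_ppm\nA,12\n")
print(rec["sha256"][:8])
```

The checksum is the important design choice: it gives an independent record that the archived bytes are what was actually online at that date.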
Anyway, I mostly-- you
know, I don't have anywhere
near all the answers here.
But I'm trying to
ask hard questions
and propose ways forward.
I really am trying to find
places to build bridges
and to build alliances
and not walls.
And I think that
getting closer to people
personally, getting to
know people personally
who are outside of
your particular circle,
is really powerful.
To learn what people
know, what they need,
even if you don't always
agree, and primarily
to not assume that information
flows only one direction.
So OK, I'll put the
hardest question up.
Is bad science, like science
that doesn't serve the public,
or that is misleading,
is it science gone wrong
or is it science as usual?
Is there something
fundamental about the way
that we're doing
science today that
needs to be reformed
in some way,
or are there a
number of bad actors
who are taking
advantage of science?
And really, those are sort of
two sides of the same coin.
I mean, in the sense that
if there are bad actors,
we could reform science
to try to stop them.
And Collins says that--
well, I mentioned this sort
of idea of when we abhor--
when there are bad actors,
when we can recognize
when it is going wrong.
And then, really, these
are things that I'm
sure people have thought about.
But you know, is science more
inclusive as a profession?
Is it more inclusive
in its conclusions?
And I guess is the broader
direction of science,
and specifically, its questions
more than its answers,
simply what we make of it?
And I'm very clear
in that distinction.
Because I don't mean that its
answers are what we make of it.
But I do mean that we can
choose to pursue inquiry
in different directions.
And we can choose to
structure what we ask,
even if we can't choose to
structure what we find out.
Thank you.
[APPLAUSE]
AUDIENCE: I've been involved
with data collection
in numerous ways that citizens
have called for.
One was with lead in
the soil, and the other
was monitoring the
river out here.
But now I see a really,
really important area
for citizens in our
communities in having
the skill to, you
might say, cross-examine
the experts, especially
around infrastructure projects.
And I'm thinking
of the gas projects
in Massachusetts, where
there've been a lot of hearings
and the scientists,
or I would say,
the utility representatives
have a lot of expertise.
JEFF WARREN: Yeah.
AUDIENCE: And so there
you have a great deal
of information on their part.
And then you have these limited
opportunities for citizens
to raise their hand,
like, wait a minute,
aren't we getting too
overdependent on gas.
And what we don't have
is equivalent ability
to question the basis for
how do you make decisions
about these things and
being able to influence
the decisions that are made.
So I see a gap there with
whatever community ability we
can --
We need help in that vein.
JEFF WARREN: Yeah.
There's certainly
an asymmetry to it.
And it's very
difficult to have--
I mean, for example,
self-reporting is
a common mechanism in
terms of regulations
for producing knowledge
about emissions
or about potential pollution.
But self-reporting is
not blind, you know?
It's telling people
what you did.
And often, like in
Louisiana, a lot
of, say, smokestack emissions
are based on estimates.
They're not even actually
based on empirical measurements
that you perform yourself
as an operator of a gas
facility or a refinery.
So it's very alarming, because
the standard of evidence
is almost meaningless.
It's like, I think
we probably, maybe,
emitted this much
lead last night.
We're next to a community,
like a residential community.
So it's very troubling.
I mean, part of this
asymmetry is cost
as well as expertise.
And I think the equipment to
measure gas, if it's cheaper,
it makes a lot of this easier.
But it's not the whole
equation, for sure.
One example I wanted to share
actually that I forgot was--
so a lot of the
community groups we've
worked with in places
affected by oil and gas
will grab sample measurements.
So they take a bucket,
and they use a vacuum,
and they suck air into a
gas bag inside the bucket.
And then they mail
that entire bucket
to a lab to get a
certified test done
of analysis of the contents.
And what is nice about this is
that it allows these community
groups to choose, based
on their deep knowledge
of the patterns-- like, do
the facilities typically emit
at night?
Do they emit at certain times,
certain days of the week?
Are there signals?
Like, is there a flare up that
you see, or is there an alarm,
stuff like that
that enables them
to structure when they take the
samples in order to sort of,
like, catch the emission
at the right moment.
What's nice about it also is
that it's a standardized test.
So they can send it
to different labs.
They're sort of, in a
sense, made it a service
that these labs provide,
as opposed to the people
collecting the sample being
the service part of it.
So it's sort of inverting
that in a nice way.
And the thing that was
really remarkable to me
is that several of these
communities that we've
spoken with and worked with
will actually not trust labs--
one of them didn't
trust in state labs.
And one of them just
doesn't trust a lot of labs
in general, because
they feel that there
may be some of these
labs do work for
and accept money from
oil and gas companies.
And so what they did, which
was really remarkable to me,
is they faked samples.
They made positive and
negative control samples.
And then they sent
those to labs at a cost
of hundreds of
dollars per sample,
in order to test the labs.
And only after confirming that
these labs would correctly
report different levels of
preprepared positive and
negative samples, did
they then use that lab
for their own real sampling.
And you can imagine that
if community groups are
already under resourced and
find it difficult to marshal
their resources to
do testing at all,
it's not trivial to spend all
that money to establish trust.
But I thought it was a really
interesting example of science
being done on
scientists, you know?
And I think it's a positive--
it's actually a positive thing,
you know?
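The lab-vetting step the communities did can be thought of as a blind tolerance check: the lab passes only if it reports values close to the known concentrations of the prepared controls. A minimal sketch, with hypothetical numbers and thresholds (the real acceptance criteria would depend on the analyte and the lab's stated detection limits):

```python
def lab_passes(known: dict, reported: dict, rel_tol: float = 0.2) -> bool:
    """Accept a lab only if it correctly reports blind control samples.

    known: sample_id -> true concentration (ppm) of each prepared control
    reported: sample_id -> concentration the lab reported back
    """
    for sample_id, true_ppm in known.items():
        measured = reported.get(sample_id)
        if measured is None:
            return False
        if true_ppm == 0.0:
            # Negative control: any clear detection is a failure.
            if measured > 0.5:
                return False
        elif abs(measured - true_ppm) > rel_tol * true_ppm:
            # Positive control: must fall within the relative tolerance.
            return False
    return True

# Hypothetical controls: a blank and a sample spiked to 40 ppm.
known = {"blank": 0.0, "spiked": 40.0}
good_lab = {"blank": 0.1, "spiked": 43.0}
bad_lab = {"blank": 6.0, "spiked": 12.0}
print(lab_passes(known, good_lab), lab_passes(known, bad_lab))  # True False
```

Only a lab that passes a check like this would then be trusted with the group's real field samples.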
