[ Music ]
>> Will: Our first
speaker is Dave Stoney
with Stoney Forensics
in Virginia.
And the title of
his presentation is
"Time to Rethink Dust."
[ Pause ]
>> David Stoney:
Well thank you Will.
The work I'll be talking
about has an aspect of
reconsidering things from sort
of a theoretical, or maybe
a very hopeful, perspective,
and also an aspect that
introduces research work
that we are now doing
with the assistance
of the National Institute
of Justice.
The traditional focus
for forensic particle
trace evidence is
on comparative analyses
and specific target particle
types -- fibers, glass, paint --
you know, as opposed to
looking at all of the particles
that are present in
the sense of trying
to exploit all the
particles that are present.
That is a good, correct,
necessary, appropriate approach
as far as I am concerned.
But one thing it has done is
it's narrowed our perspective
in some respects.
And I want to sort of push the
envelope in that direction.
So my motivation is to rethink
this using all the dust that's
present -- well, not as opposed to,
but in addition to, the target
materials that we are looking at.
We have this problem that we
have been chasing for years.
The traces are from mass
produced manufactured materials.
And we see in the work
that's being done on glass,
you are pushing this to the
limit in a beautiful way,
milking everything you can
get out of this sort of thing.
But you still have a fundamental
limitation to this --
to class associations -- at
highly discriminative levels,
but still class associations.
And there is these
issues relating to trying
to make a judgment of the
strength of the association,
particularly to quantitate it.
And in 1996 at what I consider
the first one of these symposia,
I gave a presentation and Ed
Vartik gave a presentation.
And we were looking at
the use of databases,
and we were contrasting what
you can do in trace evidence
with what the folks were doing
in serology - I guess
it was DNA.
In trying
to create a database that's
relevant to the interpretation
of your evidence, one of the
points I made is, "Well look,
you need a standard
method first.
Because if you are going to
have things in the database,
you have to standardize
your methodology
in order to use it.
And then very importantly,
you need to decide what's
the relevant population" --
fine for something like
individuals and human beings
and blood, but very complex
when you go to trace evidence.
And then our focus is on
extremely rare events,
at least the way we
traditionally approach forensic
trace evidence.
We want to look for methods
that are highly discriminating.
And that gets us into
this syndrome of trying
to not only use highly
discriminating methods
on one type of particle,
but also then predict
how frequently it occurs.
And that's been something that's
been on my mind for some time.
I decided I'd name this:
the individuality uncertainty
principle in forensic science.
The smaller the frequency,
the larger the population
we need to estimate it.
Our population is small,
with uncertain heterogeneous
composition.
And we can't test or
reliably predict frequencies
of those rare events.
So it's thrown out
there as a principle,
something to be debated.
And maybe it's true
and maybe it's not,
but I believe it's a
fundamental problem for us.
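To make the sampling side of this principle concrete, here is a minimal sketch (with hypothetical frequencies, not case data) of how the required sample size grows as the target frequency shrinks:

```python
import math

def samples_needed(freq, confidence=0.95):
    # Smallest sample size n for which observing zero occurrences of a
    # particle type would bound its true frequency below `freq` at the
    # given confidence level: solve (1 - freq)**n <= 1 - confidence.
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - freq))

# A 1-in-200 particle type can be bounded with a few hundred samples...
print(samples_needed(1 / 200))   # 598
# ...but a 1-in-a-million type needs roughly three million samples.
print(samples_needed(1e-6))
```

The smaller the frequency we want to claim, the larger the population we would have to survey to support it -- which is exactly the bind the principle describes.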
I stated this slightly
differently in 1991.
I said that, "Our provable
probabilities will be much,
much more common than
either our good science
or common sense will allow."
And it translates
to this conundrum
with decreasing reliability
of frequency estimates
with increasing evidential
value.
I strongly suspect that
as things move forward
in decision making theory and
in the forensic statistics area,
that some of these rather
simplistic views will probably
be modified.
But I propose that
as a principle
that we can at least
talk around.
More motivations
to rethink dust:
changes in forensic
science practice.
Two very important ones.
First, technical progress:
computer-assisted
analytical methods
and data processing
capabilities.
We have seen changes
in our daily lives
relating to these things.
There are changes in our
instrumentation that we use
for trace evidence analysis.
It's naive of us not to
think that the future
of trace evidence
analysis is going
to very significantly be
changed by the amount of data
that can be processed and
the amount of processes
that can be automated.
Claude foreshadowed this in 2007.
He asked the question,
"When will we ever reach
the next level
in forensic science
trace evidence?"
There are also professional
changes.
And these - I debated whether to
call this regression or changes;
I stuck with changes:
standardization of methods
and routine analyses being
things that are expected of us;
increased specialization;
reduction of subjectivity --
gosh knows what that's going
to do to visual microscopy
when that hits; accreditations
and certifications; and pressure
to get more scientific, or more
like other scientific
disciplines.
And we have greater
community interest.
You know, scientists,
legal community,
and public are paying
attention to us now.
So there are also some clues,
though, to guide a new approach
that at least make me
want to rethink dust
but also say, "Hey, look.
There is something we
could maybe do here."
We have got these -- reiterating
a little bit -- interpretation
issues that might guide
a new approach.
We have got a limitation
of class associations.
The second one: case-specific
systematic variations
that can't be controlled.
What I mean by that is, when
you try to interpret your -
or quantitatively interpret the
significance of your evidence
in the case context,
what might be,
"This bit of trace evidence
occurs with a frequency
of about 1 in 200,"
then you get things,
"Well what about in
this neighborhood?
What about at this
particular type of case?
What about a transfer
of 50 of those fibers?
We get case-specific
things
which make it more
complicated all the time.
And then we have this
individuality uncertainty
principle that I named.
Along with that, you have
got compellingly strong
evidential value.
When you see cases like the ones
David Floor presented, you know,
you don't need
statistics for this stuff.
My gosh, you got multiple
transfer evidence.
It's as compelling as it
could possibly be in terms
of decision making and proof.
And you get cases with
many-layered paints, whether
or not correlated for
any reason at all.
So we have the ability to get to
extremely high probative value.
And - we are not limited
to class association.
Well wait a minute: We are
when we look at one particle.
But when we look at many
particles, we get there.
We get there the same
way that other folks do.
With soil analysis
there's another clue.
We've got issues and
approaches that arise
from combinations
of small particles.
Now some of the variation
that we see there would be
deterministically controlled
by one thing or another --
would be highly correlated.
So if I have got a soil that
has a particular bedrock,
it's going to have particular
types of minerals that all
ought to occur in certain,
or more restrictive,
ratios of abundance.
On the other hand, you have
got things coming in there
from the garden and
the human intervention,
the anthropogenic particles.
So those are a stochastic
process.
They have to be viewed
as being uncorrelated.
One more clue to guide us:
We have got this DNA analysis.
We have accepted
theory and methodology
to calculate joint
probabilities.
They are not getting
to it by proving
that there is one
rare substance.
Eventually they might if
they sequence a whole genome.
But they are getting
it from a set
of modestly rare occurrences.
And they can put reliable
bounds on the frequency
that those occur and the
correlation between them.
So multiple transfers of a set
of moderately rare particles
could do the same thing for us,
and they do all the time.
We are already doing it.
But they can break the
barrier of class association.
And they can address this
individuality uncertainty
principle, in that we can
now measure their frequencies
of occurrence and
their correlations.
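As a rough sketch of that DNA-style calculation (the frequencies below are hypothetical, not measured values), the joint probability of several independent, moderately rare co-occurrences multiplies out to something far rarer than any single one:

```python
import math

# Hypothetical frequencies of five moderately rare particle types
# in some reference population (illustrative numbers only).
freqs = [0.1, 0.05, 0.2, 0.08, 0.15]

# Under an assumption of independence (the DNA product-rule analogy),
# the chance of all five co-occurring at random is the product:
joint = math.prod(freqs)
print(joint)  # on the order of 1e-05
```

The point of the talk, of course, is that independence cannot simply be assumed for particles; the measured correlations between types would have to adjust this product.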
So where do we get
those sets of particles?
It's fine in that
case that comes
with the multiple
transfer evidence
and the multilayered paint chip.
But the reality is the
particles are there.
We are just used to looking
at this small set of them
because those are the ones
that, darn it, can mean the most
in a case where you can say,
"These are the target fibers.
These are the ones
that have this source."
But these other particles
are there.
They are everywhere.
They are very small particles,
so let's name them,
too: the VSPs.
We don't usually use them.
Sometimes we do.
We are mostly focused on
larger conventional traces.
Exception: GSR.
We have got some
experience in this area.
Exception: DNA.
Those are real small traces.
So here's the potential
and the thing
that got me excited once I
started thinking about it.
You know, look at
those particles there.
That's a mineral grain on the
left and a fiber on the right.
There they are.
That's what I want to get at.
Those are my particles
that I want to say, "Hey,
maybe I can't use them for a
particular source -- I know where
those small particles are.
But why can't I use those
as multiple transfer
evidence in all the cases?"
So I want to use these
fine piggyback particles.
They are on the surface of
traditional trace evidence.
And why not use those -- the
record of the fibers'
subsequent exposure
while we were wearing them
after we had bought them --
to test
that class association further?
And then every case becomes
a multiple transfer case.
So the potential
that I see in this:
There is extensive
air monitoring
and environmental health
experience in this area.
You have got the
study of respirable
or near-respirable dusts.
You have got frequencies
of occurrence
that people are studying --
not from a forensic
science perspective.
And there's plenty
that we'd say, "Ah,
that doesn't directly apply."
But darn it, this
is going on already.
There is local monitoring.
There are studies of what dusts
are present in what city.
There are studies of what
dusts are in the household.
A lot of that's focused on
things that aren't going
to be directly adaptable,
but it is going on.
We could use them maybe,
for instance, to put a bound
on a moderately rare particle.
Tracing of airborne
pollutants to their source,
automated analysis methods
-- this is all going on.
There is also specific forensic
experience in this area, in GSR.
There are people in this room
who have been doing this
for 22 years -- Brown has
already been doing this stuff.
Dave Exline's been developing
instrumentation for it.
The folks at McCrone
Associates have been doing this
for longer still -- and
the people who have gone
through that organization
and gone elsewhere.
But I believe it's of
revolutionary significance
that when we are working with
complex particle mixtures,
co-occurring particles can
be used to independently
and quantitatively
test alternative
attribution hypotheses.
And we can achieve high
levels of individuality
that can't be reached
through single-particle
frequency estimates.
Now it's different
from a couple of things
that might look very similar.
It's different from looking
for specific target particles
that are based on
the case context,
regardless of their
size, like GSR.
Right? It's different from that.
It's different than monitoring
for specific particle types,
like the applications looking
at these fine particles
in environmental
hazards and in pollutants
and in security threats.
It's different from tracing
the source of pollutants,
when you are looking at a
particular target and trying
to figure where it comes from.
And it's different from
determining what's happening
at a given site -- so monitoring
a particular type of effluent
in order to say what
might be going on.
So what is it?
Again, I thought I'd name it:
Particle combination analysis
is the best I could come
up with -- so PCA.
We want to use co-occurring
particles to independently
and quantitatively
test alternative
attribution hypotheses.
So we thought we'd test
this approach, or try to.
And we greatly appreciate
the funding that's allowing us
to do this.
Carpet fiber would seem
ideal for this purpose:
long-term exposures
in one place,
very large exposed surface area
on the surface of a fiber --
a type of forensic science
evidence that's very mature,
and we all have ways of
dealing with it already.
Carpet fibers are designed
to trap small particles.
And indoor environments
are highly variable --
as the Petracos, the folks
at MVA, have been studying
and documenting this
for a goodly while.
So testing started with the
appropriate carpet fibers.
We have developed methods
to wash these things
off and clean them.
We have got unwashed
fibers there on the top
and the washed ones there.
Andy Bowen developed these
methods in our laboratory.
They are not particularly
profound,
but they are well
tested at this point.
This slide will be in
the set here
for reference if you need it.
But here's an example
of some of the blanks --
the reagent blanks and the
process blanks in the top
and the samples on the bottom
of particles recovered.
Now they are ready to send off
to a computer-controlled
SEM analysis.
And you are familiar with
those - the type of data
that you can get from that.
And it's not ideal
data for
single-particle comparisons
in forensic science when you
are looking at individuality
as opposed to identification.
But it's suitable for
something, and if it can get us
over this barrier and get us
to be able to test the origins
of those carpet fibers,
that's really nice.
So the research that's currently
underway, we are looking
at within-item and
between-item variability.
If I take this carpet and
I go to different places,
it's going to be
just the same kind of issue
as with soil, I suspect.
You know, I don't know
how close I have to get,
but we are testing that.
So in this area of the
carpet, multiple samples,
let's find out whether these
particles are really there
and whether they
are really useful.
Well they are really there.
But how useful are they?
We have nylon carpet fibers that
we have standardized or settled
on in household environments
and automotive environments.
And we are testing how likely
a measured particle profile is
to have originated as a
randomly selected profile
from a given population --
multinomial distribution,
et cetera.
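A minimal sketch of that kind of multinomial comparison (all counts and proportions below are hypothetical, not project data): score an observed particle-count profile against the type proportions of two candidate populations and compare the log-likelihoods.

```python
import math

def multinomial_log_likelihood(counts, probs):
    # Log-probability of the observed particle counts under a candidate
    # population's particle-type proportions (multinomial model).
    n = sum(counts)
    ll = math.lgamma(n + 1)
    for k, p in zip(counts, probs):
        ll += k * math.log(p) - math.lgamma(k + 1)
    return ll

# Hypothetical counts of four particle classes recovered from one fiber
observed = [30, 5, 12, 3]
# Hypothetical type proportions for two candidate environments
household = [0.55, 0.10, 0.28, 0.07]
automotive = [0.25, 0.30, 0.25, 0.20]

ll_house = multinomial_log_likelihood(observed, household)
ll_auto = multinomial_log_likelihood(observed, automotive)
# A positive log-likelihood ratio favors the household origin
print(ll_house - ll_auto)
```

In this toy comparison the observed profile fits the household proportions far better, which is the sort of quantitative attribution test the project is after.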
So anyway, stay tuned.
We have got the particles.
We have got the data.
We have got the CCSEM work done.
David Exline has done that work
for us, Andy's done the work
on the isolation of the
particles, the preparation.
And he'll be assisting
me more than he'd
like with the statistics.
And the project was supported
by NIJ, so thank you very much.
I appreciate their --
[ Applause ]
At the back of this talk --
when you see the slides --
is a series of references
dealing with some
of these issues and covering
the environmental aspects:
types of studies being
done in the environmental
literature that perhaps you
might not be aware of.
Thank you.
