>>SMITH: I figure you guys are like my students
and you like to have things like this.
I don't like to hand those out because as
soon as you do, people start reading ahead
and ruining all the surprises.
You should get a notes version of the PowerPoint
and a back-to-back list of references.
Anytime there's a reference on the slide,
I've given you the actual reference so you
can go.
I'm sure you'll want to go out tonight and
read all of them, find out exactly where I
messed up.
I see a lot of people with computers.
I have no objection to you having a copy of
this PowerPoint if you really, really are
desperate and want it.
My PhD is in experimental psychology.
I taught for 26 years in a small college in
Arkansas and I taught every single student,
every one of our majors the statistics course,
the research course, and the experimental
psych course.
So I'm pretty vested in this topic.
I thought it would be a good topic to talk
about.
The first part of the title comes from where?
>>WORKSHOP PARTICIPANT: Shakespeare.
>>SMITH: Shakespeare.
Good, good.
Hamlet.
I think a lot of our students think that we're
mad when we want to talk about research methods.
But I think it's one of the most important
things that we talk about in the whole course.
I don't really think I'm going to have to
try and convince you of that point, but I'm
going to maybe try to convince you a little
bit and maybe give you some things that you
can do to maybe slip it in a little bit.
A good friend, Bill Buskist, has a quote that
I really like: "I'm sure your students are
just like mine.
They are younger, but they don't come in expecting
to hear science in psychology.
And if they walk out still thinking that,
I think we've not done what we should do."
So not only convince them that psychology
is a science, but try and get them to think
somewhat like scientists.
And one of the interesting things I have found
out in my college teaching career is that
depending on what college you went to and
how it's labeled, the hard sciences, the soft
sciences, the natural sciences, the social
sciences or whatever, the natural sciences
don't really teach a lot of research methods
or statistics.
I taught statistics to a lot of biology students
and nursing students and so on and so forth
because they said they don't teach those kinds
of things.
Some of you I know teach AP, and this is what
the AP psychology course guidelines look like.
Do any of you ever show your students this?
>> WORKSHOP PARTICIPANT: Yes.
>>SMITH: Okay, because they're waiting for
the abnormal and the personality and the therapy,
and just letting them know what some big body
has decided this course ought to be.
The research and the bio, they're usually
not really thrilled about.
And if you actually go and you look at the
psychology course description, study skills
come to mind, and critical thinking skills.
I'm talking about research methods, but I'm
going to focus an awful lot today on critical
thinking because as you're going to see throughout
this, I think it's really hard to pull those
two things apart when it comes to psychology.
And I doubt that you can find very many colleges
anywhere where people don't say we teach
critical thinking.
In 2007, APA put out a document of APA guidelines
for an undergraduate psychology major.
You may or may not have seen this.
Here they are right here.
Yeah?
>> WORKSHOP PARTICIPANT: It's not [unclear]?
>>SMITH: No, let's go and pass them out now.
So this is a 
little bit like the AP psychology course guidelines
but it's aimed at the entire psychology major.
And what you'll see somewhere is this list.
Actually, there are 10 guidelines and these
5 are the ones that are specific to psychology.
There are 5 more (like maybe library skills)
that we're not tasked with doing ourselves,
but we're supposed to help develop along with
other disciplines.
And of course, you see in there research methods
and you see in there critical thinking.
Again, I don't think it's any accident that
both of those are in there.
I don't think there's any accident that they're
side-by-side.
This is just pulled from that booklet.
If you look at Goal 3 which is the critical
thinking in psychology, I see research methods
woven all throughout critical thinking: respect
and use critical and creative thinking, skeptical
inquiry, and, when possible, the scientific
approach to solve problems.
And if you think about it, that's one way
that our discipline thinks critically.
We don't run around willy-nilly, haphazardly
guessing at things.
It doesn't stop people on the Internet from
doing that, but that's not what we're supposed
to be doing.
And if you dig further into that, there are
learning outcomes for each goal: use critical
thinking effectively, engage in creative thinking,
use reasoning to recognize, develop, defend,
and criticize arguments and other persuasive
appeals, approach problems effectively.
It doesn't necessarily say research methods
in there but some of those things are I think
very related.
And then when you look underneath the learning
outcomes, sub-goals; that's the best term
I could come up with for them.
They don't label what those are.
But, again, we're in the critical thinking,
the overall critical thinking goal but look
at the sub-goals that are listed there: evaluate
quality of information.
It doesn't have to be research methods, but
research methods are certainly one good way
to get there.
Evaluate behavioral claims.
How do you find out if what people put out
on the Internet is true?
Here's the latest greatest craze to do this,
that or the other thing.
How do we know?
Challenge claims.
Whenever I see something or hear something
on TV, Internet, the first thing I say, where
are the data?
Show me some data to back up these claims
you're making.
Use scientific principles and evidence.
Evaluate popular media reports.
That's always an interesting task for students
to look at what they said in some popular
outlet, and then go look at the original research.
It's kind of scary how well they do.
And these are the science writers usually
writing these things.
How do you weigh evidence? Because I think
that's one thing we're bombarded with.
People say A is right and other people say
B is right.
And sometimes both can be right, but sometimes
A means not B and B means not A. When they're
both saying we've got data, how do you figure
that out?
Evidence?
I talk sometimes about politics.
Certainly, we're not teaching political science,
but you want a wonderful example of people
making all sorts of claims without any evidence
whatsoever.
There you go with politics.
And it doesn't matter which side you favor.
They're both guilty or they're all guilty.
I guess there's more than two.
Evaluate the quality of solutions.
There are multiple ways to attack a problem.
How do we decide which one is going to be
best?
So again, these are all under critical thinking,
but I think every single one of those fits
very well with research methods.
And I think that's a tool that we have, that
we can give students that will serve them
well not only in our class but serve them
well in their lifetime.
Someday, they're going to be parents and they're
going to be having to weigh these various
claims.
Baby Einstein, there's a zillion of those
out there.
You can go on the Internet and find a whole
handful.
And those are good things to show students
and talk about these kinds of issues with
the data.
Remember the Mozart effect?
That was all the rage until the research came
out and said, well, it doesn't really work.
But that's kind of like when you've been arrested
for pandering and your arrest is on the front
page, and then you get exonerated and that's
buried in the small print somewhere.
So the big splashes get out there, but when
the data come in and don't always support
them, we don't always hear as much about
it.
If you've got questions or comments or whatever,
weigh in, okay?
I've got a concluding slide in here three
different times; it's the same concluding
slide.
I don't know where I'll be when time runs
out.
Part of it may depend upon you guys.
So if you've got questions or something to
say, go ahead.
It's only fair, I think, to let you know where
I'm coming from.
And I already told you this.
I see these as being so intricately linked
that it's almost impossible to separate them.
When I agreed to do this talk, I said, "Yeah,
I'll talk about research methods."
And then I came up with a catchy title, and
I said cool.
That's one thing.
If anyone ever asks you to do a talk, it's
really easy to say yes.
Then at some point, the day of reckoning comes.
So when I started putting this all together,
I just kept coming back over and over and
over again to critical thinking.
I said, why?
Well, I think because of this.
If a student could take in college an entire
course in research methods and come out and
not score high on a critical thinking test,
I'd be really, really disappointed.
I'd be disappointed if they only talked about
it for a week or two (I don't know how much
time you give to research methods, a week
or two weeks even) and came out not being
more critical thinkers.
So back to my original topic, teaching research
methods can help students to think critically.
And again, I think in the course that's a
given.
I've taught these things long enough to know
students didn't sign up to learn about research
methods.
Students didn't want to sign up to learn about
the history of psychology, the biological
basis of behavior.
But if we go back to that AP guideline, two
of those things were in the top 8 to 10 percent.
So we know they're important.
I'm trying to be charitable here - sometimes.
And I don't really mean to pick on students
because I think all of us sometimes avoid
critical thinking.
I've got a slide to talk about that later.
So here's what I hope to give you some ideas
about today: a stealth approach to teaching
research methods and critical thinking.
Presumably, you will have talked about the
research methods.
And then later on when you get to the stuff
they are really taking the course for, you
can come back and talk about research methods,
not necessarily as blatantly and directly
as you did when you were on that chapter or
that unit, sort of sneak it in after the fact.
I was department chair at a college in Georgia
for four years, and one of the things I got
the department to think about was teaching
research methods across the curriculum.
We talked about writing across the curriculum
and things like that.
I said, why not research methods across the
curriculum? One of the things, and it's true
in an intro psych course and true for psychology
majors, is that they're very good at compartmentalizing.
This is that stuff I have to learn so I can
get to this good stuff.
And if you ever ask them, "Why did you have
to learn that stuff over here to get to the
good stuff," I'm not sure they can answer.
And if they don't see the connection, then
I think we've fallen short of where we ought
to be.
So we bring research methods back into the
fun stuff, the good stuff, and let them see
why those research methods are critical to
knowing what we know in those other chapters.
And so that was my point to the faculty in
that department.
When you're teaching an abnormal psych course
or you're teaching a personality theories
course, okay, yeah, you're getting to teach
the stuff that they are really interested
in and they think is fun, but if you don't
let them know where that information came
from, they're not really seeing a connection.
And you can look (I've done this) at a curriculum
survey of colleges and universities in the
United States.
The number one and number two courses that
are required of all psychology majors are
stat and research, far above anything else,
way above anything else.
So what does the literature say about critical
thinking or critical thinkers?
I've got about three slides here of various
traits.
They can accurately explain their decisions
because they've thought them through.
They've had to critically evaluate things.
Therefore, they're able to explain why.
Consider alternative explanations.
How quickly do people jump at the first explanation
that seems to make sense and then they're
blind to any other possibilities?
A lot of science has gone that way, right?
You jump on a bandwagon and go halfway down
the lane before someone says well, maybe we
should look at a different approach.
Your students ever react emotionally to things
you talk about?
As soon as you get emotional, you lose that
reasoning power, that critical thinking power.
I think that's why, at least when I was a kid,
I always heard you never talked about religion,
politics, or sex.
Determining the truth and falsity of assumptions
means you question assumptions in the first
place.
A lot of us (again, I don't want to pick on
the students) have a lot of assumptions that
we just sort of grew up with.
I taught at a religiously affiliated school
for over a quarter century, and I would talk
to them about why do you go to church?
And a lot of people go to church because they
grew up going to church.
Their parents said it's time to get up and
go to church, and they just went.
And then you look at the literature on cults
and those middle class kids, 16, 18, 20, are
prime victims for cults because they've never
really had to evaluate their values system,
their values structure.
They've just grown into these behaviors.
They're just habits.
They're not really values that they have critically
thought about.
And talking to people in a religious school
about critically thinking about your religious
values can get kind of dicey.
I left on my own accord.
They didn't fire me.
[Laughs]
Students and people are really good at this
if you mark out the word reason.
We can argue until the cows come home, but
do we have reason behind our arguments?
This is the big bugaboo.
I'm old enough that when I was in college,
you went to the library and you looked
up books and you looked up journals and you
looked up Psych Abstracts on paper.
Computers existed somewhere in the Pentagon
or somewhere.
But the information that we and that our students
are confronted with is just incredible.
And I don't understand why -- not everybody
has this but some people, oh, it's on the
Internet, it must be true.
There's a lot of stuff out there that is not,
of course.
Opinion, common sense, anecdotes, appeals
to authority, sounds familiar?
Any of your students ever resort to that?
Any of your friends, any of your colleagues
ever resort to that?
No one in this room, of course.
Distinguishing opinions from facts, getting
data from multiple sources: it's really hard
to convince people that you need to look in
more than one place to find out.
I teach research methods to what should be
juniors in college.
Their big task in that course is to do a literature
review.
And they'll find an article and say, "Well,
this is it.
This tells me everything.
It's the end of the line."
I said, "Well, you know, that article was
done in 2006, and I bet there have been articles
since then about that."
But it's very tempting to find one source
that, of course, supports what you believe.
Then your literature search is finished because
they believe the same thing I do.
Preconceptions about important issues.
The next slide is going to be my version of
critical thinking.
I hope before you leave that you'll end up
getting one of these books.
This is Challenging Your Preconceptions: Thinking
Critically about Psychology with some Randolph
Smith as the author.
So this actually comes from my book.
The person who put this together looked at
a bunch of different issues.
But I think if we don't know where we're coming
from, if we don't think about and own up to
where we're coming from, we probably are deluding
ourselves.
We talk a lot in research about the need to
avoid being biased.
That's a very difficult thing to do.
I always tell my students, and I wish I could
find the reference to this (I need to spend
some time and track it down).
But there's a study out there somewhere, and
it's pretty old, that went back and looked
at people's datasets and redid statistics
and things like that and looked for errors
in statistical calculations because everybody
can make an error in statistical calculations.
And what they did when they found these errors
was to categorize them as to whether the error
favored the researcher's hypothesis or went
against the researcher's hypothesis.
Now, if statistical errors are random, half
ought to favor and half ought to disfavor.
Well, what they found out was two-thirds of
the errors favored the researcher's hypothesis.
Now, were they cheating?
Were they being blatantly unethical?
We hope not, but it is very difficult to really
be truly unbiased and know all those preconceptions
that you're holding.
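The logic of that story is a simple binomial argument, and it can be sketched in a few lines of code. The error counts below are purely illustrative assumptions (the talk doesn't give the study's actual numbers): if calculation errors were random, each one should favor or disfavor the hypothesis like a fair coin flip, so a two-thirds split in a reasonably large sample would be unlikely by chance.

```python
from math import comb

def binom_two_sided_p(k, n, p=0.5):
    """Two-sided exact binomial test: probability, under chance level p,
    of an outcome at least as extreme as k successes out of n."""
    probs = [comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1)]
    observed = probs[k]
    return sum(pr for pr in probs if pr <= observed + 1e-12)

# Hypothetical numbers: suppose 60 calculation errors were found, and 40
# of them (two-thirds) happened to favor the researcher's hypothesis.
p_value = binom_two_sided_p(40, 60)
print(f"p = {p_value:.3f}")  # below .05: hard to call that a chance split
```

Under these assumed counts the split is statistically surprising, which is the talk's point: errors that systematically favor the hypothesis suggest bias rather than random slips.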
And then that leads into understanding the
use and abuse of statistical information.
This comes from Buskist and Irons.
Again, you've got these references.
A lot of these references that I'm using,
especially the 2008 ones, come from this book,
which is Teaching Critical Thinking in Psychology:
A Handbook of Best Practices.
There's an annual conference that the Society
for the Teaching of Psychology puts on called
Best Practices.
And one year, it was about Critical Thinking.
There are some really good articles in there.
So notice how many of those traits resonate
with knowledge, skills, aptitudes, and values
gained through research training.
That's one thing we don't talk about a lot,
the values that we're trying to inculcate
when we talk about research.
But if you go back and you look at those,
a lot of those are values, aren't they?
I value my ability to do these things even
sometimes when it's stupid stuff that doesn't
matter one iota like a television commercial.
But when you have your research methods/critical
thinking hat on, you find yourself (or I find
myself) yelling at the TV.
That is stupid.
Come on.
There wasn't really a study that showed that.
It's all these claims that people throw out
there.
>> WORKSHOP PARTICIPANT: Or the morning talk
show host.
>>SMITH: Yes.
Oh gosh, yes.
They are some of the worst.
Your students have a lot of those traits or
values that were on that list?
And again, not picking on students, I don't
think we live in a society that values critical
thinking a whole lot, which is kind of scary
given everything that is going on around us.
But I think these are not necessarily behaviors
and traits and values that come naturally.
I think they're things we have to work at.
I've worked at it long enough that usually
it comes through, but there are times when
I fall prey to something.
If you get this book - or you will, right?
It'll ship overnight from California on Friday.
Apparently, overnight doesn't mean Monday.
They say it might be Tuesday or Wednesday.
But inside here are characteristics of critical
thinkers.
One thing about critical thinking is it's
not always comfortable.
I can go into class and I can talk about this
long theoretical argument between this position
and this position, and sometimes the students
will just be sitting there.
Why aren't you taking notes?
Well, I want to know which one is right (which
one is on the test) but which theory is right?
We don't know which theory is right necessarily.
And sometimes, we come in to a really interesting
situation where two theories are right.
I can come up with one example, color vision.
You've got the trichromatic and the opponent
process, and it certainly makes sense that
one of those is going to be right and one
of those is going to be wrong, but it turns
out they're both right.
Just at different places in the nervous system.
So I think that's an interesting place to
pull in something about research and critical
thinking to your students.
Tell them if you're sitting there looking
and thinking the whole world is black and white,
right and wrong, that's not always going to
work out.
The inherent biases and assumptions, that's
our preconceptions that we're carrying with
us.
When I'm up here in this area of the country,
I talk funny.
In Texas, I talk normal and other people talk
funny.
Skepticism: this is one I think you sometimes
have to worry about a little bit, because
some people have taken this one and turned
it into their mantra.
They don't believe in anything.
They're skeptical about everything, but I
would say aim for a healthy air of skepticism.
Asking people to provide you with evidence
or data I don't think is going overboard.
Some of them may think it is.
Facts from opinions, people argue politics
like it's facts, don't they?
There are some facts in politics.
They get obscured by politicians.
Again, this goes back I think to that black
and white, right or wrong sort of view of
the world.
A lot of people with tunnel vision, don't
bother me with all those other sorts of things.
Teaching people how to make inferences and
have those inferences be logical.
That's a hard job.
I don't think we can do it in one course,
but we can certainly try to model it for them.
At least introduce them to it; surely, they've
seen it before.
But I'm not sure.
And am I picking on students? I'm going to
say I'm not, but that's who we're responsible for.
Do they understand what logic really is?
Is jumping to a conclusion logical?
It depends on the data.
It depends on the evidence.
Examine the evidence out there before you
draw conclusions.
That seems pretty obvious, pretty self-evident,
but it doesn't mean that that's what people
always do.
Now, notice how many of that list also fit
knowledge, skills, aptitudes, and values gained
through research.
Someone asked me to write this critical thinking
book to go along with an intro psych book,
and I wrote what I think is like a research
methods version of critical thinking.
Same conclusion, different list of traits.
And values, I come back to values.
I'm sure you guys are really big into assessment
in your jobs, right?
Everybody has to assess.
When you started talking about assessment,
way back when it first came along, it was
assessment of knowledge.
What do they know?
But now, it's gone beyond what you know, what
can you do?
What do you have the potential to do, and
what do you value?
Because the scary thing about knowledge is
how quickly it can dissipate.
There are studies out there that looked at
retention of material, a few months, semester
or whatever, after students have taken an
intro psych course in college.
And it doesn't take them long to get down
to the level of knowledge of someone off the
street.
I hate to say this, in a sense you can argue
that might be okay because knowledge does
change over time.
Facts change.
I can't give you a specific example, but I
know as long as I've been teaching, I was
teaching things way back when that were facts
then that aren't now.
Do you remember McConnell's planaria study
where he trained the planaria to turn one
way and he ground them up and fed them to
the naïve planaria?
Lo and behold, they learned more quickly.
We're talking about some sort of chemical,
molecular or whatever basis of memory.
And I know that was in one of the intro books
I taught from very early on.
Well, what happened over time?
Nobody could replicate it.
And McConnell's planaria study, I still use
it as an example when I teach research methods
of how knowledge changes over time.
So here's a big idea maybe.
There we go.
And this is what I meant a few minutes ago when
I said I don't think I always critically think.
If you think of state versus trait, personality
theories, anxiety for some people is a trait.
If you're an anxious person, you tend to be
more anxious than normal people all the time;
then, you've got that personality trait of
anxiety.
But things happen to you in your life that
raise your level of anxiety temporarily,
like giving a talk in front of a bunch of
people.
That's a state.
That's a temporary sort of thing.
Although I'd like to think I'm a critical
thinker, I'm going to lapse in and out of
it.
I have to be consciously aware.
I have to try and intentionally do these sorts
of things.
Now, I hope my state is more frequent than
my students'.
But as soon as you start thinking of yourself
as I'm always a critical thinker, something
is going to happen.
Bob Cialdini has a book called Influence.
There is a, quote, "scientific version" of
it and a popular press version of it.
I may have read both of them, I don't remember.
But Cialdini is one of the world's foremost
authorities on influence.
And to write this book, he actually went through
car salesman school or training or whatever.
I hate car salesmen.
It's like dealing with yourself because they
know so many tricks.
But in writing about influence, Cialdini has
done a lot of research on foot-in-the-door
and door-in-the-face and all that sort of
thing.
And he talks about those and then he says
how he's walking down the street and this
cub scout or boy scout comes up to him and
wants to sell him some chocolate bars or tickets
to something else.
He says, "I bought a couple of chocolate bars
for two dollars.
I hate chocolate."
He walks away and he says, "I just fell victim
to this."
So the fact that we can critically think does
not always make us critical thinkers.
And we need to keep our shields or our radars
or whatever up to make sure that we're not
taken in.
But I think if we approach it this way, too,
it gives us a way to work with our students
to say let's try to move the state to be more
frequent rather than less frequent.
This slide came from my good friend Jane Halonen
who's got a chapter in that critical thinking
book.
And she says, "By knocking students cognitively
off-balance, they'll engage in critical thinking
to restore their balance."
It sounds like one of those old balance sorts
of theories.
And she gave an example.
You mentioned the morning show host.
She had her TV on one morning before she was
going to school to teach, and the people were
talking about (I don't know if this is the
proper term for it, I've never heard this
term) sagging, wearing your pants at half
mast or lower than half mast.
>> WORKSHOP PARTICIPANT: Below the mast?
>>SMITH: Yes.
And the newscaster or the person said, "The
lower the pants, the lower the IQ."
Now, if you're like me that sure fits with
my preconceptions.
But she brought that in to class.
And I'm guessing with college students, you've
got some people in there who engage in that
practice and certainly wouldn't react well
to that statement.
Jane has got several books and things on critical
thinking.
Trying to get them to critically think, let's
not react to this emotionally which if you
were a sagger, you probably would.
Or if you are anti-sagging, you're yeah, yeah,
that's right.
But let's think about this.
And could you turn that into a research project?
Certainly, I'm assuming, we wouldn't think
that pulling your pants down has a cause and
effect relationship, would we?
The way that was worded, it almost sounds
causative, doesn't it, as opposed to correlational?
When I teach my students in research methods,
they've come through a statistics course usually
the semester right before and we get to correlation.
I say, "Well, what do you remember about correlation?"
Correlation does not imply causation.
They can spout that until the cows come home,
but can they see it in the real world?
If you can't apply that, you're not really
thinking critically.
You're not really using that research sort
of thing.
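One concrete way to show students that "correlation does not imply causation" is a toy simulation in which a lurking third variable drives two others. This is a hypothetical sketch, not something from the talk: x has no causal effect on y, yet the two end up strongly correlated because both depend on z (think of some background variable driving both the sagging and the IQ scores in that morning-show claim).

```python
import random

def pearson_r(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(1)
# z is the lurking variable; x and y each depend on z plus noise,
# but neither one causes the other.
z = [random.gauss(0, 1) for _ in range(2000)]
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [zi + random.gauss(0, 0.5) for zi in z]
print(f"r(x, y) = {pearson_r(x, y):.2f}")  # strongly positive, zero causation
```

Students who can recite "correlation does not imply causation" can see in the printout what the slogan actually means: a strong correlation appears even though changing x would do nothing at all to y.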
So there are things that come out all the
time in the newspaper, on TV, on the Internet
that you could probably use to knock students
cognitively off balance, and then do some
good role modeling after that.
Don't get angry about this or don't react
in that way.
What could we do with that statement from
a research point of view?
I mentioned several slides ago about a stealth
approach.
I'm going to give you some very specific concrete
examples.
Some you might be familiar with even.
If you want to use this one, there are a lot
of disclaimers in here which I'll try to make
clear.
So if you look up attachment therapy, of course
one of the things you might find is Bowlby's
attachment theory.
Remember how kids are securely attached or
that sort of thing?
This is not that.
This is something called attachment therapy
which is used to treat attachment disorders
typically with foster kids or adopted children
because they may not be attached to the adoptive
parents, the caregivers or whoever is there.
And allegedly, they got suppressed rage and
that prevents them from becoming attached.
The suppressed rage is due to their past maltreatment,
abandonment, and all that sort of thing.
Now, this is a good example and I'm not at
all necessarily picking on these ideas at
this point.
This is a good example of how almost everybody
who develops a theory couches it in terms,
in words, and in thinking that seems very
logical.
I mean this seems very logical that this could
happen.
There's actually a DSM diagnosis coming up
in a few minutes.
It's not like this is totally made up or anything.
How do you deal with these attachment disorders?
Well, the most common approach is called holding
therapy.
The therapist or the parents firmly hold or
lay on the child.
And the idea is to get some of the suppressed
rage to come out and to achieve catharsis
which interestingly enough is an example I'll
give you later about a concept that everybody
knows.
I mean I think a lot of your students know
catharsis before they even come to class.
They may not know the word, but they know
if they scream and yell or break things or
whatever it makes them feel better.
The interesting thing about catharsis is the
data do not actually support catharsis.
So we've got a questionable concept inside,
packaged inside something that at least has
the potential for problems.
So the goal is to reduce the child to this
infantile state and then you can re-parent
the child, and so you're going to rock, cradle,
bottle feed, force the kid to look at you.
Again, the goal is to get the kid to bond
with you if you're the foster parent, the
adoptive parent or whatever.
So that's the context for this and, again,
to promote attachment.
It all sounds very logical.
If you look, you won't find any scientific
validation for this idea.
It's not based on attachment theory.
As I said, it's not Bowlby's ideas.
It's potentially abusive.
It has been labeled as pseudoscientific.
And a case study (actually there are multiple
case studies) this is the one with which I
was familiar at least to some extent.
Candace Newmaker was removed from her home
as a child; her parents had their rights
terminated when she was five.
She was adopted at seven.
And within months, her adoptive mother took
her to see a psychiatrist because of her behavior
and attitude at home.
She wasn't shaping up.
She wasn't showing affection and those sorts
of things.
This is real.
This is in DSM-IV-TR: reactive attachment
disorder.
One of the things about any of these kinds
of things we're talking about, there's usually
grains of truth scattered throughout them.
This is not totally off the wall.
So her mother took her to Colorado when she
was 10 to undergo a two-week intensive attachment
therapy session.
She was referred by a licensed psychologist
in North Carolina.
Okay.
So she's been seen by a psychiatrist and she's
been referred by a licensed psychologist,
presumably not quacks.
Okay, re-birthing is kind of like attachment
therapy.
You got to be careful because there is apparently
a fairly legitimate re-birthing set of ideas.
This one was not.
They wrapped her in a sheet to simulate being
in the womb, and she's told to get out of
this, to be born (the re-birthing), so she
can be reduced to that infantile state and
attach with the mother.
Okay.
However, the two therapists (they call them
adoptive parent surrogates or something) and
the stepmother made this birth a difficult
thing (birth is not an easy task) by preventing
her from getting out of the flannel blanket
very easily.
This was all taped.
Okay.
So people have seen the tapes.
You can tell this is getting into a desperate
situation.
Clearly, this was going on a long period of
time.
She's had her fill of this.
Yeah, it was a big deal on the news at the
time, somewhere around 2000, I think if I
remember right.
Now, what happened?
As I said, there were two therapists.
They both got 16-year prison sentences.
The therapeutic foster parents (that's the
term I couldn't remember) got ten years of
probation and a thousand hours of community
service; the adoptive mother, four years of
probation.
Colorado and North Carolina (the event took
place in Colorado; she came from North Carolina)
passed laws labeled with her name, as we often
do with abused kids and things like that.
And I think it was CSI and Law and Order or
something that ended up having episodes about
this.
Okay.
Now, sorry for the downer, but what are the
lessons?
Okay.
As I said, it's scientifically unvalidated.
How many people are out there offering therapy
that might not seem like this, but remember
those two slides of stuff that seemed kind
of logical about attachment therapy?
It didn't seem like absolute hokum.
Yeah.
>> WORKSHOP PARTICIPANT: What's missing here,
I'm guessing, is there must be some sort of
desperation from the parents that makes it
really hard to divorce yourself from this
[unclear] desperate as a child.
And so I might get it before because as a
parent myself, sometimes it's like, gee.
I mean not all these things.
You know what?
This doesn't seem to hurt and you just do
this.
I mean this is obviously pretty extreme, but
that's an ethical --
>>SMITH: It is a very difficult call.
There's no question about that.
But I think what I would say or what I would
hope I would say is: where are the data that
support this as a scientific approach?
The example I use a lot with my students is
when you go to your family doctor, you don't
expect him or her to say, oh, I got these
new pills over here.
I don't know if they're any good, there's
no data.
You don't expect that stuff.
But people seem a little bit more willing
when we talk about therapy to start getting
away from what I would like to think of as
more mainstream.
And yeah, there were probably no data before
this that said this was going to kill someone.
And they're going to give you testimonials.
Oh yeah, this brought this kid together with
the parents.
And I know the vast majority of people out
there wouldn't know to ask about where are
your clinical double-blinded trials kinds
of things.
But I think more people would know that about
medicine, wouldn't they?
Yeah, critical thinking to me would be apply
the same standards to this that you would
to your medical doctor.
Yeah.
>> WORKSHOP PARTICIPANT: I think it's even
more important, the critical thinking piece,
because often you don't know the correct questions
to ask.
So in critical thinking, we keep saying, "I've
got to ask something" as opposed to, "I don't
know what to ask, so I'm just going to go
with it."
And that's where people get hung up both in
the medical field and in the psychology field
is that I don't know what to ask, so I'm just
going to assume that the experts know what
they're doing.
>>SMITH: Yeah.
One of the things I've seen in a few intro
texts (I don't know how widespread it is) is
they have little boxes or little things at
the end of the chapters.
I've seen some where I want to say it's at
the end of the therapy chapter, it might be
the stress and coping chapter, but what kinds
of questions should you ask a therapist because
people don't have a clue?
I think for a lot of people, going to a therapist
is like going to a witch doctor.
They're going to chant over you and rub some
bones and sacrifice a chicken and maybe you'll
be better.
Yeah.
>> WORKSHOP PARTICIPANT: I think as teachers,
and I'm going to say this honestly, in over
33 years, I've seen this happen.
That is a definite area where we are subjected
to "scientific," but unvalidated research
on how to teach.
And seriously, now they're like --
>>SMITH: Oh yeah.
How many workshops have you sat through with,
you know?
>> WORKSHOP PARTICIPANT: Yeah.
And you know what?
One of the biggest problems that psychologists
have with some of the educational research
is just that.
It's not scientifically validated.
It hasn't been proven.
And one area that every teacher in this room
knows about that psychologists will say is
not validated is learning styles.
>>SMITH: Yup.
>> WORKSHOP PARTICIPANT: No, I know that.
We went with it for a while.
>>SMITH: Oh yeah.
>> WORKSHOP PARTICIPANT: QBD [phonetic], then
learning styles, and now we're on to --
>> WORKSHOP PARTICIPANT: And all of the material
that they hand out at these, they all have
the byline and the date next to it.
So many of us critical thinkers look at it
and say, "Oh, it's got a record so it must
be right."
But it's the authored [sounds like] program.
>>SMITH: Right.
>> WORKSHOP PARTICIPANT: And that's how you
need to be --
>> WORKSHOP PARTICIPANT: You really need to
be really careful about that.
>>SMITH: I don't know how you do your in-service
workshops and stuff like that.
But there are people who make very good livings
going around the country presenting stuff
like that.
Well, I'm not going to say they don't care
whether it's validated or not.
>> WORKSHOP PARTICIPANT: They don't care,
it's just -- then the point.
>>SMITH: Yeah.
They believe in it, therefore, it must be
true.
It's that preconception sort of thing.
Try convincing your school counselor or your
principal or whatever that learning styles,
the data don't really support it.
>> WORKSHOP PARTICIPANT: But it's also a good
lesson for us, the difference between qualitative
and quantitative data because you know, psychologists
tend to like quantitative data more.
We have different kinds of research that we
do have to -- research in education is very
different from research in psychology, and
it's not that it's all bad.
There are good things that people have learned
about relative to learning styles and teaching
styles, so it made them better teachers.
It's not all bad, but it's something that
we need to be aware of.
>>SMITH: Yeah, there's certainly a lot to
be said for not standing up and just talking
for 50 solid minutes.
So it's scientifically unvalidated, there's
no supporting research.
And this is the key.
This is the key.
The fact that there are no data does not lessen
people's belief in these things.
It's like arguing with that door.
That's a big problem for us.
And it's personal experience trumps scientific
evidence, which I'm going to get to maybe if
I have time, and an absence of critical thinking.
Again, remember go back to those first two
slides, it sounds very logical.
That's one of the problems.
When things sound really stupid, it's easy
to critically think and to dismiss them.
But when things seem logical and, as you
said, if a parent is desperate -- I remember
those desperate nights and days, you know,
yeah.
Where are the proofs?
What are the supporting data?
I don't think those are ever bad questions
to ask.
I think those are bad questions to go unasked.
If you want to volunteer for a clinical trial
in medicine, you get informed consent.
You get everything out the wazoo because they
can't give you the proof.
They can't give you the supporting data.
They've got some hints, some indications from
the lab or whatever, and this works with the
animals or whatever.
But in medicine, you're fully informed when
you're not getting something they know will
work.
Okay.
This is what I was referring to: Smith and
Vasquez, out of the critical thinking book,
talk about the persuasive power of personal
experience.
And you see this all the time in teaching
intro psych, don't you?
Oh, well, my grandmother or my -- yeah that,
all these examples of things and they don't
fit exactly.
And when you tell students we're talking about
general principles, they kind of get disappointed
because, oh this isn't going to work 100 percent
of the time in all people under all circumstances
every time.
No, it's not.
But you know, one of the things we're starting
to find out, too, is some of those things
we thought were absolute laws in natural science
aren't always absolute laws.
I mean, look every time, well, I guess they
don't send up the Space Shuttle anymore.
But every time they sent up the Space Shuttle,
they sent up experiments; some fifth or sixth
graders sent up crickets and spider webs to
see how they would do in weightlessness and
things like that.
There are conditions under which natural laws
don't always hold.
So why is the burden of proof on us really
higher than it should be for other disciplines?
Remember Halonen's big idea, critical thinking
could be a state, not a trait.
So when you lapse into your personal experience,
you're out of the state probably of critical
thinking.
And students are pretty good at learning lessons
when we make them really concrete and really
applied.
One of the examples that Smith and Vasquez
talked about is when you teach statistics
and you go over a new statistical idea or
formula or something and you give students
homework, they go home and they reliably,
religiously apply that formula to those problems
and they work them pretty well.
Oh, I understand this stuff.
And then you give them a test where all this
stuff is mixed together and it's not clear
that, oh, these problems went with day 14.
All of a sudden, it's like I don't have a
clue of what's going on anymore.
So sometimes, we learn things only in a given
context.
And maybe I critically think about what my
doctor tells me, or maybe I critically think
about this or that or the other, but I don't
do it in other contexts.
Now, in terms of thinking about critical thinking
as a state, it needs to be a state.
I don't think you want to critically think
about every decision you make in your life.
I do when I go to the store and I see the
generic here for a dollar less; I pick it
up and I compare the ingredients to the named
brand, I see they're the same.
I put the name brand back.
I don't know if that qualifies as critical
thinking, but I'm not going to sit there and
debate which of all the different toothpastes
I'm going to choose, or do that every time
I buy a bath soap, et cetera, et cetera.
So some context is okay.
When I go buy a car, I probably better use
some critical thinking.
I don't want to be like the 16-year-old girl.
Oh that car goes great with my eyes.
Sorry for offending any 16-year-old girls
in the audience.
Carey and Smith wrote about a common sense
epistemology.
And the idea that they were trying to get
across here was that this common sense epistemology
arises from our sensory experiences.
So knowledge is simply a collection of many
true beliefs.
The problem with that is that the beliefs
are then not organized into any sort of framework
that we might think of as theories or collections
of ideas or things such as that.
And therefore, their personal belief obviously
is part of their common sense epistemology
and there's no need to check or evaluate their
personal beliefs.
That's one of the things we run into in teaching
psych.
We're not dealing with a lot of abstract concepts.
I mean some of them certainly are, but a
lot of the things we talk about are things that
students do have personal experience with.
And if all we do is teach students to get
the right answer on tests, and then revert
back to their common sense beliefs, we haven't
really changed them.
But it's hard to undo this.
It's really hard to undo it.
If we were teaching nuclear physics, on the
other hand, not many of our students would
walk in knowing a lot about nuclear physics.
Our students know a lot about behavior.
Unfortunately, a lot of what they know is
wrong or it's based on an anecdote or my grandmother
and things like that.
Scott Lilienfeld has done a lot of writing.
He doesn't really always label it critical
thinking, but he does a lot of pseudoscience
sort of things.
He is a clinical psychologist and that's going
to be relevant in just a minute.
He talks about naïve realism which is the
erroneous belief the world is exactly as we
see it.
Again, my personal experience, my personal
journey, trumps everything else.
So sometimes we tell students things and they
just don't believe us because it doesn't match
their experience.
So they may be pretty big on this seeing is
believing or what you see is what you get.
And Lilienfeld and his colleagues applied
this to psychotherapy which gets a little
bit back to that notion of attachment therapy.
And this is not necessarily our students,
but he teaches at Emory or Georgia Tech, I
can't remember, and he's training clinical
graduate students.
There's a bias that therapy is going to work,
therapy is going to be helpful or at least
it's going to be harmless.
What's the worst you can do, not get any benefit?
Well, as we saw with Candace, that's not the
worst you can do.
So if psychotherapy works is part of your
preconception, your mindset, why do we need
to bother with research?
Can't we just see people getting better and
validate it that way?
If we even think of the need to validate.
And so he very specifically points this out
about trainees, and he's talking about clinical
trainees.
And that's the notion of I see them getting
better.
And you go through your errors in thinking,
we're talking confirmation bias out the wazoo.
They gave a really good list of 10 reasons
why ineffective psychotherapies might seem
to work.
Initial misdiagnosis, maybe someone was experiencing
just an episode or an episodic version of
this but was diagnosed as chronic.
You think of depression that might be linked
to one specific event as opposed to being
clinically depressed all the time.
There's a big difference there.
The person who is going through an episode
is certainly going to show improvement pretty
quickly.
But if you thought it was not episodic but
chronic, something that's ineffective could
actually seem to work.
Spontaneous remission, we know that people
get better on their own.
Any good clinical study will have a waitlisted
control group.
So there's a group of people who say I need
therapy.
And they say we're full right now.
We can't see you for six weeks because we
need to see what happens to people who don't
receive therapy for six weeks.
And your students will often say, well, that's
cruel.
That's mean.
That's withholding treatment.
In clinical trials, aren't there placebo groups?
There has to be a control group or else we're
not doing good research.
And the assumption that we're being cruel
means we're assuming the therapy is going
to work, which is exactly the question we
should be asking: is the therapy going to work?
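The logic of the waitlist control can be sketched with a small, hypothetical Python simulation (all of the numbers here are made up for illustration; none come from the talk). Everyone improves somewhat on their own, and the "therapy" adds nothing, yet the treated group's before/after comparison looks impressive; only the waitlist comparison reveals that:

```python
import random

random.seed(1)

# Hypothetical sketch: why a waitlist control group matters.
# Everyone improves a bit on their own (spontaneous remission);
# the "therapy" in this simulation adds nothing at all.
N = 500


def six_weeks_later(score):
    # Natural improvement plus noise, identical for both groups.
    return score - 8 + random.gauss(0, 5)


intake = [random.gauss(70, 8) for _ in range(2 * N)]
therapy_before, waitlist_before = intake[:N], intake[N:]
therapy_after = [six_weeks_later(s) for s in therapy_before]
waitlist_after = [six_weeks_later(s) for s in waitlist_before]


def mean(xs):
    return sum(xs) / len(xs)


print(f"therapy group:  {mean(therapy_before):.1f} -> {mean(therapy_after):.1f}")
print(f"waitlist group: {mean(waitlist_before):.1f} -> {mean(waitlist_after):.1f}")
```

Both groups drop by roughly the same amount: the improvement in the therapy group is real, but the waitlist comparison shows the therapy itself contributed none of it.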
It becomes this cycle.
Regression to the mean, nice statistical
term, right?
When do people call?
When do people say I need therapy?
When they're at the very worst?
What's going to happen?
They're going to get better just because it
can't get worse, which might not be the therapy,
of course.
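Regression to the mean is easy to demonstrate with a small, hypothetical Python simulation (the distributions are invented for illustration): each person's observed distress is a stable trait level plus day-to-day noise, and the people who "call" at their very worst score noticeably lower on re-measurement with no treatment at all:

```python
import random

random.seed(0)

# Hypothetical illustration of regression to the mean: observed
# distress = stable trait level + day-to-day noise.  No one in
# this simulation receives any treatment.
N = 10_000
trait = [random.gauss(50, 10) for _ in range(N)]
day1 = [t + random.gauss(0, 10) for t in trait]
day2 = [t + random.gauss(0, 10) for t in trait]

# The "callers": people measured at their very worst on day 1
# (top 10% of distress scores).
cutoff = sorted(day1)[int(0.9 * N)]
callers = [i for i in range(N) if day1[i] >= cutoff]

mean_before = sum(day1[i] for i in callers) / len(callers)
mean_after = sum(day2[i] for i in callers) / len(callers)

print(f"distress at intake:    {mean_before:.1f}")
print(f"distress at follow-up: {mean_after:.1f}")
```

The follow-up mean sits well below the intake mean purely because the group was selected on an extreme noisy measurement, which is exactly why an ineffective therapy delivered at the point of crisis can appear to work.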
Multiple treatment interference, this is a
classic.
If we're doing research, we manipulate only
one variable at a time, right?
So if the person is seeing you for therapy
and they're taking vitamins and holistic this
and that and the other, who knows what actually
works?
Selective attrition, people do drop out.
The people who tend to drop out more according
to the clinical literature are the more impaired
people.
So in other words, the better-off people stay
and are therefore more likely to get better
anyway.
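Selective attrition can be sketched the same way (a hypothetical simulation; the severity distribution and the dropout rule are assumptions, not data): nobody's condition changes, but because the more impaired clients are more likely to drop out, the average among "completers" looks better than the intake average:

```python
import random

random.seed(2)

# Hypothetical sketch of selective attrition: no one actually
# changes, but the more impaired clients are more likely to drop
# out, so the completer sample looks healthier than the intake sample.
N = 1000
severity = [random.gauss(60, 12) for _ in range(N)]


def drops_out(s):
    # Assumed relationship: dropout probability rises with severity,
    # clamped between 5% and 90%.
    return random.random() < min(0.9, max(0.05, (s - 40) / 60))


completers = [s for s in severity if not drops_out(s)]


def mean(xs):
    return sum(xs) / len(xs)


print(f"intake mean severity:    {mean(severity):.1f}")
print(f"completer mean severity: {mean(completers):.1f}")
```

The completer mean is lower than the intake mean even though every individual's severity is unchanged, so judging a therapy only by the people who finish it builds in an apparent improvement.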
Placebo effect, this was something I didn't
know.
I mean I know about placebo effect, but according
to Lilienfeld and colleagues, 40 to 60 percent
of people report feeling better between the
phone call and the first session, just the
fact that they're going to go for therapy
starts some sort of process going.
Obviously, that's a placebo.
Novelty effect, something is going on, it's
new.
It's different.
I'm feeling better.
Demand characteristics just like in experimental
research, clients will report what they think
the therapist wants to hear especially if
you feel a bond, a connection.
Effort justification, kind of like cognitive
dissonance: I'm sitting here all the time;
I'm taking off work; I'm doing all these things
for therapy, so it must be right, plus I'm
feeling better.
How typical [sounds like] of me to think [unclear].
Retrospective rewriting: there's no real change,
but their version of how bad they were when
they first saw you makes it look like they've
changed.
This is another example, liking themselves
depends when you talk about therapy, how do
we know therapy works because I should hope
students [unclear].
And here are some reasons why [unclear].
So as I pointed out again and again, what
we're seeing is research kinds of problems
more than anything else.
And perhaps not [unclear] because a great
therapist, you become great in what you're
doing helps.
Some mixed news here: this is a study where
they looked at undergraduates majoring in
social science and natural science and the
humanities.
They gave them a test at the first and the
last semesters of college so [unclear].
They measured them on some problems, statistical
and methodological with some involvement of
conditional logic.
And the good news: for statistical and methodological
reasoning, social science and psychology
majors showed very large percentage gains
compared to natural science and humanities
students.
Not surprisingly, what they found was there
was a high correlation in the [unclear] scores
with the numbers [unclear], and not surprisingly
the MD students might not know exactly what
I mentioned earlier, natural science.
[Unclear]
Not so good news on conditional reasoning
(and this barely shows) it's actually a negative.
But what they found here was the correlation
was with the number of math courses, math
and computer science.
When they looked at statistics and computer
science, it came down essentially to math.
So math was the [unclear] doing the work.
So you know, I say all the time in my department
now that those students, I hate to see them
in my office as department chair, [unclear]
change their major to science.
Other students are going to say, I really
don't want math and science but I want to
be a science major.
[Unclear]
>> WORKSHOP PARTICIPANT: As a result of that
in your department, have they required some
math and science as a result of that?
>>SMITH: We, in our bachelor's science degree
which is [unclear] semesters, they have to
take two semesters of sciences and they have
to take more math than the university requires.
Our SAT [sounds like] course, student [unclear]
councils and university math course, we don't
allow them to count that.
They have to take two math courses plus [unclear].
>> WORKSHOP PARTICIPANT: What about the [cross-talking]?
>>SMITH: We're trying to work on this problem.
We will not have math.
The BA is --
>> WORKSHOP PARTICIPANT: It's available in
math.
>>SMITH: With one science, but they still
have those two math courses because they can
take our SAT course.
They still have to take our SAT course.
They have to have college algebra [unclear].
So even the non-psych majors, they have the
course [unclear].
Now, the interesting thing was the claim
linking statistical and methodological
reasoning to the ability to think critically.
One is this stress about the logics of -- single
logic [unclear].
This is my first concluding slide.
I mean I'd better conclude.
I think this one is very self-evident, but
you want to avoid poor thinking [unclear].
Critical thinking is absolutely necessary.
It's not just absolutely necessary [unclear].
Again, one of my things I think psych majors
should have.
It helps a lot whether you're on grad school
science or not, with their MBA [unclear].
Research method is a concrete way I think
to [unclear].
Critical thinking is a tough thing to teach.
If you had to sit in front of me and say, I'm
going to teach you how to think critically,
where would you start?
So concrete research methods concepts are
not things that students typically stumble
over and say, oh, I don't understand that.
So I can't wrap my head around that sort of
thing.
They don't like it.
They don't think the material is all that
hard.
When you explain [unclear] of how we don't
know if men and women are different because
of heredity or environment, that's not why
science is understanding it, but that's a
pretty -- they don't understand that and then
think they got it in bigger terms than in
other contexts.
I think it's pretty important.
So I think if [unclear] and they don't like
it, I'm shortchanging you [unclear].
We're pretty much out of time.
Craig, [unclear]?
>>MALE: We do have to rest.
>>SMITH: To help you with your methodological
things [unclear].
