So I'll give you a little of the history of
my journey in research ethics, define what
research misconduct is, and then give you
basically a series of vignettes and my opinion
on where the field is heading.
And actually it's hard to say that scientific
misconduct is a field, but it's become a bit
of a cottage industry of folks doing this
professionally.
I've co-written with a bioethicist at the
NIH before.
Misconduct and ethics issues are all this
guy works on.
When I was a PhD student, somehow, and I still
can't remember exactly how or why, I got
appointed as the chief justice of our graduate
honor system at Virginia Tech. That means
I chaired panels that heard misconduct allegations.
Then I promptly forgot about all that stuff
until, a number of years ago now, one of my
own students, a former student because I asked
him to move on after this, ended up plagiarizing
in a class, and actually plagiarizing some
of my own work.
So between 1993 and, I'm going to say, 2006
or 2007, I was quite insulated.
This is a glass insulator.
I rode my bicycle here, and as I was getting
ready to go out the door, one of my students
said, here's a little present for you, because
I know you like these things, and it came
from New Mexico. I thought, wow, I don't have
one from New Mexico.
So I rode with it in my backpack, and I figured
I might as well display it, because it was
kind of heavy.
But I felt pretty insulated from misconduct,
and when it hit close to home, I started asking
questions.
Am I educating, am I talking about research
ethics and misconduct enough with my own lab
folks?
Because he told me he didn't really understand
why this plagiarism was an issue.
He thought he was complimenting me by copying
my stuff.
And so I did just that: I started teaching
a class, and I'll talk about the class just
a little bit, and I ended up writing a book
to support the class.
And so here we are today.
And I think about this, I think about this
a lot.
Am I okay standing in the right place?
Would it be better if I was over here?
Okay, I'll use my pointer.
So basically, though, the Office of Research
Integrity has three sanctionable offenses:
plagiarism, falsification, and fabrication.
And these are some good definitions of these
terms.
Most everybody has a good idea of what
plagiarism is.
At least you would think so.
Since I've been doing a lot of editorial work,
I recently confronted this again; this is
another thing harkening back to 2006, or
whenever it was, with my student. I got a
paper, and there was some plagiarism: the
authors were copying and pasting from some
of their earlier works.
So it was self-plagiarism.
Once again, it was a group in another country,
some place in Europe I think, and they said,
we didn't know there was anything wrong
with this.
And so once again, the education part.
And plagiarism is probably, and I think the
data support this, the most frequently
reported sanctionable offense.
Then there's falsification.
Falsification, sorry, from the bottom here,
is manipulating research materials
or research subjects... so this can be pretty
broad, right?
It's basically not telling the truth about
a study.
Okay, that's falsification.
Fabrication is probably the worst, right?
Most people think that's the worst: making
up data and saying something happened when
in fact it didn't happen.
Right?
And so, if you want to grade these, you would
say, okay, fabrication is fraudulent; there
would be no doubt.
Falsification, maybe some of this could be
done accidentally.
Maybe.
Okay?
If anyone has any questions, please interrupt,
and we'll stop, get some questions answered,
and then move on.
But I'll make sure to have everyone out by
1, or a few minutes before, okay? So just
stop me.
So when I think about ethics: a lot of people
think ethics is black or white, shades of
grey, or orange and white; you have right
and wrong, and in this case, of course, there
are shades of grey. That's just wrong.
It's just wrong.
Okay.
I figured, since I'm over here and it's
football season: it's just wrong.
But in fact, most of the decisions that
scientists make range along a continuum
between what most people would say are best
practices and things that are not best, or
just wrong.
No, no, that's actually not; in fact, I have
a diagram for that, the self-plagiarism
network, right. So, self-plagiarism: let's
not talk about course work. Well, we can talk
about course work.
If I wrote a paper for a course and then
decided to take that entire paper, because
my professor said this is a great paper, you
should publish this as a review paper, that'd
be perfectly okay.
It's okay because no one has seen your paper.
Now, if your professor says, I want you to
write a paper on X, and you think, huh, I
wrote about X five years ago, that was a
pretty good paper, I got an A, I'll just send
it in here: that's not okay, right?
That would be because of the expectation,
right? And it's all about expectation; I'm
really glad you brought this up, because
people who are reading the literature expect
that what you're doing is, number one, new,
and number two, yours, right?
And so the expectation of the faculty member
is that it is yours and it is new, right?
So it has to do with the reader's perspective.
But, you know, if you send me a note, I'll
send you this thing about the dos and don'ts
of, well, plagiarism, or recycling.
Okay: if it's wrong, it's self-plagiarism;
if it's okay, it's just recycling.
Okay?
So there are these kinds of grey areas that
most people would say are not sanctionable,
okay.
So for those other ones: if you have NIH money
and there is a finding by the university that
you're guilty of, say, falsification, it gets
sent to the NIH Office of Research Integrity.
What can happen?
You can be debarred from getting federal
funds.
You know, maybe your employer will take action.
There are things that happen because of a
finding.
But then there are lots of things that are
in the grey zone, like double publishing of
figures and data.
This is really real to me, because I'm wrapping
up a review paper, and we did use some published
figures as kind of a model and the basis of
some things that we did, right?
And so we're citing these; I think in most
instances we're citing the one, and if we're
copying, if they're really similar, then we're
getting permission, right?
These things are important, but it's easy to
screw up accidentally.
Let me say that, because there are some figures
that I know I've seen before that are doing
something kind of similar, and it's almost
like identifying a bicycle wheel; it becomes
hard to do.
Salami slicing. Anyone know what salami slicing
is, other than with the cased meat? No? Salami
slicing is when you take a study, and some
people think that the sheer number of papers
that come from the study is most important,
so they'll slice that study up as thin as
they can. Instead of publishing one really
nice paper that's useful and gets cited a
lot, they might publish three or four or five
or whatever; they slice the salami too thin.
It's actually not a great strategy, because
a lot of times these skinny little papers
are really never used, read, or cited by
anybody, and now citations are becoming more
and more important as a research metric, not
just sheer number of papers.
Okay? So, selective data reporting: this is
a biggie, because you have a big data set
and you think, ah, some of these data don't
fit, I'm just going to get rid of that stuff
and just report on these data.
Right?
Sometimes it might be a good decision;
sometimes it borders on, goes towards,
falsification, so this could actually be
falsification.
Misappropriating statistics and analyses:
a lot of times the peer reviewers catch that
and it gets fixed; sometimes not.
You know, this is a grey zone.
The best science is exactly this: you set
up the experiment, you say how you're going
to analyze the data you collect, and then
you do what you said you were going to do.
Some people say, but my results are not
statistically significant, so I'll use, you
know, another statistical method. Okay?
Mentorship failures with graduate students;
this probably doesn't apply to anyone here.
But this can happen. Conflicts of interest
or bias: you know, this is another grey zone.
If you're getting funding from a company,
do you declare that? Where do you declare
it, and in what publications?
This sort of thing.
Authorship issues: ghost authorship and guest
authorship.
At least one university has basically declared
that these are a no-no.
So, ghost authorship is somebody writing the
paper and putting your name on it; it's a
situation where you're not really doing
anything, and in that way it's kind of like
guest authorship, okay.
Say I have a Nobel Prize-winning friend, and
I decide, well, people will take my paper
more seriously if I put his name on it.
So do I call him up and say, hey, is this okay?
Or do I just do it?
I have one paper, though not with the Nobel
Prize winner, just one paper, that kind of
magically appeared without my knowing it,
where I was an author, okay?
And I would say that's not the best practice,
and a lot of journals don't want to see that.
Institutional affiliation falsification.
Okay, so say I decide it might be better if
I put, you know, Stanford University as my
affiliation there.
Apparently this kind of thing happens.
Anyway, what's most important in science is
not who's right but what's right.
What gets tangled up in this is that, because
numbers of papers and grants and citations
and all that are important for my career,
maybe I can boost my career by twisting things
a little bit.
Alright?
Not that I would ever do that, of course,
but, I mean, everybody would think about
doing this.
Yeah, right, so I think in my book this might
be under conflict of interest, you know,
because if you're a peer reviewer for a
journal, your job is to review the paper,
not to steal ideas from the paper.
Reject the paper, not that this would ever
happen, reject the paper, steal the ideas,
do the same thing, and publish the same paper.
Yeah, I mean, these things happen.
So the real question here is: is misconduct
getting worse?
Okay.
And I'm guessing that you all might think
my answer is yes, because why ask the question
and give the presentation if it's not getting
worse?
But, you know, it might not be getting worse,
okay?
Yeah, is it occurring more frequently?
Is research misconduct more frequent?
Well, maybe; but if we're hearing about it
more, then maybe it's just being detected
more frequently.
Maybe research misconduct has always happened
and it's just being detected.
Or it's being reported more frequently.
Or maybe we're just hearing about it more
in the science news and the culture.
Okay, and this is certainly the case in
Europe, and I'll tell you why in just a minute.
Or maybe it's all of the above.
I don't know.
I'm going to try to answer these, or at least
give us an idea about some trends, kind of
what's going on, to help us answer the
questions.
Alright?
And most of this will be a series of vignettes.
One of my friends, a professional ethicist,
makes the case that we live in a culture of
cheating, and I think that's why I have the
IRS here, because I figure everyone can
relate to that.
And these are all kind of case studies.
Some of these I cover in the class.
This guy was a cell biologist in Korea, Dr.
Hwang.
He was actually a pretty famous cell biologist,
and it ended up that he, slash his group,
fabricated and falsified some data.
We'll look at her, too.
This is Dr. Goodwin, University of Wisconsin.
Her lab basically turned her in for cheating.
So these are a couple of cases that we study
in the class.
We're not studying A-Rod.
Of course, A-Rod, Alex Rodriguez, received
a ban from baseball because of a steroid,
performance-enhancing-drug finding that he's
appealing now, so he's still hitting home runs.
He's up to 651 and counting, right?
But there's good evidence that he might have
cheated by getting a better body.
This was, ironically, the minister of education
in Germany.
These days in Germany, well, okay, in most
of Europe, if you're a politician or hold
a political office, you have a PhD.
If you have a PhD, you wrote a dissertation,
and your dissertation is probably searchable.
So now, you could imagine if this was the
case in the US; of course, it's almost kind
of funny to think of our politicians as
actually being highly educated.
But if it were the case, you could imagine
that if a Democrat was running for an office,
the Republicans would have hired a plagiarism
seeker, and if your Democratic opponent had
plagiarized, you would have said: plagiarism,
cheater.
And this is what happened with the education
minister in Germany.
I believe they revoked her degree and she
was forced to resign.
This has happened all over Europe with
politicians who plagiarized their dissertations.
Since I'm a country music guy and I write
country music: this is Jason Aldean.
Jason Aldean filed for divorce in April, and
a lot of the reason is that TMZ, the paparazzi,
caught him; that's not his wife.
Anyway.
So this slide shows, number one, that we are
kind of in a culture of cheating; number two,
that cheating can be detected, especially
by the paparazzi, for example, in other
aspects of life. So I'll show you how this
all applies to research misconduct, I hope.
So Dr. Hwang had a paper in Science, yep,
and what he did was he copied, or somebody
in his lab copied and he knew about it
apparently, parts of images he'd shown. He'd
made the big claim that he could clone cell
lines; it's basically a cloning finding.
Able to clone.
Alright, and so the finding he reported was
that he had a lot of clones, okay, when in
reality he might have had one or two, but
he inflated the finding.
He was fired from his position.
I guess he picked up another job in another
Asian country, but he was a rock star
scientist in Korea.
This was a big guy.
There were some other ethical things with
this guy, too: he had women, believe it or
not, rock star that he was, he had women
lined up to donate eggs for his research.
Now, this is an invasive procedure, right?
So this is, like, rock star stuff.
And so he kind of got in trouble on the
bioethics side too.
So one of the things I talk about and teach
in the course is, for example, rules for
changing images.
I tell the students: okay, back in the old
days, we would use Polaroids, take pictures
of our gels, maybe chop off some of the ugly
parts with scissors or whatever, but then
we'd publish basically what we had, because
we couldn't do anything else.
Now, you can do lots of stuff, okay?
And I'll start here, and this is one warning:
beware of over-beautification.
Because now the journal editors, journal
reviewers, and even the scientists themselves
all want pretty data.
They do, they want pretty data, and you can
make them very pretty.
But anyway, so, you know: don't combine images
unless it's clear they were combined.
Manipulation should be done to the entire
image, not just a part.
Of course, Hwang's paper actually cut off
part of an image and used it in some other
place.
I mean, that's kind of going beyond.
But these are things that, unless you really
think about them, you know, maybe you could
do accidentally, right?
Because you want a pretty picture.
But, you know, there should never be an
attempt to deceive.
And so in the class we go over case studies
along with some foundational material, which
basically helps us to think about research
integrity in the age of digital manipulation.
Okay.
That's a really important take home point
for young scientists, old scientists too.
Okay, so now to the meat of it.
Is there more research misconduct happening
now?
Is it being reported more?
Or, you know, what's actually happening?
And so, once again, Gary Comstock basically
got these data from his 
own office of research.
And they estimate that less than one percent
of research misconduct is reported.
Okay, less than one percent.
You can think about your own laboratory
experiences: how and when would you actually
report something?
People don't want to report things, especially
research misconduct, because, as we'll see
with the other vignettes here, maybe your
future can be affected by it.
Alright?
So less than one percent is reported.
The Journal of Cell Biology estimates that
about 20 percent of papers, at least in the
area of cell biology, contain questionable
data. So what does questionable mean?
It could mean falsified, fabricated, or maybe
some of these other things that we talked
about on the previous slide: data that are
manipulated to show something that might not
be exactly how it was in the laboratory.
So is there underreporting?
I mean, there's probably a lot of underreporting.
Is more being reported now?
Maybe.
And here's a case; this is very, very recent.
So this is the title of the slide: If you
don't have the data, just make it up. You're
not going to be able to read this, but
basically this is part of a chemistry paper,
published in one of the American Chemical
Society journals.
And for whatever reason, a note from the boss
to the student says: "Emma, please insert
NMR data here! Where are they?
And for this compound, just make up an
elemental analysis."
Okay.
So it's like: we know it exists, we know it's
there, we have to get this through peer
review, just make it up.
And this is where the boss is telling the
student: don't worry, it's a small detail,
just make it up.
Anyway, this comment somehow ended up in the
draft that was published online.
Oops!
So the worst interpretation of this is: just
make up the data.
It might be that they were telling them to
do the experiment and insert the results.
But I put that one there, just before this
one, because this next slide is a nice study
that was done in 2005.
This was a survey; I've cut off the title.
It's Scientists Behaving Badly, if you want
to look it up; it was published in Nature
in 2005.
It's a short paper, and what they did is they
surveyed postdoctoral associates who were
recipients of NIH postdoc awards.
So these are good scientists, postdocs, right?
They were able to win these NIH postdoc
fellowships.
And they also surveyed scientists that were
a little bit more advanced in their careers;
these, I think, were associate professors.
They're calling them mid-career and
early-career.
Okay.
So they took the mid-career and early-career
groups and asked these questions, and the
questions were: have you done any of these
things in the last three years?
You see the sample size; pretty big sample
size, 3,000 and some.
And so, number one, you can see these numbers
here, the self-reported numbers for falsifying
or cooking research data.
This is 0.3 percent: 3 out of 1,000.
3 out of 1,000 for falsification.
Ignoring major aspects of human-subject
requirements: so that's a bioethical concern,
maybe not one that applies to all research,
but that's, you know, a big deal.
Not properly disclosing involvement in firms
whose products are based on one's own research:
remember, this is NIH, so it's biomedical,
right?
So that could be important.
But then look at some of these.
Using another's ideas without permission or
due credit: now it's 1.4, 1.7 percent.
Unauthorized use of confidential information:
okay, so this is the idea where you're
reviewing a paper in peer review, you're
supposed to not use it, and you say, hey,
great idea, I'll do that.
Failing to present data that contradict one's
own previous research: this is a killer one,
right, because you have to have a certain
amount of humility to do that; or, if you
don't, being right is more important than
what's right at that point.
But look at these numbers: 6 percent, 6.5
percent.
Overlooking others' use of flawed data: so
this is the non-reporting issue, right.
12.5 and 12.2 percent.
The thing that I want you to see here, and
there are some other things, like withholding
details of methodology, and that's important,
and inappropriately assigning authorship
credit, the mentorship and authorship thing,
right; these are pretty big numbers.
What I want you to see is that, for the items
that are statistically different, the
mid-career, the older scientists, are doing
the bad stuff more frequently than the
early-career scientists.
Okay.
20.6 percent, for example, for changing the
design, methodology, or results of a study
in response to pressure from a funding source;
compared to the early-career group, it
doubled, right.
So it's kind of sad to say, but is it: the
older I get, the worse I behave?
For sure, yes.
For this one, maybe so, right.
I would like to think that they'd have taken
my class and would be behaving better, right.
These folks never had my class because they're
too old.
But no, you're exactly right.
For some of these, yes: you could certainly
say there are different pressures or
opportunities for older scientists than there
would be for younger scientists.
Right, and this is probably the most
controversial of these items, because it
could be interpreted in many different ways.
If it's results, most people say, oh, that's
just wrong.
That should have been its own item, right.
But say I submit a grant proposal to the USDA
and ask for half a million dollars, and they
give me a quarter of a million dollars and
say, lop off some of the objectives.
I'd say okay.
Would I answer yes to this item?
Well, maybe.
Maybe, yeah.
So this one is the most controversial.
You did pick a good one to focus on to really
probe whether these data are valid.
But most of the other ones hold up.
It does suggest that scientists get more
savvy about what they can get away with.
I'm going to say that.
The other piece of evidence, the big piece
of evidence, that there is more misconduct
is that there's a big rise in retractions.
So there is a group, I'll give attribution
to them in a few minutes, called Retraction
Watch; a couple of science journalists started
Retraction Watch.
Now, not all papers that are retracted are
retracted because of misconduct.
Some of them can be retracted because of
honest errors.
Okay.
All things being equal, this is still not
a great sign.
For the papers where I've been involved as
an editor that have been retracted, it's been
misconduct, not honest error.
But this is a huge increase, right.
I mean, look: 0.005 percent versus, basically,
a ten-fold increase.
Once again, is it more misconduct, or are
people more aware that there could be problems?
Or is more being reported?
It's hard to say, okay.
I'll talk for just a minute about the Wisconsin
whistleblower case, because, number one, there
aren't many well-known whistleblower cases
where the faculty member is turned in by her
own laboratory.
In this case, she wrote a grant proposal,
and it contained a gel figure from a paper
that was being published, and she basically
changed the caption, made it about another
protein.
She ended up talking with a postdoc in the
lab, which might have been a technician, but
a postdoc, a kind of senior person in the lab.
This person was concerned enough that she
talked with another person in the lab, another
PhD student I think, and from there they had
a lab meeting and said, wow, this is kind
of bothersome.
And then they reasoned it all out, and they
said, well, if we turn her in, she's not going
to like it.
What if we're wrong?
Right?
It's going to be bad for us.
So Mary Allen was the student who basically
said: no, what she did was wrong.
Whether it's going to be good for us or bad
for us, I think I'm going to turn her in.
And this is all kind of how it played out.
And finally they all agreed: okay, if you're
going to go, we're going to go; we're all
in together.
And so they turned her in.
She resigned from the University of Wisconsin,
and when she resigned, apparently the dean
of the department, someone in administration,
said: you did the right thing, don't worry,
we'll take care of you.
Okay, which was kind of temporary, because,
as it is in the university, these people were
now toxic.
They were viewed basically as toxic.
Nobody wanted to mentor these people.
Okay.
So they all left.
They all left.
I think the only one who finished her degree
is Mary Allen, and she finished it someplace
else.
So whistleblowing within the university,
going back to the reporting thing, is very,
very difficult, okay, even with whistleblower
protection.
Since students are basically apprentices,
once the boss is gone, the student is also
gone, pretty much.
Especially when they turn in the boss.
I had this vignette; once again, the Mary
Allen and Goodwin cases are kind of old.
This is a 2013 case; this one kind of blew up.
This guy has retracted 54 papers.
He's a Dutch psychologist, Diederik Stapel.
This came out of the Retraction Watch website.
The thing that I really find interesting about
this one is that he's now gone on to give
a TED talk about what happened to him.
54 papers!
And basically what happened is he just made
up the data.
So this is all data fabrication, right; he
just made it up.
And he says, among other things, that he lost
his moral compass, but now it's back.
So: hire me, I'm a nice guy, kind of thing.
I don't know, I've not seen the talk, but
it will be interesting.
And so, this is kind of the society that we
live in, right.
On my culture-of-cheating slide I used to
have, well, it's football season, but it's
old news: the former Arkansas football coach.
Do you guys know the story there?
He was cheating on his wife; he got in a
motorcycle wreck, and the girl he was cheating
with was on the back.
And then he lied to the athletic director,
and he got fired.
Not so much for cheating, but for lying, and
mainly because the Razorback fans might be
upset.
And, you know, the thing is, he's now got
a job at our next football opponent, Western
Kentucky.
So he's now the Western Kentucky football
coach, so we'll see him, probably sans
girlfriend.
That, I think, has pretty much died down.
So why would there be more cheating today
than 10 years ago, 20 years ago?
Now, I'm not saying that there is more research
misconduct.
I think there might be, and if there is, these
are the reasons I've heard why people who
might be mostly honest, right, and good
scientists, might shade towards doing things
that are not the best practices.
How's that for a politically good way to say
it, right?
One is, of course, that we live in a
publish-or-perish culture in the university.
If I'm going to get tenure, I should have
publications; not only publications, but
publications in the right journals that get
cited a lot.
They've got to be good studies, right? And
here's what your bosses will tell you, and
I know because I tell my own students this:
they have to tell a good story.
You've got to tell a good story.
Well, how do you tell a good story?
Well, one way is to make up the story.
So there's publish or perish, and related
to this is the grant competition.
The grant competition is fierce.
I was just telling the guys: one of my grants
recently got rejected, and they funded 6 out
of 89 proposals or so.
So how do you get the edge?
Well, one way could be what Dr. Goodwin did
at the University of Wisconsin: you tell a
good story, even though maybe the story's
not all there.
There are emerging players in international
science, and this is creating some stress
among scientists.
So, for example, in many Chinese universities
you have to publish your paper in an
international journal with a certain impact
factor, so it has to be, you know, a pretty
good journal, right, or you don't graduate.
We don't have these kinds of institutional
rules here in the US, but they do in a lot
of places in China.
And I think some other countries are saying,
hey, you've got to do this; you're not going
to advance in your career if you don't.
And that creates pressure.
There's also pressure for funding.
One country is trying to build a science
culture, I won't name the country, and they're
putting a lot of money into grants.
And they're funding about three-quarters of
the proposals, right.
So while in the US the USDA and the NSF are
funding 10 to 20 percent, they are funding
like two-thirds of these proposals.
So all you have to do is write something,
and you almost get it funded.
These are things that haven't happened in
some countries before.
So there is this pressure to perform.
And then there's what I term the thin line
of decisions.
Do I use the best practices, or do I fudge
a little bit? It's not a big deal; am I going
to do it?
And then you can fudge a little more.
You can think about any area of your life
where there's a temptation to cheat; this
is how it works, right.
Just think about the IRS, tax returns.
I'm going to wrap up here, and I always like
to talk about this story.
This is Frank Abagnale, who became a master
shyster.
He became a physician; he became an airline
pilot.
He did all these things without any training.
He just did it; he's a very sharp guy, no
doubt.
They made a movie about him, so here he is,
played by Leonardo DiCaprio.
But he said this, on the back of his book:
"We don't teach ethics at home; we don't teach
ethics at school because teachers would be
accused of teaching morality."
So what is ethics?
Ethics is applied morality.
Research ethics is applying some sort of
morality to the practices of doing science.
So, am I teaching morality? Yeah, okay, maybe
so.
You have to look long and hard to even find
a college course on ethics; we do have a
college course on ethics, and there are some
other ones at UT.
So with that, I'll kind of wrap up here and
try to guess at some answers.
Is misconduct getting worse and occurring
more frequently? I think it probably is
occurring more frequently, at least worldwide,
because of pressures to perform, and because
of new players in science who may not have
thought about this.
I mean, even my own student, remember, really
hadn't thought about ethics, and I hadn't
thought enough about research ethics and
misconduct, you know.
So I think understanding the rules of science
and teaching this is important.
Okay.
Is it detected more frequently?
Of course.
Now, when I submit a paper to certain journals,
I know they're going to run it through a
plagiarism checker.
So what do I do?
I run it through a plagiarism checker first,
to make sure none of my students plagiarized
anything.
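Those checkers boil down to comparing overlapping word sequences between documents. Here's a minimal sketch of that idea, my own toy illustration rather than how any particular commercial tool works, using word n-grams and Jaccard similarity:

```python
def ngrams(text, n=5):
    """Set of word n-grams in a text, lowercased."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b, n=5):
    """Jaccard similarity (0.0 to 1.0) of two texts' n-gram sets."""
    ga, gb = ngrams(a, n), ngrams(b, n)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)

# A lightly edited copy still shares most of its 5-word sequences
# with the source, so the score stays high.
original = "the reader expects that the work you publish is both new and your own"
suspect = "the reader expects that the work you publish is both new and entirely your own"
score = similarity(original, suspect)
```

Real tools add stemming, fingerprinting, and a huge reference corpus, but flagging any passage whose score crosses a threshold is the core of it.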
Same thing for image analysis.
It's getting easier to detect images that
have been manipulated.
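Image forensics tools use far more sophisticated methods, but the copy-paste duplication in cases like the Korean cloning figures can be caught with something as simple as hashing small pixel blocks and flagging repeats. A toy sketch, with hypothetical names, on a grayscale image stored as a 2D list:

```python
def duplicated_blocks(image, size=2):
    """Return pairs of positions whose size x size pixel blocks match exactly.

    A cloned (copy-pasted) region shows up as identical block contents
    at two different positions. Real tools also ignore flat background
    regions, which would otherwise match trivially.
    """
    seen = {}    # block contents -> first position where it appeared
    pairs = []
    rows, cols = len(image), len(image[0])
    for r in range(rows - size + 1):
        for c in range(cols - size + 1):
            block = tuple(tuple(image[r + i][c + j] for j in range(size))
                          for i in range(size))
            if block in seen:
                pairs.append((seen[block], (r, c)))
            else:
                seen[block] = (r, c)
    return pairs

# The 2x2 patch [[1, 2], [3, 4]] appears at columns 0 and 3: a "clone".
gel = [
    [1, 2, 9, 1, 2],
    [3, 4, 9, 3, 4],
    [9, 8, 7, 6, 5],
]
hits = duplicated_blocks(gel)  # [((0, 0), (0, 3))]
```

Exact matching is the simplest case; manipulated images that were also rescaled or recompressed need perceptual hashing instead, which is why detection keeps getting better.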
So once again: I've got a lab of like 40
people now, 40 full-time people.
I'm thinking that's too many; I'm not a
micromanager, but I can't keep up with it
all, right.
I try, and so I'm using tools.
At all my lab meetings I talk about research
ethics.
I talk about right and wrong, and I say:
you're going to get cooked if you do it wrong.
So we do image analysis, we do plagiarism
checking.
We have these defensive maneuvers that we
hope influence behavior.
Is it being reported more frequently?
Maybe, maybe not.
It's hard to say.
There are no rewards for reporting bad behavior.
None.
There are no rewards in science.
None.
Is it more apparent in the news culture?
Yeah, I do think people are thinking about
it more.
Certainly these big findings that end up
being false create news; they do create news.
Is it all of the above?
Well, you know, maybe half: three out of four.
But it's a problem we're going to have to
live with I think.
I don't think we're actually going to get
rid of research misconduct, but we can mitigate
and manage it and hopefully educate against
it.
And that's the job of every scientist.
