Good morning, and welcome to today's live webinar for the release of The Nation's Report Card,
Science 2011 at Grade 8.
I'm Cornelia Orr, Executive Director of the National Assessment Governing Board, and moderator for today's event.
The Governing Board is an independent, bipartisan board that sets policy for the National Assessment of Educational
Progress, also called NAEP or The Nation's Report Card.
As you know, report card releases shed light on very important topics and trends in our education system,
and science is no exception.
Today's event is sure to provide valuable insight as we compare results from the 2009
and 2011 science assessments at Grade 8;
as we explore data from physical science, life science, and earth and space science;
and as we review background questions that provide helpful context about students' academic performance.
Before we begin today, I'll briefly run through the order of our speakers
and then our webinar producer will review the meeting logistics.
Our agenda today includes three speakers.
First we will hear from Jack Buckley, Commissioner of the National Center for Education Statistics,
who will present the NAEP Science 2011 Grade 8 results.
Then Hector Ibarra, a National Assessment Governing Board member and award-winning middle school science teacher,
will discuss creating a learning environment that challenges students
and boosts achievement as well as the benefits of an inquiry-based teaching approach.
His own inquiry-based teaching methods have been recognized by the Milken Family Foundation.
And finally, Jeniffer Harper-Taylor, President of the Siemens Foundation,
will address the need to take a comprehensive approach to advancing STEM education
and touch on popular Siemens initiatives designed to train students and encourage team participation in science studies.
Following Jeniffer's remarks, we will have a brief question-and-answer session with all attendees and speakers.
Before we begin our presentation, Rick, our webinar producer, will address logistics for using the WebEx system.
Rick?
Thank you, Cornelia.
Our speakers will address questions during a Q&A session later in the event,
but attendees are welcome to submit their questions throughout the entire presentation.
Simply type your question into the Q&A window on the lower right side of your WebEx screen,
and submit your question to all panelists.
Please be sure to include your name and organization with all your questions.
If you have technical issues, please refer to your confirmation email or call 866-229-3239.
Thank you.
Back to you, Cornelia.
Thank you, Rick.
Before we begin,
I want to let people know that the Governing Board added NAEP Science 2011 to the usual four-year testing cycle so
that 2011 results could be compared with the performance of U.S.
students on the latest Trends in International Mathematics and Science Study, or TIMSS.
Results of the 2011 TIMSS assessment will be compared with NAEP Science
and Mathematics results in an upcoming study slated for release later this year.
The study will also include results that allow states, and the public, to see
student performance in both subjects in relation to other nations.
I also want to emphasize that this is a voluntary state assessment, and that all 50 states
and the District of Columbia participated in 2011.
This was a first-time occurrence for a voluntary assessment;
previously, in the neighborhood of 40 to 45 states had volunteered to participate.
So I want to, first of all, thank all of the states that did participate.
Now it's my pleasure to introduce our first speaker.
Dr. Jack Buckley is Commissioner of the National Center for Education Statistics,
on leave from his position as a Professor of Applied Statistics at NYU.
He is well known for his research on school choice, particularly related to charter schools
and on statistical methods for public policy.
Jack spent five years in the U.S. Navy as a Surface Warfare Officer and Nuclear Reactor Engineer,
and he also worked in the intelligence community as an Analytic Methodologist.
Jack, thanks for being here.
Thanks very much, Cornelia.
And good morning.
I'm very happy to be here to share with you the results of the NAEP 2011 Science Assessment that we're releasing today
in the Science Report Card.
As Dr. Orr mentioned, in 2011 we assessed science
only at the eighth grade level, with a sample size of approximately 122,000 students.
We are fortunate in that we have results available not only for the nation but also, as Dr. Orr mentioned,
for all 50 states, as well as the District of Columbia
and the Department of Defense school systems,
which actually marks the first time that all 50 states
and the District agreed to participate in a voluntary science assessment.
Also, as Dr. Orr mentioned,
the Governing Board added Grade 8 Science to the assessment schedule in 2011 specifically to create an opportunity to
study the relationship between U.S. results in Science on NAEP and how the U.S.
stacks up to other countries on the Trends in International Mathematics and Science Study, the TIMSS,
and we will be releasing results from that analysis in December.
As always, on this assessment, as in all other NAEP assessments, we report results in two ways, average scale scores,
in this case on a zero to 300 point scale, and as percentages of students attaining at
or above the various achievement levels which are set by the National Assessment Governing Board.
These achievement levels, the basic, proficient, and advanced, tell us what students should know and be able to do.
Also in this report we're able to make comparisons back to 2009.
Unfortunately we can't make direct comparisons back to prior years because of changes to the content framework
and also the achievement cut scores for the 2009 Science Assessment.
The 8th Grade Science Assessment covers science content in three areas, the physical, life and earth
and space sciences, with content distributed approximately 30% each to the physical and life sciences
and 40% to the earth and space sciences.
So looking at the results overall,
on average we see that eighth graders in 2011 increased their science score by two scale score points compared to the
results in 2009.
When we turn not just to the average but look across the percentile distribution,
we can see that students at varying ability levels also increased their scores compared to 2009.
So students at the 10th, 25th, 50th, and 75th percentiles
all showed improvement on average, although for students at the top,
the 90th percentile, we see no statistically significant improvement.
As you'd expect from those scale score results, we see a similar pattern in the percentages of students at
or above achievement levels.
We see an increase from 2009 to 2011 in the percentage of students at or above the basic level,
which is the bottom of those bars, from 63% to 65%, and a statistically significant increase
of two percentage points, from 30% to 32%, in students at or above the proficient level,
although recall that since there was no gain at the 90th percentile, here we see, again,
no change in the 2% of students at the advanced level in both years.
2011 was the first time in NAEP that we were able to disaggregate the race
and ethnicity categories of students in accordance with the directive from OMB, the Office of Management and Budget,
in line with the 2000 census.
What this means in terms of NAEP is actually for the first time in 2011, for our reading and mathematics
and now for science results, we're able to disaggregate Asian students from native Hawaiians
and other Pacific Islanders.
So you can see here the average Asian student scored a 161,
which is similar to the average white student score of 163 on the scale.
Both groups outscored black and Hispanic students, but you can now see the distinction,
or the difference, between Asian students and native Hawaiians or other Pacific Islanders,
which we were not able to report on previously.
We're also, for the first time, able to provide results disaggregated for students of two or more races.
And I should just point out as well, to the left of the bars, the number there, labeled Percentage of Students,
shows us for the eighth grade population that we assess what percentage of students were in each of those categories.
There is good news in general for achievement gaps in science in these results.
What we see here is that the gap in scale score points between white
and black students actually narrowed between 2009 and 2011 for eighth grade science, dropping from 36 points to 35,
and also, importantly, the gap changed,
or closed, with both groups increasing their average scores but black students increasing their scores slightly
more than white students.
A similar pattern, although a more dramatic closing of the gap, held for the white and Hispanic score gap,
which closed from 30 scale score points to 27, again with both groups improving on average.
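To put numbers on the change, the white-black gap narrowed by one scale score point and the white-Hispanic gap by three.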
In the 2011, 8th Grade Science Assessment we see evidence of a persistent gender gap, which we have noticed in prior
assessments between boys and girls, with male students continuing to outscore female students in science on average.
Also something we see across all of our other NAEP assessments,
there continues to be a gap between students of varying income levels.
As always in NAEP we measure the socioeconomic status of the families that students come from using eligibility
for the National School Lunch Program as a proxy, so measuring whether or not they're eligible for free
or reduced price lunch.
And what we see here is that students from families who are eligible for free
or reduced price lunch were outscored by students from families that were not eligible,
the top bar showing the difference in 2009 and the blue bars on the bottom showing the difference in 2011.
There are some slight changes there, but it's really the same pattern that we've seen before
and that we see across all our other assessments.
Similarly, there's a persistent gap between students attending private schools, in this case private including Catholic,
the middle set of bars, or just Catholic schools, and students attending public schools,
which are shown in the top bar, although, of course,
recall that this is not necessarily evidence of better instruction in these schools; there could be issues
of selection in terms of which students attend which type of school.
So when we turn from the national results to the state results, again we see in general a pattern of good news.
Where there were any significant differences looking from 2011 back to 2009, they were improvements.
So we see, in fact, 16 states scored higher in 2011 for eighth grade science than they did in 2009.
And then comparing just within 2011, if we compare states to the national average,
we see that students in 29 states and jurisdictions actually outscored the nation in 2011.
There were a handful of states that were not significantly different.
And then the remainder, of course, scored lower than the national average.
As always, in addition to the assessment items, we collect data on a wide variety of background questions,
additional factors about the students, about their teachers, their background, educational experiences.
This is an example where we look at the average scale score for students depending on how often their teachers reported them
doing hands-on projects in class.
The teachers were reporting on the instruction in this case.
So what you can see here is a pattern where the 2% of students whose teachers reported never
or hardly ever doing hands-on projects in science had the lowest average scale score on NAEP, while at the top,
the 16% whose teachers reported hands-on projects in science instruction every day
or almost every day had the highest average scale score, although, again, I would caution you this is a correlation;
there's not evidence that, for example,
performing more hands-on projects necessarily leads to improved achievement, because there could be other factors
correlated with both the likelihood of doing hands-on projects in science and achievement.
Similarly, one of the background questions that we asked the students was how often, or whether
or not they do science-related activities outside of school, so things that are not directly for schoolwork.
So in this case, the scale was just asking do they agree with that statement,
and you can see that students who more strongly agree with the statement that they do science-related activities
outside of schoolwork on average also score higher on the assessment.
So now we turn to some sample items from the assessment.
This first one is from the Earth and Space Sciences section.
What we showed here is a diagram of a collision of two tectonic plates in Asia,
and we asked students to select the answer that identified the most likely result of the collision.
Seventy-two percent selected the correct response, C,
that the collision caused the Himalayas to increase in height each year.
For an example of a Physical Science item,
here we asked students to identify the atoms and their proportions in a molecule of water.
And we see here that 54% chose the correct answer, again C, that two atoms of hydrogen
and one of oxygen combine to make a molecule of water.
It's interesting to note here that a large number of students, 34%, chose the incorrect response, option B,
possibly because, reading the formula across, H2O looks like one hydrogen and two oxygen.
And finally here's a somewhat more complex example of
what we call an extended constructed-response item, from the Life Science section of the assessment.
So here students were given the results of an experiment involving the immature forms of mosquitoes, the larvae
and the pupae.
And these correspond, of course, to the caterpillar
and chrysalis stages of a butterfly except that the results are not as pretty.
So both the larvae and the pupae live in water at the surface,
and both are capable of diving below the surface to avoid danger.
This extended constructed response question described to students an experiment involving a larvae and a pupae,
each placed in a jar of water 50 centimeters high.
The experimenters then tapped on the glass and measured both the depth to which the larvae and pupae descended
and the time they spent below the surface.
The experiment was repeated five times and the results averaged.
Students taking the assessment were then given a chart summarizing the findings as shown here.
So for the larvae the average depth reached was 22 centimeters,
while the average length of time spent under water was 90 seconds.
For the pupae, the average depth was 38 centimeters and the average time was 120 seconds.
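To put those figures side by side, the pupae's average depth exceeds the larvae's by 38 minus 22, or 16 centimeters, and its average time under water by 120 minus 90, or 30 seconds.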
So students were then given a list of six statements and told to select all the statements that were true.
Statement D, the pupae dives deeper than the larvae, and statement E,
the pupae stays under water longer than the larvae, were true statements.
But not only did we want to see if they could interpret the results from the experiment,
we then actually asked if they could explain why they selected the statements that they chose
and support that with data from the table.
So these answers could be scored as complete, essential, partial, or unsatisfactory or incorrect.
Complete responses selected only statements D and E
and had to refer to the data in the table to correctly explain both selections.
Essential responses selected only the correct statements D and E,
but used data from the table to correctly explain only one of the selections.
Partial responses might have mixed that up: maybe they got D and not E, maybe they could give only little
or no explanation for one of them, or might have included something incorrect from the table.
And, of course, an unsatisfactory or incorrect response did not select either of the correct statements, D or E,
and also could not justify the answer.
Here is an example of a student's response that was graded as complete.
So this student was assessed to have complete understanding of the behavior of mosquito larvae and pupae.
As you can see from their answer, the student stated,
"I think that the pupae dives deeper than the larvae because the pupae dives 38 centimeters and the larvae dives 22."
"I also think the pupae stays under water longer than the larvae because the pupae stays under water 120 seconds
and the larvae stays under water 90 seconds."
So clearly they were demonstrating they were able to read the data from the previous slide.
Another student receiving a complete rating gave this answer, the second one.
The pupae stays under water longer than the larvae does by 40 seconds.
The pupae also dives deeper than the larvae by 16 centimeters.
In looking over the response distribution, we see that 15% of students gave a complete answer,
while 2% gave an essential answer and 32% gave a partial answer.
So the 2011 Science Report Card provides all this information and much more.
In addition, the Initial Release website gives extensive information on the performance of students,
access to released assessment questions through our Question Center, and as always the NAEP Data Explorer,
our online data analysis tool, which lets you look at the background questions and compute analyses on your own.
In conclusion, as always, I'd like to offer my sincere thanks to all the students, teachers
and schools who made this possible by participating in the 2011 Science Assessment.
Thank you.
Thank you, Jack.
Our second speaker, Dr. Hector Ibarra, is nationally lauded for his expertise in science.
Before accepting his current position as a middle school science teacher at the Belin-Blank International Center
for Gifted Education and Talent Development at the University of Iowa, Hector taught General
and Earth Science for 30 years at West Branch Middle School in Iowa.
Among his many awards is the prestigious Milken National Educator Award.
Thank you for your time today, Hector.
Thank you, Cornelia.
As a middle school science teacher with 36 years' experience,
I quickly look beyond the scores when reading the NAEP Science Report Card.
It's good to see an overall increase from 2009 to 2011 among our eighth graders, with the percentages of students
scoring at or above the basic and proficient achievement levels increasing over this two-year period.
But it is important to go beyond the trend results and examine factors that influence improved science achievement.
A large part of this picture is how teachers are teaching and how students are learning.
How do we use the information from the NAEP Science Report?
Please refer to the next four slides.
First, we see how important it is for students to do hands-on activities.
As part of the 2011 NAEP Science Assessment, teachers were asked how often their students did hands-on projects.
The NAEP data show that students whose teachers used hands-on activities in the classroom did better on the NAEP
than students whose teachers didn't use activities.
Students whose teachers answered never or hardly ever had an average scale score of 140,
while those who said they did one or two activities per week had an average scale score that was 14 points higher.
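That works out to an average of roughly 154 for those students, compared with 140.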
Second, being involved in science-related activities outside of the school is often taken for granted.
But it does make a difference.
The report again sheds light on this.
Eighth graders who took the assessment were asked to indicate the extent to which they disagreed
or agreed with the statement, I do science-related activities that are not for schoolwork.
Twenty-five percent of the students answered strongly disagree, and that group's average score was 146.
At the other end of the spectrum, only 4% of the students strongly agreed,
but their average scale score was 162.
And the next slide, third, it is important to have students work together on projects where they discuss ideas
and work as a team to find solutions to the problem.
Students whose teachers reported they worked together on science projects one
or two times per week scored higher than students whose teachers reported their students never worked together.
Four percent of the teachers reported their students never or hardly ever worked together;
those students had an average scale score of 147.
Forty-seven percent of the teachers reported their students worked together one or two times per week;
their students' average scale score was 153.
Again, it is no surprise to me that students who do hands-on activities, work together in science class,
and do science projects outside of class all scored higher on NAEP than students who didn't.
Fourth, you can see on this slide, writing in science is important.
Constructed response questions required students to explain their answers.
Constructed response questions challenge students to use critical thinking skills, scientific inquiry,
and problem-solving skills in developing their answers.
Students organized their thoughts and findings in written form.
This increases their understanding of science and their skills in writing.
Fifteen percent of the students were scored as having complete responses, while 50% were scored as having
unsatisfactory or incorrect responses.
I believe if teachers require students to write more often,
there would be an improvement in student performance on NAEP constructed response questions.
From my perspective as a teacher,
an important aspect of the data is how students are engaged in doing hands-on activities in class.
There are many ways students are exposed to science.
Some of them are more effective than others.
The question to ask is, are we creating a learning environment that truly challenges the students' skills
and boosts achievement?
The National Science Education Standards were developed by the National Research Council in 1996.
Inquiry was a major component of the standards.
There are many forms of inquiry, from guided to open inquiry teaching approaches.
Students are more in charge of their learning as they find solutions to their questions.
However, doing hands-on activities in the science classroom doesn't necessarily mean the activity is inquiry based.
Many activities done in science classrooms tend to be cookbook-type of experiments where students learn to follow
directions and are told what to do.
In my classroom, I started inquiry-based projects in the late 1980s, well before the national standards were developed.
I revised the cookbook experiments to include questions to help guide the students' learning process.
With inquiry-driven activities, students are provided opportunities to expand their learning, to solve problems,
and to use their creativity to arrive at solutions.
An inquiry-based teaching approach dissuades mere memorization
and requires comprehension of what it takes to arrive at possible solutions.
The Series Circuit Experiment slide that you see is an example of a hands-on inquiry activity I use with my sixth
grade class.
This is a common science experiment for studying series circuits.
I revised the experiments to present students with a problem statement, general instructions,
and some of the questions that you see, which can be answered only by experimenting, in order to come to an understanding of
series circuits.
A commitment to inquiry in the classroom represents a profound shift from the comfortable methods some teachers use.
An inquiry teaching approach puts the students, not the teachers, in charge of their learning.
Students are asking questions, and teachers are redirecting the questions with other questions.
Students who win local
and national science competitions are critical thinkers who have learned how to solve problems on their own.
We must ask ourselves, are we ready to fundamentally change what we do and how we teach in classrooms
and how prospective teachers are taught in university teaching programs?
Successful science programs are often a combination of things, from partnering with different entities
and organizations that bring in models
or offer free lab time for students to having students take the lead on experiments
and determine possible solutions to the issues that are being investigated.
However, it's not just about having equipment and resources.
It is more about how we think about teaching and learning.
The most important idea students can learn in class is to be passionate about the process of learning.
Science is such an important part of our everyday life.
Educators would be wise to provide the best environment we can to make science come alive for all students.
Thank you.
Thank you, Hector.
Our final speaker today leads one of the nation's preeminent nonprofit organizations dedicated to science, technology,
engineering and mathematics education.
As President of the Siemens Foundation,
Jeniffer Harper-Taylor has introduced tens of thousands of young people to opportunities in STEM,
and she is active on a number of fronts to close the minority achievement gap in STEM education.
We look forward to your insights, Jeniffer.
Good morning, and thank you, Cornelia.
Engaging students nationwide in science, technology, engineering
and mathematics has been my focus with my work at Siemens for more than a decade,
so I'm honored to have been invited to comment
on the 2011 National Assessment of Educational Progress in Science at Grade 8.
Science, technology, engineering and mathematics, STEM, education is vital to our students' and our nation's future.
The jobs that are growing the fastest here and around the world are by and large in STEM fields.
Over the past decade, the number of STEM job openings grew three times faster than total non-STEM jobs.
Moreover, our ability to meet the global challenges of the Twenty-first Century depends on STEM education.
We need to train the next generation of inventors, doctors and engineers,
and that requires a serious commitment to improving STEM education.
Looking at the results of the latest National Report Card,
I was pleased to see that eighth grade performance in Science has improved over the past two years,
with the average science score two points higher in 2011 than in 2009.
This may not represent a huge leap forward, but it is definitely a step in the right direction.
I was also glad to see the score gaps between white and black students,
and between white and Hispanic students, narrow.
Closing the minority achievement gaps in STEM is hugely important
to our nation's long-term ability to stay competitive.
Although public school students did score two points higher in 2011 than in 2009,
I was disappointed to see no significant change in the gap between private school
and public school student performance over the last two years.
But this only underscores the need to take a comprehensive approach to advancing STEM education, with a national
commitment to flexibility and innovation.
I was also not surprised to see that students who do hands-on projects in class more frequently scored higher than
students who do hands-on projects less frequently.
Engaging students in hands-on science is the cornerstone of our work at the Siemens Foundation.
Our Siemens Science Day program is a public-private partnership that brings hands-on student science learning
and teacher professional development into public schools across the nation,
and its activities are aligned with national teaching standards.
We launched this program in 2005 and have reached over 30,000 students nationwide to date.
I was glad to see about two-thirds of the students work together on science projects at least weekly.
In the real world, science is almost always done in teams, and it's important that we encourage teamwork in science.
Our Siemens We Can Change the World Challenge was designed specifically for students to compete in teams.
This sustainability challenge empowers students at the elementary, middle school
and high school levels to create solutions to today's environmental problems.
Students have come up with some incredibly impactful ideas for this challenge that have been implemented by schools,
communities and organizations nationwide.
Making science relevant to students' real lives and concerns is an important goal for us at the Siemens Foundation.
Our Siemens Competition in Math, Science and Technology also encourages and rewards team participation.
This national competition gives high school students the opportunity to perform original, hands-on research
and present their work to some of the leading scientists and educators across the country.
We believe at the Siemens Foundation that challenge-based learning
is essential to getting young people excited about science.
I was also not surprised to see that students who report doing science-related activities that are not for schoolwork
score higher than their peers.
Students need to engage in science throughout their daily lives, not just in the classroom or as part of homework.
Many of the students who have completed successful projects for the Siemens Competition in Math,
Science and Technology have told us about parents, mentors,
and people in the community who helped make science a part of their everyday lives at a very early age.
We need more parents and community leaders and mentors to encourage their children's interest in science inside
and outside of the home.
I'm pleased to see the positive momentum in eighth grade performance in science,
but we really still have a long way to go.
We need to create a culture that celebrates science at all levels of our society.
We need to encourage students to do more hands-on learning, challenge students to do original research,
and encourage our students to work in teams.
We need to do a better job at making STEM topics relatable to students and their everyday lives.
And we need to do a better job of recognizing and rewarding the educators and students who are successful in STEM,
and make sure that their work does not go unnoticed.
Again, I'd like to thank you for the opportunity to comment today on behalf of the Siemens Foundation,
and now I'll turn it back over to Cornelia.
Thank you so much, Jeniffer.
Now we will respond to attendee questions during a brief question and answer session.
Please submit your questions online, and our facilitator, Amy Buckley,
will direct the questions to the appropriate speaker.
Now I'll turn the session over to Amy.
Thank you so much, Cornelia.
For those of you who have questions about today's Report Card results, please submit them now.
As Rick mentioned, we ask that you direct your questions to all panelists,
and please remember to include your name and organization when typing in your questions.
As a reminder, Dr. Cornelia Orr, Dr. Jack Buckley, Dr. Hector Ibarra,
and Jeniffer Harper-Taylor, whom you heard from today,
as well as Dr. Peggy Carr, Associate Commissioner with NCES,
are all available to answer your questions.
Finally, we will be mindful of everyone's time, so if we are not able to respond to your question during the event,
please know that we will respond via email.
Thank you so much for your interest in today's webinar.
Our first question comes from Mark McCaffrey, a Programs
and Policy Director with the National Center for Science Education.
Mark asks, Do you have a sense of whether and how climate change
and other human impacts on the Earth's environment are covered in eighth-grade science overall?
These topics are often missing or not well covered in existing state standards.
Cornelia, could you address that, please?
I'll be glad to.
This topic is covered at all three grades, four, eight and twelve.
Of course we're talking about grade eight today.
The NAEP Science Framework, which is available on our website, covers these topics in the Earth
and Space Sciences content area.
At grade eight the specific objective describes how human activities have changed the earth's land, oceans
and atmosphere, often in negative ways.
And you can find this content at the other grade levels as well.
Great.
Thank you so much.
Our second question comes from William Bertrand,
a Technology Education Advisor with the Pennsylvania Department of Education's Bureau of Teaching and Learning.
He comments, I understand that there was a technological design component of 10% on the test.
Why is this not being reported on like Physical Sciences, Life Science and Earth and Space Sciences?
Jack, could you address that?
Sure, it's a great question.
So, as I mentioned, we've got three science content areas in the assessment, the Physical, Life, and Earth
and Space Sciences.
But cutting across those content areas are also four science practices.
In this case: identifying science principles, which focuses on students' ability to recognize, recall, define,
relate and represent basic science principles in each of the three content areas; using science principles;
using scientific inquiry, which is more about designing, critiquing
and evaluating scientific investigations and experiments and identifying patterns in data.
And then using technological design, which focuses on the systematic process of applying scientific knowledge
and skills to propose or critique solutions to real-world problems.
And so technological design actually is present across items in all three of the content areas;
about 10% of the assessment was designed to assess technological design understanding.
But we don't report out on the four science practices.
We only report out on the subscale content areas.
However, if you go online and look at the questions tool, the NAEP Questions Tool on our website,
you can actually filter, or sort, publicly released items by science practice,
so you can see examples of items that were designed directly to assess technological design.
Great.
Thank you so much.
Our next question comes from Susan Disch, no affiliation that was provided, and she would like to know,
Is there any data on specific science curricula used in the classrooms of those tested?
Jack, could you respond?
Well, yes, in the sense that all the NAEP assessments include the background questionnaire survey items
that I discussed, which are given to students, teachers and school administrators.
And so a selection of these results, again, is available both in the report, and we showed some here today,
and also on The Nation's Report Card website.
So, for instance,
on the website you can find information about the amount of time students spend working collaboratively on science
activities as well as the amount of time they spend doing hands-on activities in school.
We've got charts and tables that show the percentages and average scores of students in the nation
and the states that participated in these various curricular activities and other contextual factors.
On the other hand, if you meant science curricula more narrowly in terms of specific commercial curricula, then no,
then actually NAEP is not designed to assess or evaluate specific commercial curricula.
Great.
Thank you.
Our next question is from Scott Forman.
He is President of Allegro Productions.
He asks, How can we get more corporations involved with STEM education in their local communities,
either through mentoring and scholarship programs
or the underwriting of curriculum enhancement materials that are provided to teachers free of cost?
Jeniffer, could you start us off with that one regarding how corporations can assist at the local level with STEM?
Absolutely.
I think there are many initiatives that currently exist for local outreach to occur with corporations in communities.
I think first and foremost you need to identify which corporations in your community
have a technological opportunity to either work with students and/or engage them
and show them some of the things that they're working on in their particular corporation.
But I think that it's also important to look for national programs that exist within corporations where they're
looking to do local outreach.
An example from the Siemens Corporation: the Siemens Foundation has a program called Siemens Science Days.
We actually use this as a local initiative nationwide, and our employees,
who have varied backgrounds in the areas of STEM, go out and spend time with students,
but they also teach them lessons that are aligned with national teaching standards for those grade levels.
So it's an enhancement to what the educators are doing in the classroom,
but it's also an opportunity for the kids to have a hands-on learning experience.
There are also many scholarship programs that exist in corporations because they see, of course,
these students as a pipeline to be able to come in
and provide innovative opportunities for the corporations to grow in the future.
And they're also very interested in making sure that educators have an opportunity to learn more
and engage students more in the area of STEM,
so programs like the Siemens STEM Academy exist so that teachers have an opportunity to have some hands-on educational
professional development experiences that will ultimately touch many students in the future.
Thank you so much, Jeniffer.
We actually received several questions from people, so I'm going to consolidate them all, if you'll allow me to,
regarding the Next Generation Science Standards and how they may
or may not be aligned with NAEP, as well as how science will be rolled out alongside the common core
and the timeframe for that.
Sue Schroeder, a volunteer with the Oshkosh School District, stated it as,
I understand that common core standards testing conducted by all states is scheduled for Spring of 2015 for math
and reading but not science.
With the adoption of the Next Generation Science Standards, what is the timeframe and rollout for science,
and what's the connection or alignment with NAEP?
Cornelia, can you start us with that one?
Well, I'll certainly tell you everything I know, which isn't much, so it shouldn't take very long.
But I know that there are science standards currently under development now,
so I know you're accurate in your statement that science will not be rolled out when reading and math are rolled out.
It's too early to know exactly how much like NAEP the assessments will be because we are just beginning to see some of
the proposed assessments and item specifications that are being developed in the consortia.
We do know, however, that they relied heavily on, and reviewed, the NAEP frameworks for reading
and math in developing the English language arts and mathematics common core standards,
and I believe the same is true for science.
So I think our framework serves as a good model for them to use for the development of the science standards
because ours were just most recently revised for the 2009 assessment.
So ours have been recently updated.
And I believe they take us a little bit further into how to better assess science.
I'll just quickly jump in a little bit on the timeline.
I think the key here, of course, is that the common core state standards exist for reading
and math but are not complete for science.
But even after standards for science are developed, should they be adopted by the states,
there's a long lead time to get from those content standards to actually fielding an assessment.
So the Department has been engaged in funding the two Race to the Top Assessment Consortia, which,
only in their second year of work,
are trying to get us to the stage where we can actually field these assessments in 2014-2015.
And I suspect that science won't take the full five years that we're seeing for reading and math,
if only because a lot of the infrastructure for the next generation of assessment in terms of technology,
in terms of the statistical and psychometric models, will already have been worked out after reading and math,
but it should still take several years, even after the successful adoption of these standards, to actually get an
assessment in the field.
So I wouldn't expect it necessarily any time soon.
Thank you, Jack.
Our next question is from Laura Atkins, a NAEP State Coordinator from the State of Tennessee.
She would like to know what percent of participating students responded to background questions about hands-on
projects?
Jack or Peggy, could you address that?
Yes, actually, in terms of the response rate for the background questions on hands-on tasks, it's 100%.
All the students who took the assessment also responded to those items.
Great.
Thank you so much.
Our next question is from Susan White, no affiliation provided.
She notes that in the complete response for the pupae and larvae that Dr. Buckley presented,
the student calculated 120 minus 90 equals 40.
So she asks, Does that mean that you do not take arithmetic skills into account when scoring science?
Well, so the incorrect arithmetic in this case was not used to penalize the student.
The point here was trying to see whether or not the student could use data to draw an accurate conclusion.
That's the science skill under assessment.
The ability to make the final correct calculation is important,
but it's actually irrelevant to the construct of using data to reach a scientific conclusion.
Great.
Thank you very much.
Our next question, and I apologize, I'll do my best with your name, Ayeola Fortune?
She would like to know, what are some best practices for improving the achievement of African-American, Latino
and low-income students in science?
Are there strategies beyond what has already been discussed today,
such as inquiry-based learning, working in teams, etc.?
Hector, could you comment on that from the classroom view,
some of the best practices you know for narrowing achievement gaps, and then to follow up, Jeniffer,
if there's anything that you're seeing in your Siemens program that has proved effective.
Ah, yes, I can.
From my perspective, first, I'll just give you a little bit of my background.
I was born in Mexico.
I didn't learn English until I was in fourth grade.
And one of the biggest things I saw that helped me was that teachers were interested.
And that's one of the biggest things that I tried to do as a teacher: show a big interest in students, all students.
There are many students who are kind of just out there, and lots of people don't speak with them.
And so it's really important to make not only eye contact, but actually, you know, a smile, visit with them,
talk to them, present them with ideas,
talk to their parents about opportunities that could be present if they'd just get more involved.
So, from my perspective, that's the real key is just showing an interest in the child.
This is Jeniffer.
The thing that I've seen as I work with the program for the Siemens Foundation is,
we make a concerted effort to do outreach that is specific to minorities and under-represented groups.
I think sometimes the digital gap provides an opportunity for these individuals to be missed because they aren't
necessarily receiving the information digitally as fast as they could, and the U.S.
mail system is also not always the best way.
We're still dealing with snail mail in some circumstances at these schools.
So it's about reaching out to these educators.
What I've found is that the educator can work with the students,
because most students at this point are very technologically savvy,
and we work with libraries throughout the country as well from an outreach perspective,
to help people learn about programs that exist.
Like programs at Carnegie Mellon University that are specifically geared toward getting minority students engaged
and excited about science.
So there are many programs that are designed to specifically address those needs,
but there are also opportunities throughout the country that have special program components designed to address
minority students that may have an interest and may have the aptitude, or some that may not,
but they address all levels of students that may have an interest in doing more in the areas of STEM.
Thank you so much, both Hector and Jeniffer.
Jeniffer, while I have you, we have a question from Helena Easter,
and she would like to know how she can get her district involved with Siemens
and the wonderful opportunities they are providing for students.
Could you tell everyone how to get involved with what you spoke of earlier?
Absolutely.
We have several programs where educators can receive free professional development resources through our STEM Academy,
Siemens STEM Academy.
All this information is available on our website, and that's Siemens-foundation.org.
So there are paid summer opportunities to go to Discovery Education and spend a week in an immersion program,
working with thought leaders and digital tools.
Educators are awarded that for professional development.
We work with the Department of Energy's Oak Ridge and Pacific Northwest National Laboratories to have teachers go there
and have an immersive hands-on research experience over the summer.
There are many best practice resources that are shared on our STEM Academy site.
But more than that, we have so many programs that are designed for student engagement and challenge-based learning,
from grades K through 12, and all of this information is free.
All of the programs are free.
They provide scholarships in some instances.
And, of course, free professional development for the educator.
So I encourage everyone, please, go to the website.
And there are also links to other programs from other organizations throughout the country.
Because the goal of the Siemens Foundation is to make an impact.
There isn't a competitive scenario with other corporations to try to get their resources
to individuals throughout the country.
Our hope is that we can make a difference and be able to bridge the gap on STEM educational outreach.
Thank you so much.
Our next question comes from Lindsay Lamb with Austin Independent School District, and she wants to know,
will there be a release for the other grades for 2011 NAEP Science?
Peggy, can you answer that?
Well, this particular time, in 2011, we did not include the other grades in the assessment profile.
They will be included in 2015, when we assess science once again, and the comparisons, of course, will be to 2009,
so we don't have those there for you this time around.
Great.
Thank you.
Our next question is from William Bertrand, Technology Education Advisor with Pennsylvania Department of Ed,
and he asks, What is the progress of the Technology and Engineering Literacy NAEP?
Cornelia, can you start us off with that?
Well, I'll begin and then I'll ask Jack to follow along with that.
We are very excited about the Technology and Engineering Literacy assessment.
The Board has completed the development of the assessment framework and specifications
and has added it to the schedule for the 2014 assessment year.
And so we're excited to actually be filling out the STEM component; we already have Math and Science,
and this will add onto that.
So I'll let Jack, or Peggy, respond more about where we are in the development process.
We are excited about the opportunity to assess TEL,
and we are just finishing up some small-group field test opportunities to try out our tasks.
We're getting ready to go into the field in January with a fairly large field test to again further refine our tasks
associated with TEL.
And then in 2014, of course, we'll be ready for our operational probe.
We call it a probe; however, it will be quite limited in terms of participation of students.
We will not have state data,
but it promises to be very rich in terms of what we'll be able to glean, not just from scores in terms of percent
correct or scores on these particular tasks or a scale, but also in terms of process:
we'll be able to look at the process of how students work through those activities.
It's going to be filled with interactive computer tasks as its primary method of assessment,
and I think that's one of the really exciting things,
and why what Peggy just described about having some interactions with small student tryouts
has been very important to us.
I will say that I think this mode of assessment would be exciting to administer at the state level,
and so we are anxiously awaiting the successful implementation of the common core assessments that would be online
in all states, because I think this infrastructure at the state level has been one of the barriers to NAEP
actually being able to provide access to the interactive computer tasks.
While I'm mentioning that, I will just say that one of the advancements in the 2009 framework,
which was used for this assessment in 2011, was the inclusion of interactive computer tasks,
and there will be a report coming up very shortly, next month, about the hands-on task components of NAEP in 2009
and the interactive computer tasks.
So I hope you're all looking forward to that release.
Thank you.
Our next question comes from J.T. Taylor, who is a post-doc researcher in Science
and Special Education at the University of Iowa.
J.T. would like to know how these findings might influence policy on science
and science-related areas as it relates to content standards at the national level.
Cornelia and Jack?
Well, I think that question is sort of similar to the earlier one about content standards and the common assessment work,
so I won't repeat all that was said there,
but I think it is important for schools to know what expectations states
have for student learning in science.
And so having common standards will be a new opportunity to have this conversation.
I'd just add that these content standards that the Governing Board developed for 2009 are really quite rigorous.
So we mentioned not only the different subscales, or content areas,
but also the science practices cutting across them.
And if you look at it across the sample items, you're going to see a wide range of challenging items.
And I think what these results today show us is that even on a very challenging assessment, our eighth graders,
at least, can show gains right there.
So I would say in terms of how this can inform the policy debate on science standards,
that I think it could provide some credible evidence that those science standards should be rigorous
and that we should still expect American students to improve on them.
Thank you very much.
Our next question is from David Farbman.
He would like to know if there is any data available on how much time students spent in science class per week, and,
if so, are these correlated at all to the outcomes?
Jack?
Yes, actually.
So one of our background questionnaire items actually asks teachers, in a typical week,
how much time do they spend on science instruction.
When you break this down by percentage of students,
the (inaudible) response is that 61% of the eighth graders spend between three
and five hours of instruction in science a week.
Actually about 5% in total spend less than that, so very few students spend sort of three or fewer hours.
But 26% spend between five and seven hours, and about 8% spend seven or more hours.
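Those four categories cover essentially all students: roughly 5 plus 61 plus 26 plus 8 comes to 100 percent.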
So that just tells you some idea about how much time, on average, did the eighth graders spend in science.
The second part of your question is actually really interesting in terms of, you know,
how does this correlate with outcomes?
So as you might expect, when you move from spending less than one hour, for example, up to spending between three
and five hours, where most of the students are, we see a statistically significant jump in scale scores,
so on average,
students who spent the least amount of time in science scored about a 146 in the eighth grade assessment.
But students who spent between three and five hours a week actually scored a 155.
What's sort of interesting about that, though, is when you go further,
when students spend even more time than three to five hours in science,
what we see in the assessment is actually that the scores on average decline.
So students who spend seven or more hours a week on science on average score about 147,
which is closer to the -- actually very similar --
statistically identical to the average scale score of students who spent less than one hour.
Now I would say, of course, just as I did when we discussed the other background items,
this is not evidence of causation here.
More science instruction past a certain point is not necessarily causing these students to underperform.
There could be issues of students being assigned to remedial
or additional science, so on average they were maybe already underperforming
and then getting more exposure to science after that.
But it was an interesting finding that we saw in the data here.
Thank you very much.
Our next question is from Eric Young, with the Colorado Department of Education,
who's trying to understand a bit about how teacher professional development was incorporated into the assessment,
and if there were any findings on this particular assessment regarding how teacher professional development might have
factored into results.
No, I mean in terms of -- we did talk to teachers about their preparation, their background,
their teacher training experiences.
We don't have a lot of background items that exclusively look at professional development offered.
So, for example, we can say things about whether
or not they had exposure during their training as a teacher to advanced science courses, so on the Arts
and Sciences content side, but also science education courses, and on average we see teachers who had either of those,
who had preparation in advanced science coursework or science education coursework,
have students with higher average scale scores.
But in terms of actually professional development when they're in service, in this assessment, no,
we don't have a lot of background items there.
I'd like to just add to what Jack has said.
Many states have been in the process of changing their teacher certification requirements.
It used to be possible, and may still be possible in a large number of states,
to get a generic middle school certificate for teaching science, or math, or any subject for that matter,
and many states are revising their procedures to require more specific science teacher preparation.
So you might want to investigate that for your state to see where Colorado, that question was from Colorado, right?
To see where your state is on what it requires of teachers who are teaching middle school science.
Thank you very much.
Our next question is from Sue Schroeder, and she is curious about resources available to teachers,
ideally free resources, but I'm wondering, Hector,
if you could provide her with some information about what you use in the classroom
or what you know of that is available to science teachers as well as volunteers.
Interesting question.
In my case, one of the things I did was go through
many different types of information to find material that I could use and adapt.
But I would imagine that the biggest starting point would be just going to the NSTA resources
and looking over some of the textbooks that they have available, like on Inquiry, or on Best Teaching Practices.
I think that area provides a lot of very good information.
Other resources include some of the information from Corwin Press.
Get their manual that lists what is available.
There is a whole array of topics that they discuss,
aimed at professional development in terms of improving teachers' own teaching practices.
Great.
Thank you.
Jeniffer, is there anything you'd like to add on resources available to teachers and volunteers in the classroom?
I think, again, I would lead you to the Siemens Foundation website, Siemens-foundation.org.
There are a lot of resources and links associated there.
But I think looking at corporate entities and organizations and foundations, if you go to many of those websites,
the ones that have an emphasis on STEM,
there are usually links and/or resources that will lead you to a lot of free materials for professional development for
educators.
Thank you, Jeniffer.
Our next question is from Lance Jones, with AFT.
He says, Given the change in scoring since 2009, is it at all possible to compare these to prior results?
Jack?
The short answer is no, and the longer answer, of course,
is that it's not just the change in scoring that matters here, it's really a new content framework.
All right, so,
the Governing Board spent a lot of time developing a new framework for the 2009 assessment that we used here;
and even though, for example, scale scores in both the prior framework
and the current one are on the 0 to 300 scale,
a direct comparison is meaningless because you're scoring something completely different.
So we would strongly discourage people from making any comparisons back.
There were significant changes made in the assessment framework, and in the framework document that's on the website, you can find on pages 15 and 16 a brief summary of those changes; that will, I think, convince you of why you shouldn't really compare the results of the two assessments.
Thank you very much.
Our next question comes from Anna Kuchment with Scientific American.
She'd like to know if we can comment on the relationship between standards and achievements.
Nebraska, Idaho, and Montana science standards earned Fs from the Thomas B. Fordham Foundation this year,
but they did better than average on this.
Is there any insight or feedback that can be offered to her on how to make sense of that difference?
As you may know, every two years, for Reading and Mathematics, NCES releases a report that tries to map the relative stringency of state standards onto the NAEP scale so that you can compare states against each other.
Our methods are quite different than what Fordham uses.
Again, we look at the data from state assessments and from NAEP and try and do this statistically.
Fordham generally has assembled a panel of judges to try and rate the content of the standards; in the past,
you know, they provided letter grades.
I'm not sure about the sort of -- I can't speak to the quality of how that works,
although I'm sure that Fordham does spend a great deal of time on that.
But we get this question all the time with reading and math, and I'll tell you, it's very difficult to answer.
I mean what you see is that states make changes to their standards for very different reasons,
and some states may have quite rigorous or stringent standards in some subjects because they're maybe, you know,
trying to pull their students ahead or setting a high bar for other reasons, but other states, you know,
have found by whatever chance that their standards may be too high or too low for what they're intending to instruct,
and then they'll actually shift them one way or the other.
And we see certain evidence of that shifting over time.
I should point out another report, though, on this subject that's a bit more technical
and follows more of our methodology.
Change the Equation, the science advocacy group, actually used a similar mapping methodology for the 2009 science results to what we do at NCES.
And they actually are able to use that to rank states against each other in terms of the relative stringency of their proficiency cut scores in science on their state assessments, for those states that have them.
And so I would strongly encourage you to take a look at that in addition to the Fordham report before maybe reaching
conclusions on how relatively stringent or high quality the different state standards are.
Yeah, I wanted to add to what Jack had said, too, and just caution the questioner that just because the standards are high or rigorous doesn't mean that the content of them is actually being implemented with fidelity in the classroom.
And the procedure that Jack is talking about looks at their proficiency expectation standard rather than the
substantive content of the standards and how it's being taught, so this is a complicated issue.
How students achieve will depend on what their starting point was, not just whether they added on new standards, so if the students were already achieving at a high level, they may continue to achieve at a high level.
So I think this is a very complex issue
and I'd just caution the listener to be careful in making these interpretations.
And I would add that for the 2011 data, this data that we're looking at today, we're actually going to add a mapping report for the Grade 8 science data, and so we're going to have a chance to examine not just the mapping of reading and math standards using NAEP as a common index, but also science.
So stay tuned for that report.
We anticipate having that available to the public within the next 12 months.
Great.
Thank you, all.
Our next question is from William Bertrand.
He would like to know if there are plans to develop a STEM NAEP, or whether there will be a way of using the results from Math, Science, and Technology and Engineering to determine how STEM is being taught and understood.
Cornelia?
He's really ahead of us and challenging us, isn't he?
I think it's very interesting.
We do feel good about having TEL, the Technology and Engineering Literacy assessment, added to our complement of assessments in science and math, but I don't think we yet know how successful TEL will be.
You heard Dr. Carr speak earlier about pilot testing going on next year and then a probe in 2014,
so I think we're not going to make promises that we can't keep at this time.
Yeah, I would just say, I don't see where a "STEM NAEP" would be necessary. If we have coverage in mathematics and science, and if we're successful in measuring technology and engineering literacy, I think we will have covered the breadth of topics probably much more deeply than we could if we tried to assemble them all in a single assessment.
Great.
Thank you.
And, unfortunately, we have time for just one more question.
Again, as a reminder, if we were not able to respond during this event, we will do so via email.
And our final question comes from Sarah Pedemonte,
and she'd like to know if teachers described the nature of the hands-on activities that were used.
Jack, could you answer that for us?
Yes, actually, so we asked teachers to describe to what extent their students did hands-on tasks, specifically either using chemicals, so like a chemistry task, using electricity, using living things, so a biology task, for example, like our mosquito example, or perhaps, you know, with frogs or formerly living things, and then also, in the earth and space sciences, whether or not they used hands-on tasks that involved rocks and minerals.
And so you can look online on the NAEP Data Explorer and actually disaggregate by those various tasks.
Oh, so just to give you a quick example, with rocks and minerals:
Forty-three percent of eighth grade teachers said that they used a hands-on task during that year of instruction
involving rocks and minerals compared to 57% who did not.
To use another example, a thermometer or a barometer:
Fifty-eight percent used a hands-on science task for instruction that used a thermometer or a barometer.
Great.
Thank you so much.
That concludes our Q&A.
Thank you, Amy, and thanks to all of you who engaged in our question and answer period today.
We appreciate all of your thought-provoking questions.
Before we end today's webinar, allow me to offer just a few other closing comments.
I would like you to visit the Governing Board site for the speakers' comments and the additional science materials that we prepared for this release.
There are also links to The Nation's Report Card website that you heard Jack and Peggy talk about, where the NAEP data is stored and the released questions are found.
I mentioned earlier the release next month of the assessment and the reporting of the hands-on tasks
and interactive computer tasks.
This is going to be an online release, so I think you'll find that very interesting.
And we'll let you know more about that.
To get really up-to-date information, you can follow us on Facebook and Twitter.
In closing I would like to thank Jack, Hector, and Jeniffer for being with us today and answering your questions
and providing their insightful comments.
And of course we thank all of you who participated.
I hope you have a great, sunny day today.
(end)
