A new report measuring fourth, eighth, and twelfth graders’ abilities to apply their science knowledge in lab experiments finds that most students can select correct conclusions from a scientific investigation, but they struggle when asked to explain the reasoning behind their results.
“That tells us that our science teaching isn’t getting us as far as we need to go,” Chris Dede, a professor at the Harvard Graduate School of Education, told the Associated Press.
The report, Science in Action: Hands-On and Interactive Computer Tasks from the 2009 Science Assessment, contains data from (1) hands-on tasks, and (2) interactive computer tasks (ICTs) that were administered as part of the 2009 National Assessment of Educational Progress (NAEP). These tasks ask students to work in a laboratory setting—either real or simulated—to respond to open-ended scenarios that require a deeper level of planning, analysis, and synthesis. The tasks measure how well students can combine their knowledge with real-world investigative skills.
Hands-on tasks are forty minutes long and require students to use materials and laboratory equipment to perform actual science experiments. ICTs are either twenty or forty minutes long and require students to solve scientific problems in a computer-based environment, often one that simulates a natural or laboratory setting. Both types of tasks measure how well students can predict what might happen in a real-world science situation, conduct an investigation and observe what happens, and explain the observations by interpreting data or drawing conclusions.
Although NAEP tests have included hands-on tasks since 1996, the 2009 administration of the test represents the first time computers were used. Alan Friedman, chairman of the National Assessment Governing Board’s Assessment Development Committee, told the Associated Press that the ICTs “went beyond what had previously been measured, testing how students ran their own experiments in simulated natural or laboratory environments with the ability to go back, adjust variables, and correct their mistakes on a computer.”
Friedman said the computer tests are “dramatically more expensive” to design but added that traditional assessments cannot measure the same skills. “This is a set of skills, which, in the real world, is invaluable and which, before this, we’d never been able to know if students could do this or not,” Friedman said.
At the twelfth-grade level, for example, one ICT required students to investigate the role of phytoplankton—microscopic, plant-like organisms that live near the ocean surface—in the earth’s carbon cycle. Students also had to analyze an authentic set of experimental data relating levels of iron and nutrients to the growth of phytoplankton, and use a resource library to research ocean locations where increased iron levels might affect phytoplankton growth. On this computer assessment, twelfth graders correctly answered 27 percent of questions, on average.
For the non-computer–based, hands-on tasks, twelfth-grade students were asked to determine which of two sites would be the better location for building a new town based on which site might have the better water quality. To complete the task, students test water samples from both sites to determine whether they meet federal standards for various pollutants and then provide a final recommendation based on their results. A video demonstrating how students complete the task is available to the right.
The report broke the task down into several different steps and measured student performance on each. For step one, predict, 64 percent of students explained their preliminary recommendations with valid support based on materials in the kits. In step two, observe, 75 percent of twelfth graders could perform a straightforward investigation to test the water samples and accurately tabulate data, but only 11 percent were able to provide a valid final recommendation by supporting their conclusions with details from the data for step three, explain. In total, twelfth graders answered an average of 40 percent of questions correctly on hands-on tasks.
When asked to complete further steps to extend their inquiry, only 14 percent of students could correctly evaluate water treatment steps needed to remove pollutants that exceed national drinking water standards for step four; only 28 percent could describe the scientific processes used to remove water pollutants for step five, according to the report.
Katherine Carroll, an eleventh- and twelfth-grade chemistry teacher in Waterboro, Maine, told the Associated Press that even her best students struggle to explain their conclusions in their lab reports. “Teachers have moved toward teaching more knowledge, as opposed to the understanding behind that knowledge,” Carroll said.
As shown in the table to the right, females scored higher than their male counterparts on the hands-on tasks, even though males scored higher on the traditional paper-and-pencil science assessment. The report also identifies significant gaps between the scores of white and Asian/Pacific Islander students and their black and Hispanic classmates.
The report also includes a survey asking students about their experiences in science class. It finds that 57 percent of eighth graders had teachers who reported at least a moderate emphasis on developing scientific writing skills. However, only 28 percent of twelfth graders said they wrote a report on a science project at least once a week, and only 51 percent of twelfth-grade students said they designed a science experiment at least once every few weeks. Additionally, only 53 percent of twelfth graders reported that they were currently taking a science course.
Quoting Peggy Carr, associate commissioner at the National Center for Education Statistics, the Associated Press reports that although the tests “raised significant questions about students’ abilities to apply scientific knowledge to the real world,” students did seem to enjoy taking the tests. The article said Carr usually observes students losing interest in the traditional NAEP tests. “Not so with these assessments,” Carr said.
The full report is available at http://nces.ed.gov/nationsreportcard/pdf/main2009/2012468.pdf.