[Infowarrior] - U.S. Students Know What, But Not Why

Richard Forno rforno at infowarrior.org
Tue Jun 19 14:35:24 CDT 2012


U.S. Students Know What, But Not Why

by Cathy Tran on 19 June 2012, 11:35 AM | 2 Comments

http://news.sciencemag.org/scienceinsider/2012/06/us-students-know-what-but-not-wh.html
 
The first-ever use of interactive computer tasks on a national science assessment suggests that most U.S. students struggle with the reasoning skills needed to investigate multiple variables, make strategic decisions, and explain experimental results.

Paper-and-pencil exams measure how well students can critique and analyze studies. But interactive tasks also require students to design investigations and test assumptions by conducting an experiment, analyzing results, and making tweaks for a new experiment. Those real-world skills were measured for the first time on the science component of the National Assessment of Educational Progress (NAEP) that was given in 2009 to a representative sample of students in grades four, eight, and 12.

"Before this, we've never been able to know if students really could do this or not," says Alan Friedman, a member of National Assessment Governing Board, which sets policy for NAEP. The overall scores on the 2009 science test were released in January 2011, and today's announcement focuses on the results from the portion of the test involving interactive computer tasks.

What the vast majority of students can do, the data show, is make straightforward analyses. More than three-quarters of fourth-grade students, for example, could determine which plants were sun-loving and which preferred the shade when using a simulated greenhouse to determine the ideal amount of sunlight for the growth of mystery plants. When asked about the ideal fertilizer levels for plant growth, however, only one-third of the students were able to perform the required experiment, which featured nine possible fertilizer levels and only six trays. Fewer than half the students were able to use supporting evidence to write an accurate explanation of the results. Similar patterns emerged for students in grades eight and 12.

"We've got our work cut out for us," says Friedman, who is also a consultant in museum development and science communication.

The computer simulations offer NAEP a much better way to measure skills used by real scientists than do multiple-choice questions, says Chris Dede, a professor at Harvard Graduate School of Education. "Scientists don't see the right answer. They see confusing situations and use methods like inquiry to get meaning from complexity. Science is a domain where paper and pencil is a poor match."

The more the test matches the domain, Dede adds, the less problematic teaching to the test becomes. Interactive computer tasks also allow examiners to speed up processes and eliminate safety concerns raised by having students perform actual hands-on tasks.

Computer simulations will continue to evolve at NAEP, which likes to call itself the nation's report card. Friedman says that so-called embedded assessments—which can provide the ability to track when students make a mistake and what they do to correct it—would be "dynamite information" to have. Keystroke data, for instance, have the potential to provide insight about the reasoning skills that students use to solve problems.

"It may give us a way to reward students who don't necessarily jump to the answer right away but show a deliberate process to get to the answer," says Friedman. It could also identify those students who have learned material without really understanding it. "There is no way to memorize for this test,"     says Friedman. "You really have to think on your feet."


---
Just because I'm near the punchbowl doesn't mean I'm also drinking from it.
