A researcher at Rice University urges colleges to measure students’ writing skills to see if they improve in their four years of college. The researcher, James Pomerantz, a professor of psychology and a co-author of the writing skills study, says that college-ranking websites should be able to provide information on how well colleges help students improve their skills, such as writing.
“Colleges and universities seldom perform such before-and-after comparisons to see how much—or whether—students improve over their college years,” said Pomerantz in a statement announcing the study results. “If you scour the web looking for information about how well students progress while pursuing degrees at America’s colleges, you will be hard-pressed to find a single school that provides this information.”
A 7% Improvement
Yet the idea seems plausible. Why shouldn’t families have a sense of how well students progress at a particular college, especially in areas that employers value, such as writing, analytic ability, and quantitative reasoning? I spoke by e-mail with Pomerantz, who found after testing students at Rice that their writing skills had improved by 7%, to learn more.
He told me he wasn’t sure whether 7% was significant; the real usefulness of the tests, he said, will emerge in comparisons across schools. A 7% improvement would be better than a 5% improvement at another school, but not as good as a 9% improvement at still another.
Although Pomerantz mentioned measuring other important skills such as critical thinking and quantitative reasoning, tests to assess those skills haven’t been developed yet.
“We’ve not gotten to quantitative reasoning or critical thinking yet. Our initial goal here was just a proof of concept, that a fundamental skill such as writing ability could be assessed and tracked. It will take some time to develop suitable tests for these other skills.”
I also asked Pomerantz whether it makes sense to test students on the knowledge they acquire in college. You’ve probably seen alarmist headlines like “Americans don’t know the three branches of government” or “Could you pass this eighth-grade test from 1860?”, in which the questions seem to be taken from the GRE.
But Pomerantz doesn’t recommend testing to see if students have acquired general knowledge in their four years in college.
“It’s almost a certainty that college students majoring in, say, electrical engineering know more about that subject when they graduate than when they first enrolled as freshmen. But they may know a lot less about other subjects that they have not studied, where they have become rusty. People often assume that skills must improve over students’ college years because of all they are learning. But as they are learning new things they are forgetting old things. It’s a race between acquisition and loss—it’s like trying to fill a leaky bucket!”