Contact: Mike Russell 617-552-0889; Walt Haney 617-552-4199

LOW-TECH TESTS SHORTCHANGE HIGH-TECH STUDENTS
Study Reveals Regular Computer Use Affects Student Testing

CHESTNUT HILL, MA (7-1-99) -- A new study confirms that writing tests administered via paper and pencil may significantly underestimate the capabilities of computer-savvy students, according to assessment specialists at the Center for the Study of Testing, Evaluation and Educational Policy (CSTEEP) at Boston College.

The study, published in Education Policy Analysis Archives, compares groups of students taking open-ended (non-multiple-choice) tests on paper with groups taking the same tests on computer.

Study results showed that, for students accustomed to writing on computer, responses composed on computer were substantially better than those written by hand. Specifically, for students who keyboard about 20 words per minute or more, taking open-ended language arts tests on paper produces scores that significantly underestimate their level of achievement. Conversely, for slower keyboarders, taking open-ended tests on computer adversely affects their performance.

These results confirmed those of an earlier CSTEEP study, in which the effects were so large that only 30 percent of tech-savvy students performed at a "passing" level when they wrote on paper, while 67 percent "passed" when they wrote on computer (without access to word-processing tools such as spell check or grammar check).

"The size of the effects was substantial," said CSTEEP Research Associate Mike Russell, author of both studies. "For the average student accustomed to working on computer, this difference could easily raise his or her score on the MCAS [Massachusetts Comprehensive Assessment System] test from the 'needs improvement' to the 'proficient' level."

Using items from the MCAS and the National Assessment of Educational Progress (NAEP), the more recent study focused on language arts, science and math tests administered to approximately 230 eighth-grade students with varying levels of computer skill at two Worcester, Mass., schools. Information on these students' prior computer use and keyboarding speed was also collected. The earlier study, published in Education Policy Analysis Archives in 1997, involved approximately 100 students participating in a computer-intensive project at the Advanced Learning Laboratory (ALL School) in Worcester, which also participated in the second study.

The results of both studies suggest that the mode of test administration can have a substantial effect on students' performance on open-ended items, findings the researchers believe have significant implications for two critical areas: the increasing integration of technology into student learning and overall educational assessment efforts.

Nearly 10 million students nationwide take some form of written state test each year. The researchers believe these findings indicate that state paper-and-pencil tests may be underestimating the abilities of two to three million of these students annually. And the gap may be widening, they add: According to a 1998 national survey, 50 percent of K-12 teachers have students use word processors, and 29 percent have students use the World Wide Web. These numbers are expected to grow annually.

And yet, the researchers noted, students, teachers and schools are increasingly held accountable for student learning as gauged by handwritten test results, a pressure that has pushed some school administrations into rather extreme measures. At the ALL School, the earlier study's results prompted administrators to increase the amount of time students spent writing on paper and decrease the time they spent using computers. According to the researchers, that action is comparable to asking modern-day mathematicians to abandon calculators for slide rules so that they can perform better on tests that allow only slide rules.

"It's an understandable, but unfortunate, reaction to some important findings," said Professor Walt Haney of CSTEEP. "But, we need to ask, 'What's more important here, that students use traditional writing methods, or that tests measure their abilities regardless of the method the student prefers for writing?'"

Russell noted that schools have several options for improving the situation. "The most logical solution in the short term is simply to recognize that there is a problem," he said, "and that scores from high-stakes state tests are not necessarily a good measure of a student's ability."

Haney added, "It's important to take other measures into consideration, such as transcripts and portfolio assessments."

A national center for advocacy on standardized testing, the Center for the Study of Testing, Evaluation and Educational Policy (CSTEEP) at Boston College was established in 1980 with the goal of shaping public policy to promote equity in testing and improve the quality and fairness of education. CSTEEP conducts research on comparative international achievement, middle school reform, and the policy implications of national assessments.

MEDIA NOTE: For more information, contact CSTEEP Research Associate Mike Russell at 617-552-0889 or 781-237-9417; or Professor Walt Haney at 617-552-4199. Voice-mail is available, and your call will be returned as soon as possible. In addition, a complete copy of the study can be found at http://epaa.asu.edu/epaa/v7n20/.

###
