Statistics don’t tell the story
CSAP results overemphasized in public's eye, administrators say
December 12, 2007
Craig — On a snowy December morning, Moffat County School District Superintendent Pete Bergmann and Assistant Superintendent Joel Sheridan bend over a desk, scribbling notes and numbers on a flow chart.
They’re trying to explain how the state interpreted the results of a standardized test – not an easy task, they say.
Colorado Student Assessment Program results are used in multiple school accountability programs, and school district administrators say they are useful tools in gauging the district’s progress. Still, they worry the standardized test results are overshadowing more comprehensive assessments.
They also are concerned the statistics aren’t telling the whole story.
The CSAP tests are the “common denominator” between several school accountability measures, including accreditation and Adequate Yearly Progress measurements, said Colorado Department of Education spokesman Mark Stevens.
The results are used to draw conclusions about a school’s performance and progress.
Sometimes those results support conflicting conclusions, he said.
The state may conclude a school is performing well on standardized assessments even though it failed to achieve AYP, a measurement used to determine whether a school is on track to meet mandates issued by the No Child Left Behind Act.
CSAP testing was designed to provide a snapshot, not a detailed portrait, of a school’s academic progress.
Yet, that snapshot receives more attention than the more detailed assessments the district and its schools conduct, Sheridan said.
The district annually compiles a body of evidence, including other standardized test scores and teachers’ anecdotal records, to assess the performance of the district and individual students.
CSAP scores are included in this evidence, but they are only a fraction of the data the school uses to make educational decisions. And while teachers take personal pride in seeing their students perform well on the tests, CSAP isn’t their main focus, Sheridan said.
Still, the test is a “high-profile indicator (that) dominates the perception of how schools are performing,” Bergmann said.
The state publicly releases CSAP results in the annual School Accountability Report Card, where it reports schools’ academic performance and growth.
The district focuses on producing academic growth in its student population, believing higher test scores will follow, Bergmann said.
Yet, the state’s conclusions on academic growth and performance can be misleading if the public doesn’t know how the state interprets the test results, he added.
To determine if a school is in academic growth or decline, the state compares students’ current test scores to their scores last year. Students who showed projected academic growth are then factored out of the equation.
If more of the remaining students scored better on the test than the previous year, the state concludes the school experienced academic growth. Conversely, if more students performed worse on the test, the school is deemed to be on the decline.
According to this formula, the academic growth or decline of a school could rely on a few students, Bergmann said.
The state’s assessment of academic growth and decline doesn’t take into account a grade level’s average test score. A grade might be considered in academic decline if more students performed worse on the test than the year before, even if that grade’s average test score increased.
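The formula Bergmann describes can be sketched roughly as follows. This is an illustrative assumption, not the state’s actual method: the sample data, field names and tie-handling are all hypothetical.

```python
# Hypothetical sketch of the growth/decline formula described above.
# Student records, field names, and the "met projected growth" flag
# are illustrative assumptions, not the state's actual data or method.

def classify_growth(students):
    """Label a school 'growth' or 'decline' per the described formula."""
    # Students who showed projected academic growth are factored out.
    remaining = [s for s in students if not s["met_projected_growth"]]

    # Compare each remaining student's score to last year's score.
    improved = sum(1 for s in remaining if s["current"] > s["previous"])
    declined = sum(1 for s in remaining if s["current"] < s["previous"])

    return "growth" if improved > declined else "decline"

# With only a handful of students left after the exclusion, one or two
# results can tip the label, which is the concern Bergmann raises.
students = [
    {"previous": 500, "current": 480, "met_projected_growth": False},
    {"previous": 510, "current": 530, "met_projected_growth": False},
    {"previous": 490, "current": 470, "met_projected_growth": False},
    {"previous": 520, "current": 560, "met_projected_growth": True},
]
print(classify_growth(students))  # prints "decline": 1 improved vs. 2 declined
```

Note that in this sketch the excluded fourth student improved sharply, yet the school is still labeled in decline, mirroring the administrators’ point that the label can rest on a few students rather than the whole grade’s performance.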
While CSAP tests are an important external assessment of a school’s performance, Sheridan doesn’t agree with how the results are interpreted.
“We always want these statistics,” he said. “People give us statistics, but they don’t tell the story.”