Oregon school districts on Thursday released the results of the so-called Smarter Balanced assessments taken by students in grades three through eight and in high schools.
The tests are intended to measure how the state's students are doing with the Common Core curriculum standards that Oregon adopted in 2010. One of the advantages of using these Common Core standards is that it allows for better comparisons from state to state. Those comparisons can be useful: If Oregon students are lagging in one particular area, state leaders can look to see what's working in another state that might be doing better in that area.
But in terms of guidance from year to year about whether Oregon schools are improving, the Smarter Balanced assessment results offer just one piece of the puzzle; it would be a mistake to read too much into them, and smart school administrators will be careful about trumpeting big advances in the scores for fear that the numbers will slide back the very next year. (A footnote: The state still uses its Oregon Assessment of Knowledge and Skills test to measure students' proficiency in science; since the Common Core tests are more difficult than the previous state tests, this could explain why most of Linn County's schools did better in the science portion than in the mathematics and English language tests.)
With all that said, this year's batch of numbers is, in fact, a little underwhelming: The percentage of students who passed the tests slid back in most Linn County schools and across the state.
There are bright spots, to be sure: Schools in Scio, for example, easily surpassed the state average in the science test. It turns out that's because Scio teachers and administrators have made science a priority and are doing some smart things in that subject. We'd bet other school districts will be checking out Scio's science efforts.
In fact, that opportunity to share what's working with other schools remains one of the best things about these tests.
But the fact of the matter is that plenty of other metrics help to drive these overall test results. For example, parents and guardians might want to take note of a school's attendance rate: It stands to reason that students who attend school more frequently will do better than students who are chronically absent. (It also turns out that this common-sense finding is backed by considerable research.)
A high school's graduation rate can be a rough guide to that school's ability to identify students who might be at risk of falling through the cracks; schools are spending extra time these days trying to do a better job of shepherding those students through their rough patches. Much more work remains ahead on that front.
And if you want a sneak peek at how high school students will be performing on these tests in about a decade, take a closer look at how third-graders are performing on the English language tests: Research suggests that students who are reading at grade level by third grade greatly improve their chances of succeeding at school. Early intervention at that level could result in improved test scores and graduation rates down the line. (That early intervention turns out to be a lot less expensive, too.)
All of which is to suggest that it would be a mistake to read too much into this year's batch of test scores, although the numbers are interesting and, in some ways, useful. Instead, the best way to think of them is as conversation starters, as jumping-off points for a broader examination of school performance by parents, students, teachers, administrators, taxpayers and elected leaders. But don't fall into the trap of putting too much weight on these scores: They don't tell the entire story, and they never will.