What do they mean?

Once again, school assessment scores have been released, with the usual mixed bag of results that leave taxpayers, parents, students, and those intrigued by such information puzzling over what it all means.

There are encouraging bits of news to go along with the more problematic. On the plus side, The Dorset School posted strong results in fifth- and eighth-grade writing, and Arlington Memorial Middle School turned in great results in reading. Over at Burr and Burton Academy, 61 percent of 11th graders scored proficient in writing, substantially ahead of the statewide average, a pathetic-looking 43 percent. Of course, that raises the question of what counts as "proficient." It could mean almost anything, and coming up with reasonable, yet challenging, standards for what is proficient or acceptably good is one of the biggest hurdles in school assessment testing.

You can also cherry-pick your way around the massive report released last week by the state's Department of Education and find examples of half-empty glasses. Those interested should visit education.vermont.gov, where the data is sliced and diced in almost every way imaginable: by income, gender, racial background and other categories and sub-categories.

So what does it all mean?

We'd like to think it indicates that, slowly but surely, we are making headway on what Gov. Peter Shumlin has correctly identified as the major issue confronting the state's long-term economic competitiveness, its job-creating capacity and its quality of life: the ability of the state's education system to turn out a high number of highly qualified secondary school graduates ready for some level of post-secondary study, whether in a traditional four-year college setting or something else. Over the long term, nothing has a greater impact on people's ability to lead a satisfying life, or on their earning potential. While there are exceptions to everything, the big picture is clear. In a globalized, hyper-connected world, those who bring not only a strong core of knowledge to life but, more importantly, a restless curiosity about learning new things and mastering new skills will have the best chances at success, as they choose to define it.

But to be honest, we're not sure that hunch is money in the bank. How valid is the data, really? Since we're not seeing a sharp upward tilt in students passing these tests with "proficiency" or better, we'll assume the tests aren't being "dumbed down" to produce feel-good results that mask other deficiencies. Better indicators of performance might be how well Vermont does on national education assessments like the NAEP (National Assessment of Educational Progress) - which is pretty darn good - or on college acceptance rates and drop-out rates (again, substantially better than the national average, particularly in the latter category).

The NECAP tests (short for New England Common Assessment Program), which Vermont uses along with New Hampshire, Rhode Island and Maine to comply with the requirements of the much-maligned and controversial - if very well-intentioned - No Child Left Behind Act, are going away after next year. They'll be replaced in 2015 by a new set of tests known as the Smarter Balanced Assessment, aligned to a U.S. education initiative known as the Common Core. Forty-four states have so far signed on, which will give us a better sense of how Vermont's students are doing compared with the rest of the nation. The larger question might be how well U.S. students are doing compared with the rest of the world, a subject of much angst and teeth grinding in recent years, as students in such disparate places as Finland, South Korea and Singapore have posted test scores far beyond those of their American peers.

We hope the Smarter Balanced Assessment lives up to its name, but its arrival prompts a thought: what should these assessment tests be telling us and testing for, and how should they be structured?

We're journalists, not educational experts, but it seems that, first of all, these tests should have some direct bearing on and relevance to students. Educators, correct us if we're wrong, but our understanding is that right now, a student's performance on the NECAPs has no impact on grade-point average, class rank or long-term school record. Maybe it shouldn't. But if students had some skin in the game beyond personal and school pride, the tests might be taken more seriously, particularly at the high school level, when students tend to have a clearer view of such rewards and consequences than, let's say, your average third grader.

Secondly, these tests should not put a straitjacket on individual teachers' creativity and motivation to teach. In short, they shouldn't have to "teach to the test." If a given teacher has a strong background in European history from the French Revolution to World War I, he or she should be able to skew the teaching to reflect that interest, since that enthusiasm for learning is likely to rub off on students.

Thirdly, the assessments should test for the ability to acquire new knowledge.

The stakes are too high to stop tinkering until we get this right.

There are many models of successful educational strategy, and the penalties for failure are high, getting higher, but slow to emerge. They go beyond the schoolyard as well. As one former Secretary of State once wrote, "It Takes a Village." Hillary Clinton was surely right about that. School results aren't just a reflection of individual schools - they are, to a degree, a report card on the entire community.
