Instructional Coaches = Magic Pill?

The News Journal has an article out today discussing the mid-year results of the Delaware Comprehensive Assessment System (DCAS). (Sidenote: My apologies if you don’t subscribe to The News Journal. I (obviously) won’t be reprinting whole articles here, but I will paraphrase key points.) DCAS is taken at three points during the year to measure a student’s progress on grade-level curriculum. I won’t debate the efficacy of this test as it relates to certain segments of our students; I’ll save that for another post.

I also won’t get into the results of the test, as that’s apparently why we have data coaches — to assist our teachers with data interpretation and tell us how to magically make those students’ scores increase, external influences be damned! I will get into a conclusion that writer Wade Malcolm — perhaps illogically — draws in relation to certain successes from the winter scores in one school district:

The use of instructional coaches — funded by the state’s Race to the Top initiative — is part of the reason Indian River School District has so far exceeded the state average in reading and math for every grade level tested.

Now, it’s wonderful that Indian River School District has exceeded the state average in reading and math. We should absolutely magnify success whenever it’s shown at any of our public schools, be they traditional, magnet, charter, or otherwise.

I’m just a little interested in how a District — or Mr. Malcolm, in this case — can draw the conclusion that “the use of instructional coaches…is part of the reason Indian River” saw such success. We educate in an almost myopically single-focused environment of “Data! Data! Data!” — so I’m wondering if Indian River has ACTUAL data to prove that the instructional coaches were a part of that success.

Or is this just the District’s attempt to find some — any! — successes in the state’s Race to the Top program, which has proven to be not at all popular among many teachers across the state?