Friday, January 28, 2011

Time to start paying attention to value-added assessment

(if you weren't already)
One of the conditions of accepting money from the Race to the Top competition is a shift to evaluating teachers at least in part on so-called "value-added assessment." This boils down to asking "how much did a kid's test scores go up while the student had that teacher?"
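In its simplest form, that calculation looks something like the sketch below. This is a minimal illustration with made-up teachers and scores; real value-added models are regression-based and attempt to control for things like prior achievement and student background, none of which appears here.

```python
# Naive "value-added" idea described above: average test-score gain
# per teacher. All names and numbers are hypothetical.
from collections import defaultdict

def naive_value_added(records):
    """records: list of (teacher, pre_score, post_score) tuples."""
    gains = defaultdict(list)
    for teacher, pre, post in records:
        gains[teacher].append(post - pre)  # raw score gain for one student
    # A teacher's "value added" = mean gain across that teacher's students
    return {t: sum(g) / len(g) for t, g in gains.items()}

# Hypothetical example: two teachers, three students each
records = [
    ("Ms. A", 62, 74), ("Ms. A", 55, 60), ("Ms. A", 70, 79),
    ("Mr. B", 68, 70), ("Mr. B", 80, 78), ("Mr. B", 59, 66),
]
print(naive_value_added(records))  # {'Ms. A': 8.67, 'Mr. B': 2.33}
```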
Sounds pretty straightforward, right?
It is, until you see how it actually works out. The Gates Foundation last month released a study touting how results from two tests correlated with one another (suggesting that both measures work), but a review of that study argues the results are not nearly so neat. The Answer Sheet has a good summary of all of this, and the key point is this:

“In other words...teacher evaluations based on observed state test outcomes are only slightly better than coin tosses at identifying teachers whose students perform unusually well or badly on assessments of conceptual understanding. This result, underplayed in the MET report, reinforces a number of serious concerns that have been raised about the use of VAMs for teacher evaluations.”


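To see why "slightly better than coin tosses" is a fair gloss, here is a small simulation. It is illustrative only: the correlation of 0.2 is an assumed value, not a figure from the MET study. With only a modest correlation between two measures, knowing a teacher is above the median on one barely improves on a coin flip for predicting the other.

```python
# Illustrative simulation (not data from the MET study): with a modest
# correlation r between two measures of teacher effectiveness, being
# above-median on one barely predicts being above-median on the other.
# The value r = 0.2 below is an assumption for illustration.
import math
import random

def simulate(r, n=100_000, seed=1):
    rng = random.Random(seed)
    hits = total = 0
    for _ in range(n):
        x = rng.gauss(0, 1)  # measure 1 (e.g., a state-test VAM score)
        y = r * x + math.sqrt(1 - r * r) * rng.gauss(0, 1)  # measure 2, correlated at r
        if x > 0:            # teacher looks above-median on measure 1...
            total += 1
            hits += (y > 0)  # ...is the teacher above-median on measure 2?
    return hits / total

print(f"{simulate(0.2):.1%}")  # ~56% -- only slightly better than a 50/50 coin toss
```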
I'd like something a bit more effective than a coin toss deciding if my children's teachers are doing a good job.
