Tuesday, May 13, 2014

When are we going to give up on this silly idea?

Value-added measurements of teachers DON'T WORK.
In a study published Tuesday in the peer-reviewed journal Educational Evaluation and Policy Analysis, Morgan Polikoff, an assistant professor of education at the University of Southern California in Los Angeles, and Andrew Porter, an education professor at the University of Pennsylvania in Philadelphia, found no association between value-added results and other widely accepted measures of teaching quality, such as the degree to which instruction is aligned with state standards or the content of assessments. Nor did the study find any association between “multiple measure” ratings (which combine value-added scores with classroom observations and other factors) and the amount and type of content covered in classrooms.
In other words, the gains children make on their standardized test scores over the course of the year have no association with the good things that are happening in the classroom.
But there's more: students are not randomly assigned to classrooms (as we all have been pointing out for years), so the group of students a teacher gets (whatever their inclinations on testing) is 'chosen' by someone, usually the principal.
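A toy simulation makes this point concrete. The setup below is entirely hypothetical (the numbers and sorting rule are made up for illustration, not drawn from any study): two teachers with zero true effect on learning, but higher-achieving students both grow faster and get steered to one classroom. A naive value-added score still "finds" a gap between the teachers.

```python
# Hypothetical sketch: value-added bias from non-random classroom assignment.
# Both teachers have ZERO true effect; the gap comes only from who was assigned.
import random

random.seed(0)

def simulate():
    # 200 students with baseline ability; assume growth correlates with ability.
    students = [random.gauss(0, 1) for _ in range(200)]
    # Non-random sorting: the principal sends the top half to teacher A.
    students.sort()
    class_b, class_a = students[:100], students[100:]

    def value_added(cls):
        # Gain = ability-driven growth + noise; the teacher contributes nothing.
        gains = [0.5 * ability + random.gauss(0, 0.5) for ability in cls]
        return sum(gains) / len(gains)

    return value_added(class_a), value_added(class_b)

va_a, va_b = simulate()
print(f"Teacher A value-added: {va_a:+.2f}")
print(f"Teacher B value-added: {va_b:+.2f}")
```

Despite identical (null) teacher effects, teacher A's average gain comes out well above teacher B's, purely as an artifact of assignment.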
And, to finish, my favorite study: teachers impact their students' heights just as much as they impact their students' test scores.
