Friday, April 25, 2014

The insanity of the push to evaluate teacher training programs

A telling couple of paragraphs from today's Politico article:
A recent study funded by the Education Department found that value-added measures may fluctuate significantly due to factors beyond the teachers’ control, including random events such as a dog barking loudly outside a classroom window, distracting students during their standardized test. A 2010 study, also funded by the Education Department, found the models misidentify as many as 50 percent of teachers — pegging them as average when they’re actually better or worse than their peers, or singling them out for praise or condemnation when they’re actually average.
Yet another challenge: Calculating scores for educators who do not teach subjects or grades assessed with standardized exams. Nationally, some 70 percent of teachers — including most high school and early elementary teachers, plus art, music and physical education teachers — fall into that category.
Despite such complications, Muñoz made clear in a call with reporters on Thursday that Obama wants student test scores, or other measures of student growth, to figure heavily into states’ evaluations of teacher prep programs.
“This is something the president has a real sense of urgency about,” she said. “What happens in the classroom matters. It doesn’t just matter — it’s the whole ballgame.” So using student outcomes to evaluate teacher preparation programs “is really fundamental to making sure we’re successful,” Muñoz said. “We believe that’s a concept … whose time has come.”
Thus:

  • it's junk science;
  • the Administration knows it's junk science, because one of its own departments says so;
  • we're going to do it anyway, because its "time has come."

This. Is. Insane.
(h/t to Perdido Street School for the link)
