Naturally, I was interested in what we were getting dinged for, so I checked their rating system. Note that all of this is based on our accountability system, and specifically on the testing within it.
A state gets one star if districts get more credit for students achieving advanced.
A state gets one star if "progress of all students" counts.
Those two we get.
A state gets a star if "gifted students" are a subgroup.
Nope, we don't do that.
And a state gets a star if student growth counts for at least 50% of their district rating.
Nope. We use student growth in rating districts, but not for that much. In Massachusetts, student growth counts for 25% of accountability, and we've had to fight the ed reformers to get that far!
Cast your mind back to June 2014: the Board of Education adopted a new measure for district accountability, which included that 25% weight on student growth. There had been pressure for as much as 30%.
And here's the real kicker: this is the system that determines the lowest 10% of districts and, in turn, determines where charter schools can really expand. Let's pause to realize that these numbers can be changed, so the "lowest 10%" isn't at all a hard-and-fast set of districts; we already demonstrate this by weighting growth differently for schools than for districts. Also, know that the big behind-the-scenes battle in Massachusetts in 2014 was over precisely how much student growth would count. If student growth counts for "too much," Boston leaves the list of lowest-performing districts.
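To see why the growth weight matters so much, here's a minimal sketch. All district names and scores below are invented for illustration, and the simple weighted blend is an assumption, not the state's actual formula; the point is only that raising the weight on growth can pull a high-growth district out of the bottom of the rankings.

```python
# Hypothetical sketch: how the weight on student growth can change which
# districts land in the "lowest 10%". Names and numbers are invented;
# this is NOT real Massachusetts data or the state's actual formula.

def rate(achievement, growth, growth_weight):
    """Blend achievement and growth percentiles into one rating."""
    return (1 - growth_weight) * achievement + growth_weight * growth

# (achievement percentile, growth percentile) -- invented
districts = {
    "Urban A":  (15, 70),   # low achievement, very high growth
    "Urban B":  (25, 60),
    "Suburb C": (60, 30),
    "Suburb D": (30, 15),   # low on both
    "Rural E":  (32, 20),
}

def lowest(districts, growth_weight, n=2):
    """Return the n lowest-rated districts at a given growth weight."""
    ranked = sorted(districts, key=lambda d: rate(*districts[d], growth_weight))
    return ranked[:n]

print(lowest(districts, 0.25))  # growth counts for 25%
print(lowest(districts, 0.50))  # growth counts for 50%
```

With growth at 25%, the high-growth "Urban A" sits in the bottom two; move growth to 50% and it climbs out, replaced by "Rural E". Same districts, same test results, different list.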
And that would severely curtail the rampant charter growth in Boston.
It's one of the big secrets in Massachusetts public education: it's some of our urban districts that are getting the highest rates of student growth.
Be careful what you wish for, there, ed reformers!
h/t to Rob for the clarification!