Thursday, May 12, 2016

DESE Accountability survey and what it doesn't ask

This article from EdWeek on how Colorado is handling its responsibility to reimagine its state accountability system under the new federal education law caught my eye yesterday. Colorado officials are holding town hall meetings across the state asking "What do we need to fix?" And they're getting an earful:
"I see this as a golden opportunity to rethink our entire accountability system," she said. Speaking from a torn-out sheet of notebook paper, Jones cited a long list of complaints: The department has become too heavy-handed. Its labeling of schools is demoralizing. Its standards are inconsistent.

Today brings a similar article from Florida, where the process will begin with an online forum and go from there.

In Massachusetts, the feedback method that DESE has chosen, at least as an opener, is this survey, which I took this morning.
Let me start by saying that the first option the survey gives you is to upload a letter to them, in lieu of or in addition to taking the survey. I'd highly recommend you take them up on it.

The survey starts with a series of ranges, in which you mark along a line which end you are closest to. The ends are:

  • simplicity v robustness in what is measured
  • comparability v rich detail in what is measured
  • inputs v outputs in classrooms
  • achievement v growth in results
  • comparison v standard in measuring districts
  • lowest performance of a school in the district v overall performance of all schools in the district as the district measure
Now if you spend a lot of time at state accountability meetings, all of the above is pretty common terminology. But for most of the rest of the population, the above ranges are pretty meaningless, even with the attempts at explanations given with each question. 
This isn't how people--regular people--talk about education. 
And having this be the survey is extremely off-putting.

These, and the questions that follow about things like achievement and growth, identifying schools by performance levels, and the like, all assume one thing: we're going to measure schools in some very standard, graphable way.

But is that really the way people--parents, kids, neighbors--talk about their schools? 
Yes, MCAS scores come up, but there's a whole list of things parents (in particular) want to know about their schools: how clean is it? do they have enough supplies? how long is recess? do the kids get art/gym/music every week? for how long? what happens if a kid misbehaves in class? how is special education handled? how are students with ELL needs taken care of? are kids treated fairly by teachers? who are the kids that go to this school? are the elementary kids really getting social studies? what kinds of classes can the high school students take? do they have access to AP/dual enrollment/IB/etc/etc classes? how often is the bus late? do we need to pay for sports? what kinds of colleges do kids go to after this school? what kinds of things do the kids who don't go to college do? are they ready for it? do graduates stay in town or move elsewhere?

...and the list goes on.

MCAS touches literally none of the above.
And the survey touches none of the above.

We have a chance here, Massachusetts, to have a REAL conversation about how our schools are doing and how we measure it.
So far, we're missing the chance.

UPDATE: This also appears to be a concern for Congressional Democrats:
"Unfortunately, as states embark on plan development, there are early reports of systemic barriers impeding the participation of teachers, paraprofessionals, specialized instructional support personnel, parents, and other stakeholders in state and local plan development," Murray and Scott wrote to King. "For example, lack of consideration for working parents and community members in scheduling meetings with stakeholders or the inability of teachers, paraprofessionals, and other school personnel to secure release time to enable full participation in plan development." 
