Fair enough: I knew it was coming. I've been warning for months (is it years?) that when the new accountability results were released, there was going to be a real temptation to jump straight to MCAS results and entirely miss a lot of other important information that was also included.
And sure enough, here we are, with MCAS ranking lists and headlines that enshrine single-digit changes in results.
To be fair, there have been exceptions: despite the alarmist headline, Kat McKiernan at the Boston Herald covered some important information, and Aimee Chiavaroli at South Coast Today got probably the best headline I saw and kept the focus on accountability results throughout. This MassLive follow-up from Shira Shoenberg took the time to understand the shifts, as well as to give a round-up of quotes.
And while it is always easy to rant about press coverage, the press was hardly alone. If you've dismissed this system as no real change, you not only missed important information; you also dismissed, in the process, a lot of important work that schools are doing and that is being recognized.
The loud complaint remains "this is majority testing." Yes. And that's federally required.
The other complaint is "there's still a rank ordering system." Yes. And that's also federally required (and if your argument is they don't need to release the entire list, please realize that it would be a public document required to be released upon request).
There also has been the "concern" that moving away from straight achievement as the determiner makes this an illegitimate process. If that's your argument, I don't know what you're doing near education.
My feeling on this--and yes, it's my feeling on this--is that we can and absolutely should have conversations about what schools should be and how we know that. This release should be part of this; the report cards coming in winter should be part of this; the work like that of MCIEA should be part of this; frankly, every district should be having conversations around how they share what they're about, and how they talk about what they're good at and what they're working on. But any of the reactions that just dismiss without engaging don't actually further the process. We live with ESSA as the federal law; we should ask not only what we are doing but also what we can do as a path forward.
So where does that leave us?
You can find the presentation (for download) on the changed system on the MASC website. I'll drop a few slides in here for clarity, but it's useful to have at hand when trying to decipher your own district or school results.
First, note that there are really two measures for each school and district:
- The normative measure, which essentially is "how did you do against everybody else?" This puts it closest to what we might think of as the former measure, particularly in that it determines the "lowest performing" category, as required under ESSA.
- What they're calling the "criterion-referenced" measure, which measures the progress made by your district or school in each category. Part of what's novel in this section is that you're not only measured on the whole; half of the credit here comes from what's going on with the lowest performers. This part gets measured by how you do against targets: you can exceed them, meet them, increase but not meet them, not change, or decrease, scored 4 down to 0 (a minimal sketch of that scoring follows this list). That measure happens both for the whole student population being evaluated and for the lowest performing group.
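For concreteness, here is a minimal sketch in Python of that five-outcome scoring. The function name, inputs, and exact comparisons are my own assumptions for illustration only; DESE's actual calculation handles measurement bands and edge cases that this ignores.

```python
# A hypothetical sketch of the five-outcome target scoring described
# above; names and comparison logic are illustrative, not DESE's.

def target_points(baseline: float, current: float, target: float) -> int:
    """Score one indicator against its improvement target:
    4 = exceeded the target, 3 = met it, 2 = improved but fell short,
    1 = no change from baseline, 0 = declined."""
    if current > target:
        return 4
    if current == target:
        return 3
    if current > baseline:
        return 2
    if current == baseline:
        return 1
    return 0

# A school starting at 48.0 with a target of 52.0 that lands at 50.5
# improved but didn't meet its target, so it earns 2 of 4 points.
print(target_points(48.0, 50.5, 52.0))  # -> 2
```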
In both cases, districts and schools are being evaluated on more than MCAS scores, and certainly on more than MCAS achievement (thus my frustration with the Globe and others printing rank-ordered MCAS scores as if they were worthy evaluations; many are seeking exactly that sort of simplistic validation, in any case).
The normative measure looks like this, first for the schools that don't have high school graduation, then for the schools that do:
Now, this is still majority MCAS, and there are extensive notes on this blog from Board of Ed meetings as to why that is the case (a lot of it is ESSA). The weight of growth also hasn't changed, and that was also heavily discussed at Board of Ed meetings. But we also (as required) have English learners represented here for those who have them (entirely by growth) and chronic absenteeism weighs in, too, in both sections.
And check out what's happening with the high school: the additional indicator adds completion of advanced coursework, which is important in conversations around equity. Further, the high school graduation rate--something that was already being looked at--now also includes an extended engagement rate, which is designed to capture the kids who come back or who hang on after their fourth year. That is difficult work (and something some districts really have been working hard at) that matters.
That information, crunched, is for the most part what generates the flagging of schools and districts as in need of various kinds of assistance. There has to be a declaration of some kind (ESSA again), and this is what is being used for that.
Do note what happens if a district or school doesn't have enough English learner students to be measured: it bumps up the importance of achievement and growth in MCAS. My sense is that these are also districts and schools that would, for the most part, be doing well on achievement within MCAS in any case, and this gives an extra boost to that portion of their evaluation.
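To see why, here is a hedged sketch of the reweighting. The weights and indicator names below are placeholders I've invented, not DESE's actual percentages; the point is the mechanism, which is that a missing indicator's weight gets spread across what remains.

```python
# Placeholder weights and indicator names for illustration only;
# DESE's actual percentages differ. The point is the renormalization:
# when an indicator is missing, its weight shifts to the others.

def composite(scores: dict[str, float], weights: dict[str, float]) -> float:
    """Weighted average over the indicators a school actually has;
    missing ones (e.g., too few English learners to measure) are
    dropped and the remaining weights are renormalized."""
    present = {k: w for k, w in weights.items() if k in scores}
    total = sum(present.values())
    return sum(scores[k] * w / total for k, w in present.items())

weights = {"achievement": 0.55, "growth": 0.25,
           "el_progress": 0.10, "chronic_absence": 0.10}

with_el = {"achievement": 70, "growth": 55,
           "el_progress": 60, "chronic_absence": 65}
no_el = {"achievement": 70, "growth": 55, "chronic_absence": 65}

# Without the EL indicator, achievement's effective weight rises from
# 0.55 to about 0.61, so an achievement-strong school scores higher.
print(composite(with_el, weights))  # ~64.75
print(composite(no_el, weights))    # ~65.28
```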
As much as this piece gets emphasized, I'd argue that the other part is much more interesting, since it looks at how you did based on where you have been rather than compared to everyone else.
And so the criterion-referenced piece:
These are the same measures as in the normative section above (broken apart), and each one is looked at twice: once for the whole student population involved, and once for the lowest-performing students. Each side counts for half.
And this is about meeting your targets (not about how you did compared to the guy next door), so there are the five options I mentioned above: you can exceed them, meet them, increase but not meet them, not change, or decrease, giving a score of 0 to 4 for each one.
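Purely as a sketch (with made-up indicator names), the half-and-half arithmetic looks something like this:

```python
# A minimal sketch of the 50/50 split described above: each indicator
# gets a 0-4 target score twice, once for all students and once for
# the lowest performers, and the two halves carry equal weight.
# Indicator names are illustrative, not DESE's.

def criterion_score(all_students: dict[str, int],
                    lowest_performers: dict[str, int]) -> float:
    """Average the 0-4 points across indicators for each group,
    then weight the two groups equally."""
    def avg(points):
        return sum(points.values()) / len(points)
    return 0.5 * avg(all_students) + 0.5 * avg(lowest_performers)

all_students = {"ela": 3, "math": 2, "absenteeism": 4}
lowest_performers = {"ela": 1, "math": 2, "absenteeism": 3}

# (3 + 2 + 4) / 3 = 3.0 for all students; (1 + 2 + 3) / 3 = 2.0 for
# the lowest performers; equal halves give 2.5 overall.
print(criterion_score(all_students, lowest_performers))  # -> 2.5
```

Note how a school can do fine on the whole-population side and still lose ground on the lowest-performers side; that is the design.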
This section is much more interesting, both because it doesn't simply make this a silly race and because it has that focus on the struggling kids. Most schools have had some sort of focus on the test scores of struggling kids, but do they know which kids are struggling most with absences? Or who isn't taking advanced coursework? Or what their work with kids after four years of high school looks like? This is the time to ask.
Note that if you pull these results up from the DESE District Profiles, you can click on the tab that has more details. I would urge you to do so (and more from me to come on that for Worcester).
But where are my levels, some ask.
We don't have them anymore! And those who are trying to make the assistance tagging into that are missing a crucial point, which hasn't been emphasized nearly enough to the field: some of this is going to be about having a plan forward.
Here's where those conversations come in, though:
Note that the focused/targeted support section comes from A) a low graduation rate (another ESSA requirement) and B) low test participation. While I don't know what the message to the superintendents has been, the message to the Board of Ed has been about plans; the question to districts is more about noting and responding than anything else. There has even been recognition that alternative schools are in many cases going to live in this section, due to their graduation rates, and that'll require flagging but is understood.
The broad/comprehensive support is where the receivership schools and districts are, plus the districts that hadn't been released from Level 4. Those have plans and are continuing with them. Though the determination of their getting out is still going to be based largely on the normative ranking (at the Commissioner's discretion), that now includes more than it did.
Most districts and schools are living in the "partially meeting" category, and that is what I think is most interesting and what I'd urge people to explore. Which categories aren't being met? For whom? And what is being done about that?
More to come looking at particular schools and districts, but for now, do as Mr. Rogers urges.