I am of course posting this after the Worcester School Committee meeting at which this was discussed. That's partly because this isn't intended as anything other than my thoughts, but it's also been rather a bleak week, a hard one in which to pull together a through line to write.
I haven't been paying a lot of attention to Worcester School Committee agendas in recent months, but the T&G headline "Worcester schools set goals for this school year. How are they doing?" caught my eye earlier this week.
Contrary to the impression left by the article, the goals being discussed are not ones set by the superintendent for the school district, but are the goals set by the School Committee for the superintendent. Taking a quick skim through them, though, I can understand why this was misunderstood, as there doesn't seem to be much here that is about the superintendent directly. While goals for a superintendent nearly always do tackle the work of the district (and particularly in one the size of Worcester!), generally at least the professional practice goal is about the work of the superintendent as an individual educator, and how they are improving their own professional practice.1
In any case, these are the goals the School Committee set--evaluators in Massachusetts set the goals for educators being evaluated--and this report is the formative assessment of the superintendent evaluation cycle. The assessment is "formative" because it's literally still being formed; this is the chance for the educator being evaluated to talk through where they're at, and for the evaluator (in this case, the School Committee) to give feedback. They can even, if warranted, shift goals at this point in the cycle.

But it isn't, of course, just there; you might remember the girl who was suspended from school in Louisiana after boys created content about her and she retaliated physically.
- SFGate reported earlier this week on Sam Nelson, who for 18 months took advice from ChatGPT on drug use, until he overdosed last May at the age of 19.
- Nearly every week, it seems, we have another article chronicling the mental health damage, frequently too late, that these products are doing to our children.
Yesterday, Google and Character.ai settled lawsuits from multiple families in multiple states over their children's suicides. No word that I could find on the terms of the settlements, but we know two things: that it won't bring the children back, and that it isn't stopping companies from pushing these products to children.
- As McGill University reported last month, AI-generated videos are misleading seniors about health issues. Benjamin Riley shared about the role of AI in the death of his own father earlier this week.
- violence is common;
- kids are growing up fast (and not in a good way);
- kids can't unplug from digital stress;
- the tech is causing rifts at home.
When a teacher resists jumping on the AI-in-education bandwagon, they are not being timid or out of touch. When they plan their lessons and grade papers without AI, they are not wasting time. When they don’t teach students how to use it, they are not being irresponsible. By focusing not on their students’ ability to use generative AI but on their students’ ability to be generative and thus thrive in a world that can sustain them, they are absolutely thinking about the future.
[Image via the Washington Post. Towns, of course, are something that the program literally just needs to copy off the map.]
I could go on and on and on (and have, if you follow the blog), but it is not getting better; it's doing more and more damage. As Audrey Watters says in her newsletter today:
...it really does boggle my mind there are still those who insist that they can wrest "AI" into "doing good," as if technofascism can readily be reshaped for any sort of truly "generative" purpose, as if one hundred years of teaching machines has brought us anywhere other than, to borrow from B. F. Skinner, a world now truly spiraling "beyond freedom and dignity."
And she links the Grok news to AI in general:
...the proliferation of “undressing” technology should remind us that the lack of consent is a fundamental element of "AI" – data and content taken without our permission, text and images "generated" without our permission, algorithmic decision-making without our permission, that little sparkly "AI" icon forced into our everyday software and thus everyday lives without our permission.
We are owed, here and elsewhere, a much more fundamental conversation about what we're doing in schools with this. It must be based, not on fear of falling behind, but on a clear-eyed view of what it's costing us.5
_______________________________________________
2 among them that the Committee is not going to be able to evaluate this year on school redistricting, and multi-year goals can't be set for new superintendents...
And more to the point here, this is in no way a requirement of districts.
I haven't been this disappointed in so many people since COVID.
