Tuesday, September 22, 2015

Student Assessment report: recap at the Board of Ed

backup here
Chester: fiscal implications and final results of the study on testing time are on for today
report: cost variables around assessment choice
spending has risen: $24M in state appropriation this year, plus another $13M in federal grants, spent on assessment
FY14: $32M
FY15: $37M
FY16: $37M
less than 0.5% of our annual appropriation
includes DESE staff and assessment contracts
"Very important dollars to the program, but very few dollars to the state budget"
contracts are with Pearson [$14M] (for PARCC), Measured Progress [$16.6M] (for MCAS), and WIDA [$2.1M] (for ACCESS)
predicting future costs depends on numerous variables:
  • PARCC is driven by student volume (beyond MA) and the optional services chosen; the price agreement with Pearson is good through FY18, with the possibility of renegotiating in FY17; the computer vs. paper mix also affects costs. Per student, the computer-based test is $24 and the paper-based test is $33 for grades 3-8 (both subjects together); for high school, computer-based is $12.50 and paper-based is $18
  • MCAS needs a new procurement in the 2016-17 school year, with revisions to the current tests and the addition of grades 9 & 11. Any changes would have to go out to bid, so cost figures aren't a true representation until we've specified what we're looking for. "looking at a good solid year to pull it off well," though Chester adds they'll pull together "likely benchmarks" for the Board. Not "an apples to apples" comparison, but about $42 per student (a figure from FY14, when all students were taking MCAS). We're in the first of two one-year extension periods.
"Presently there is no clear conclusion that either assessment is more or less expensive than the other."
cost models depend on what we want to assume: more computers? less human scoring?
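To make that concrete, here's a minimal back-of-the-envelope sketch of how those assumptions move the total, using the Pearson per-student prices quoted above; the enrollment counts and the computer-based shares are hypothetical, for illustration only:

```python
# Toy PARCC cost model built on the per-student prices quoted above
# ($24 computer / $33 paper for grades 3-8; $12.50 / $18 for high school).
# The enrollment figures below are made up, not DESE numbers.

PRICES = {
    "grades_3_8": {"computer": 24.00, "paper": 33.00},
    "high_school": {"computer": 12.50, "paper": 18.00},
}

def parcc_cost(enrollment, computer_share):
    """Total cost given per-band enrollment and the fraction of
    students testing on computers (0.0 = all paper, 1.0 = all online)."""
    total = 0.0
    for band, students in enrollment.items():
        price = PRICES[band]
        total += students * (computer_share * price["computer"] +
                             (1 - computer_share) * price["paper"])
    return total

# Hypothetical cohort: 450,000 students in grades 3-8, 70,000 in high school.
cohort = {"grades_3_8": 450_000, "high_school": 70_000}

for share in (0.0, 0.5, 1.0):
    print(f"{share:.0%} computer-based: ${parcc_cost(cohort, share):,.0f}")
```

Even a toy version like this shows why the computer/paper split matters: at the quoted prices, moving an all-paper cohort fully online would trim the bill by roughly a quarter.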
Chester: artificial intelligence scoring; can teach a computer to score essays
"our assumption is that we would want to continue human scoring until we're certain" of the artificial intelligence scoring
Willyard "for the cost is worth giving up control, if PARCC is less expensive"
Fryer: shouldn't take the numbers too literally at this point; "I'd imagine Pearson at least knows" how much each state in the consortium saves us
"what are the economies of scope? What are we getting economically from this consortium?"
Chester: primary consideration should be the quality of the assessment
PARCC development was paid for, not by states, but by the federal government: "about $185M development effort"
If we were to develop this "it would be a pretty substantial cost"
Fryer: should separate this into a discussion about governance and a discussion of the quality of the test
"if we don't know the costs and we don't know the quality in five years" (on PARCC)
McKenna: who owns the intellectual property?
Wulfson: owned by the consortium of states, which makes it available, at a cost, to states that are not members of the consortium
McKenna: an interesting option; we wouldn't be under the consortium, but could use its work
Wulfson: wrestling with the degree to which having other states do that weakens the larger point of the consortium
Peyser: can you be in the consortium and not contract with Pearson? No.
Wulfson: online test which uses the Pearson online platform; cost to develop our own would be prohibitive
Willyard: could use platform of Pearson without PARCC content? possibly...
"but there's options, then"
response to Fryer: there are three tiers of pricing, and those prices are the bottom tier; we're actually slightly below the bottom tier, but Pearson is holding the price
"consortium could grow in future years, as well"
McKenna: is there anything that protects us on the upside of costs?
Sagan "the Hotel California problem"
McKenna: they can only ever raise costs by X amount
Wulfson: under state procurement, we can only have contracts for so many years
have made several changes in testing contractors over the years
for the life of the contract you're protected on the price
Stewart: saw stats on district support for broadband etc.; any sense of how long it will take to get everyone there?
Wulfson: a question of political will, not just funding; not only the suburbs, but some of the urban areas
question of what this board sets as a policy
if we set it by 2017, "I have no doubt that we could make that happen"
Chester: there's a whole conversation about whether there should be one contract, or more than one model
Smarter Balanced (SBAC) is the other
very different models: governance does not involve Commissioners, the test is held by UCLA, and each state finds its own test deliverer
"Smarter Balanced experience have been incredibly variable"
Nevada in lawsuit over it; others had a great experience
"very little confidence in comperability"
"PARCC consortium has had a very successful administration"
"nobody crashed and burned"
80% of students (across states) took it online
McKenna: how many states are in SBAC? 18
Peyser: on cost, there are numbers for various components within PARCC; he questions the averaging
has there been a negotiation over the services that are provided? Are the services the same for MCAS and PARCC?
Wulfson: hard one to answer, tests at very different stages
Morton asks for some guidance on the governance issues: need to know what control we'll have, what control we'll lose, what the impact on costs will be
Wulfson: will happen next month
Morton: interested in school districts that are not up to speed on technology
Sagan asks that it be by area as well as demographic
Chester: next month, Monday and as much of Tuesday as necessary for discussion
a number of studies to look at
"hope to bring some educators in to look at the assessment"
"...going to be snowed with opinion about what you should and shouldn't do. All of that is important. Some of that opinion is based on actual experience with the test. Some is based on those with no experience with the test."
Sagan: "sure that we aren't getting out on time" at the October meeting
McKenna asks that the reports coming out from the Secretary's office be shared as early as possible
Fryer asks for specifics about governance model on paper; Sagan wants to hand it to the "guy from the consortium" and see if he agrees with what we think is the governance model
Willyard asks if we can reach out to states that have dropped out; back and forth over whether that would be political; a grimace from the Commissioner at this
Stewart: the impact of the decision on teaching and learning at schools
Chester: "you've had some phenominal experience at that in the Lexington School Committee"
shine the light on achievement gaps "at what are otherwise lighthouse districts"
"in addition there's a lot of feedback that educators get"


PARCC perceptions
survey done by Stand for Children/MassInc of principals (285)
TeachPlus teacher survey (1,014 teachers from across the country)
computer based survey of students during PARCC (about 127,000 students)
survey of test administrators (693 for two)
principals: nobody thinks PARCC is less demanding; 40% think it will better assess critical thinking
TeachPlus (not a representative sample): 72% of the MA respondents thought it was a higher-quality assessment
McKenna: were these TeachPlus teachers?
Was the PD run by TeachPlus? McKenna points out that TeachPlus has come out strongly in favor of PARCC
Doherty: "it's not a grain of sand you have to take it with; it's a bucket of sand"
McKenna: "they have a certain ideology that some teachers would adhere to and some would not"
more and more test administrators have used a computer based test
only about half feel that the online training got them ready to administer the test
asked administrators if most finished early/on time: a large majority finished on time or early (a little higher on computer based than on paper)
students said most questions were not on things that they hadn't learned
about half say it was easier than or the same as their schoolwork
almost all finished early or on time
about two-thirds said they preferred the computer
a quarter said that they had technology interface issues
Chester: raises the question of whether adults are putting a ceiling on what students are capable of; adults were concerned when kids were not
Fryer asks for the rest of the responses

District assessment practices: 
surveyed 38% of superintendents
interviewed a representative sample of 35 districts
4 districts had more in-depth case studies
challenging for people to respond off the cuff as to how much time they spend on things
find comparables for student performance
most commonly used for "addressing student needs"
state required: two tests, two sessions per test
elementary average 8.3 sessions
middle 7.5 sessions
high school 5.6 sessions
Sagan: consistent in time? No, approximate
Peyser: 19 sessions of summative testing across all grades
district 60%, state 40%
interesting variation across these districts: one district spends 1 day up through grade 8 and 4 days in grades 9-12; another, 8-12 days in elementary
THIS ISN'T POSSIBLE; THE MCAS TAKES LONGER THAN THAT!
Willyard: district 4 is the one I can relate to most; he adds up his days of finals, actual MCAS & PARCC, and mock MCAS and PARCC
"when we talk about only a couple of days, five or less...a lot of [students] chuckled at this"
"do we know if this really the sentiment of all schools in the Commonwealth"
Sagan: it's hard to reconcile this with the experience in the classroom
preparation for MCAS assessment: curriculum and instruction, some test-taking strategies
how to plan the MCAS schedule: districts allow schools to set the schedule; little new content on test days; "minimal disruption for most non-tested students"
Noyce: what should we do?
Chester: the longest one isn't an extreme, isolated example
there are assessments that aren't particularly contributing to instruction, which districts could abandon
Fryer: compare with student growth and share best practices
Chester: this goes back quite a ways
often a mindset that you do some very narrow preparation
"my understanding is that it is those insitutions that aim high in terms of intellectual mission...don't need that narrow prep"
Moriarity: recommendation against holding assemblies; it's counterproductive

