Chester:
"not everyone is in agreement on what would be the right course of action"
"very strongly about the need to move and to move expeditiously"
two year timeline: full set of experience with the PARCC assessment
"we've had quite a bit of experience and feedback and you've heard a lot of it"
"the more I've thought of it, the sharper my thinking has become"
"depth and range of feedback you're getting"
systematic analyses, strongly held opinions
"have been concerned that we identify the forest for the trees"
Three principles as we move forward (this from Chester):
- MCAS has reached the point of diminishing returns. "Last year was the 18th year we'd administered MCAS"; its greatest impact is now in places where there's a lot of energy on having students succeed on a test rather than on being successful.
- Having looked at an awful lot of information on PARCC, "it's clear to me that PARCC is a strong advancement on MCAS"
- Whatever path we pursue, "we need to be in control of our destiny." Opens options beyond a binary decision
"what's been colloquially referred to as Door 3"
"capitalizing on the investment we've made in PARCC"
"Massachusetts has played a leadership role...how can we capitalize on it"
Well over $100M investment, can't replicate
capitalizes on PARCC development, but we as a Commonwealth decide what is and isn't in it
Owe it to our educators, students, families, "to move ahead, as standing still with MCAS is not a path that I would recommend"
Bob Lee and Jeff Wulfson:
Lee: trying to provide some context for these results
98% of students took the PARCC test (not many opt-outs)
selected statistically to represent state population from 2014 both demographically and in academic achievement
"though their accountability level can't go down, we'll be calculating a percentile for them, and those percentiles will be public record"
also need context for these (given history)
MCAS and PARCC have different numbers of achievement levels; MCAS has four, PARCC has five
"not that they're the same things"
"seeing about 50 to 60% of students meeting expectations on PARCC" in grades 3-8 (which is where most of the results are)
in math, about half: (all grades 52%) (80% of eighth graders who took algebra I, BTW)
in high school, numbers are kind of all over the place (?) for PARCC, but no higher than 49%
91% of students in MCAS ELA; 79% in math in MCAS
slide that tries to compare percentages
"what was a 68 on MCAS compares to a 60" in PARCC though Fryer points out that this is not the case
"that's not to say our proficiency went down, but there are differences in meeting expectations"
difference of about 8 points
Fryer: 60 doesn't equal 68; it doesn't work that way
because they're different cut points; math cut the test into deciles
distributions are matching up across the test
Peyser: is there something you could say about the percentage of raw score points you need in one rather than the other?
Lee: somewhat relevant, cut scores 60-70 percent
"when we calibrated our expectations, and we talked to our professors and our teachers"
"47% is a really good score on the PARCC test"
lot of score points on the PARCC test, lot of good information
have .3 correlation with grade point average
Doherty: asks what this means for reading at grade level by grade 3
"how do I interpret that in reference to reading on grade level?"
Lee: probably not your best source
Chester: with PARCC, there was a standard setting process that began at the 11th grade
"what part of the 11th grade test do you need to master by" lower grades to then succeed at a higher college course
PARCC was very deliberate in terms of lining that up, grade by grade
Noyce: "we've talked about grade level for years as if we knew what that meant"
this is a definition: "we talk about these things as if they have some meaning, but they have the meaning that we give them"
Lee: there was a process, they looked at very detailed standards
Morton: for MCAS districts, there were 10% fewer low-income students
Lee: that's why we normalized the sample
Morton: were you able to determine impact of poverty on achievement
Lee: matched first on achievement, then on demographics
Morton: I want to know if PARCC adversely impacts low income students, compared to MCAS
Sagan: "and 'struggling' isn't the word I would use...I would use 'not meeting expectations.'"
Chester: fewer students across the board are getting to those standards
"so this becomes a pretty interesting discussion...does the gap widen or shrink...and" what does that mean
"let's not look for a test that eliminates the gap"
essentially he then talks here of a test that just tests content and thus gives pure results
Lee: rigorous bias review process for items
"strange to say a kid is not proficient one year and is proficient the next without his having a particularly good year"
Lee: compare how kids did on PARCC to how they did on a test we know better: it correlates with 9th grade MCAS
now there's a chart showing that MCAS results and PARCC results correlate at a statistically significant level
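The comparison described here is essentially a correlation check between each student's paired scores on the two tests. A minimal sketch of that kind of check (the score lists below are made-up, illustrative values, not the actual matched-sample data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# illustrative, made-up paired scores for ten students
mcas_scores = [230, 244, 251, 262, 270, 248, 255, 239, 266, 258]
parcc_scores = [701, 718, 726, 740, 748, 720, 731, 710, 744, 735]
r = pearson_r(mcas_scores, parcc_scores)  # close to 1 for these toy data
```

A strong positive r on the real matched sample is what the chart is reporting; it says the two tests rank students similarly, not that their cut scores mean the same thing.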
Wulfson adds a caveat about the relative seriousness with which the test was taken; Willyard (the student rep) protests that he doesn't like the insinuation that any students might not take the test seriously
Peyser: we're comparing a four-level performance scale to a five-level one
Lee: have two scales that work and you've broken them up in different ways
pretty massive gap in Algebra II and 11th grade ELA
Lee: "these are tests that don't correlate as much"
Willyard: would this mean that the cut scores need to be reevaluated?
Chester: they can both be equally correlated but cover very different content
Fryer: text complexity, other is different cut points on a test
"if I were to do the exact same thing, and give it six levels and call it PARCC revised, I could say 'Look, look at the complexity'"
Sagan: but do you want to say that 90% pass? Whatever that number is?
Moriarty: if we go with PARCC, does that mean that there will be three math graduation requirements?
Sagan: change in high school requirement waived for three more years
rules won't change partway through the process
Peyser: if these data are right...
Sagan: "wouldn't it line up with the large numbers of students being put into remediation?"
Noyce: there is a strong movement among other states in the country toward back-mapping math to what you study in college
"we should as a state really not try to tie ourselves to the Calculus track"
and now comparison with NAEP
And now on PARCC and technology (presentation by Kenneth Klau, digital learning)
minimum needed to administer PARCC online:
- internet connection to download test to each computer prior to testing
- sufficient bandwidth (at least 5kbps per student)
- sufficient computers (desktops, laptops, netbooks, thin clients (?), and tablets) with a web browser and an input device (keyboard with mouse or touchpad)
20 day testing window x 3 sessions / day= 60 sessions
each student sits for up to 7 sessions
number of students x 7 sessions = total "seats"
for example: a school with 721 students needs 85 testing seats, and therefore 85 computers, to give tests during the window (721 x 7 / 60 = 84.1..., thus 85)
how many classrooms? 5
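The seat arithmetic above can be sketched directly (the 7-sessions-per-student, 20-day, and 3-sessions-per-day figures are from the presentation; the function name is mine):

```python
import math

def computers_needed(students, sessions_per_student=7,
                     window_days=20, sessions_per_day=3):
    """Minimum number of testing computers ("seats") for a school."""
    total_slots = window_days * sessions_per_day  # 20 x 3 = 60 sessions
    # each seat can host total_slots sessions over the window;
    # round up because a fractional computer doesn't exist
    return math.ceil(students * sessions_per_student / total_slots)

print(computers_needed(721))  # the 721-student example school → 85
```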
Statewide estimated need of $3.1M to $12.3M (depending on device type, with unit costs ranging from $250 to $1000)
- 304 schools need fewer than 30
- 85 need 31-60
- 37 need 61-90
- 13 schools need 91-120
- 3 need 121-150
infrastructure needs of $2.4M, includes broadband costs for schools that do not meet proctor caching requirements
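The $3.1M-$12.3M spread is consistent with roughly 12,300 devices priced between $250 and $1,000 each; the device count here is my back-of-envelope inference, not a figure from the presentation:

```python
def device_cost_range(devices, low_unit=250, high_unit=1000):
    """Statewide device cost at the cheapest and priciest unit prices."""
    return devices * low_unit, devices * high_unit

# ~12,300 devices (inferred): $3,075,000 (~$3.1M) to $12,300,000 ($12.3M)
low, high = device_cost_range(12_300)
```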
Sagan: do you have a map? Yes, but not yet; you'll be able to zoom in and out to the individual level
For technology in general (beyond PARCC):
$90 (something)M for upgrading school technology
$67M to $278M for devices
Wulfson: behind much of the country, as much as we see ourselves as a technological state
what resources are out there? E-rate being the best
districts applying, regardless of testing decisions
Digital Connections Partnership Schools Grant: state bond matching program
sliding scale where state matching funds cover infrastructure; local money can be used for infrastructure, devices, PD, or assistive technology for students
Stewart: processing out loud
had been rethinking MCAS in 2008, but that went the way of RTTT
"not a test per se, but college and career readiness for students"
tension between what is necessary and important for wellbeing and what is necessary for the future economy
don't have a clear vision of what 21st century skills look like
"no test ensures great instruction"
test to inform instruction, and how best to do it
"how do we go back and replan, retool for what we want to do with regards to assessment"
Chester: not at ground zero...and now we're reviewing 20 years of ed reform...
"this is not about starting a new direction; it's about continuing."
Stewart: MA has set the standard at Level 4; Ohio has set it at Level 3. How do we have a common assessment?
Sagan: "as long as you can compare...you can compare if it's the same data"