ATF: Apples to Apples

On Wednesday, members of the Assessment Task Force appeared before the House Education Committee for a question-and-answer session following up on the task force presentation the week before. Unfortunately, I was unable to attend.

However, in preparation for this appearance, the task force created an additional document comparing assessment options and I do have a copy of this to share. Find the first page (overview) here and the remaining pages (comparison table) here.

I want to focus my comments here on the portion of the table comparing the costs of the first year of an assessment system based on either the NGIA or the Smarter Balanced assessments as the summative (end of year/accountability) assessment.

This table illustrates that the Smarter Balanced system of assessments is less expensive than an NGIA-based assessment system, even though the Smarter Balanced assessments have a higher per-student cost than the NGIA.

I think the following unstated assumptions are baked into those numbers:

  1. Iowa’s vendor will match Connecticut’s vendor’s pricing for vendor services.
  2. Technology costs are zero (either no district or state entity will require technology upgrades to administer statewide online assessments or it is a district decision to upgrade or purchase additional technology not properly attributable to the decision to choose an assessment requiring online administration).
  3. Costs of science assessments (summative/accountability plus multiple measure) are zero.
  4. Districts will either stop administering all of those other multiple-measures assessments that they have selected for district use, or the costs of those assessments shouldn’t be charged to the “assessment system” if they don’t.

I think this table hints at the difficulty we have had creating a completely fair, apples-to-apples cost comparison of the two assessments. One requires technology; one doesn’t. One has an additional cost for a paper-and-pencil option (though perhaps not many schools will use it?); one doesn’t. One has a science assessment; one doesn’t. One has a suite of assessment options; one doesn’t. The predecessor to one has additional reporting costs paid by the state that are not included in the per-student cost; one doesn’t.

Should we be comparing costs of summative assessments only or assessment systems and why? Which costs should be counted and why? Should technology costs have been included or is it just something districts should be doing anyway?


2 thoughts on “ATF: Apples to Apples”

  1. Mary Murphy

    There is an even bigger issue here that goes beyond comparing/contrasting the two choices as framed in the attached document. Smarter Balanced Assessments seem more likely to drive increased spending by districts due to curricula changes and especially changes in delivery of curricula than the next version of the Iowa Assessments. This is where the big money will be spent with SBA since the adoption of SBA would drive some or perhaps all Iowa districts to spend mega money on curricula materials and technology to run the new curricula materials that will be marketed as 1) better meeting the individual needs of students and 2) helping all students perform better on the SBA.

    The former (#1) won’t help students that test at the high end because standardized testing doesn’t reach this end, and even if it did, it would not matter because Iowa doesn’t spend much on talented and gifted programming and this isn’t likely to change. The same would be true for students who test at the low end (and I believe you pointed this out in an earlier post).

    The latter (#2) is no reason at all to move to a new assessment.

    I am also not a fan of having my children’s test data leave the state of Iowa (whether disguised in the aggregate or not) as the data collected will be used freely to develop educational materials for sale.

    1. Karen W (post author)

      Thanks for your comments, Mary.

      I agree that there is a danger that performance on the assessment will become the overriding focus, with alignment of interim and formative assessments leading to purchases of curricula and teaching materials better aligned to the assessments. We might see fewer questions like “Which math textbook series best helps our students learn elementary math and prepares them for learning algebra?” and more questions like “Which math textbook series best prepares our students for the assessments?”

      I have seen the student data question pop up several times so I think at least a few legislators share your concern.
