ATF: Legislative Questions

I blogged previously about questions legislators asked during the Assessment Task Force presentations to the Senate and House education committees. Prior to the education committee presentations, legislators submitted twenty-six written questions about the Assessment Task Force recommendations. The questions–along with the task force answers–are available here.

Under question one (about technology needs) the task force response notes:

Technical requirements for the Measures of Academic Progress (MAP) testing are similar to those for the Smarter Balanced assessments. Many schools in Iowa have years of experience using this platform to assess students (over 60% of districts sampled on a recent survey on assessment practices indicated they used the MAP assessments). We can learn from them.

The task force references MAP testing for a few different purposes in this document (technology requirements, cost comparison). I should note that the task force as a whole never reviewed the MAP tests–or the other assessments used for multiple measures purposes–or compared them to the SBAC interim assessments, though undoubtedly many of the other task force members are familiar with these assessments through their work as educators.

Part of the technical and logistical challenge of the Smarter Balanced assessments is their length: every student in grades three through eleven must be scheduled for 7 to 8.5 hours of testing within the fixed testing window. The MAP assessments are much shorter, running 20-30 minutes for a survey test or 60-75 minutes for a goals survey test per subject area.
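
For a rough side-by-side sense of the time involved, here is a quick sketch using the figures above; the assumption that MAP would be given in two subject areas (math and reading, as in the cost discussion below) is mine.

```python
# Rough per-student testing-time comparison using the figures cited above.
# Assumption (mine): MAP is given in two subject areas, math and reading.
sbac_hours = (7.0, 8.5)        # Smarter Balanced total per student
map_survey_min = (20, 30)      # MAP survey test, per subject
map_goals_min = (60, 75)       # MAP goals survey test, per subject
subjects = 2

map_survey_hours = [m * subjects / 60 for m in map_survey_min]
map_goals_hours = [m * subjects / 60 for m in map_goals_min]

print(f"SBAC: {sbac_hours[0]}-{sbac_hours[1]} hours")
print(f"MAP survey, {subjects} subjects: {map_survey_hours[0]:.1f}-{map_survey_hours[1]:.1f} hours")
print(f"MAP goals survey, {subjects} subjects: {map_goals_hours[0]:.1f}-{map_goals_hours[1]:.1f} hours")
```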

For question two (in-school bandwidth issues) the task force suggests a paper/pencil option or busing students to another location as options for schools with insufficient bandwidth. It should be noted that the paper/pencil format will cost Washington schools an additional six dollars per subject test, or twelve dollars total per student. As for busing, it adds to the loss of instructional time and would have to be repeated across multiple testing sessions, since the Smarter Balanced assessments are far too long for students to take in one sitting.

Under question six (about possible savings if the full suite of SBAC assessments is adopted) the task force provides costs for the MAP tests. Districts can use MAP tests to meet the multiple measures requirements for math and reading ($12.50 per student) plus science (an additional $2.50 per student). This means the assessment requirements could be met for $30 per student (NGIA at $15 plus MAP with science at $15). To match that cost with SBAC, school districts would need to obtain two science assessments for a total of $2.70 per student, which frankly seems unlikely.
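
To make that arithmetic explicit, here is a quick sketch of the comparison. The $27.30 full-suite SBAC figure is not stated directly here; I am backing it out from the $30 target minus the $2.70 said to be left over for science, so treat it as an inference rather than a quoted price.

```python
# Per-student cost comparison implied by the task force figures above.
ngia = 15.00                # NGIA: math, reading, and science
map_math_reading = 12.50    # MAP: math and reading
map_science = 2.50          # MAP: science add-on

map_option = ngia + map_math_reading + map_science   # $30.00 per student

# Inferred, not quoted: $30.00 minus the $2.70 said to remain for science.
sbac_full_suite = 27.30
room_for_science = map_option - sbac_full_suite      # ~$2.70 for two science tests

print(f"NGIA + MAP option: ${map_option:.2f} per student")
print(f"Left over for two science assessments with the SBAC full suite: ${room_for_science:.2f}")
```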

Under question nine (about measuring growth for students at very low and very high levels of performance) the task force notes that, while the NGIA will reportedly include grade-level items only:

The Smarter Balanced Assessments utilize an adaptive testing model that measures on a broad continuum of skills. The scaling allows interpretation of growth against that broad continuum. Thus, a student performing at a very low level will receive test items measuring their proximal level of development. As the student improves over time, more difficult items will be delivered, with all results on the same scale, allowing interpretation of student growth. A student performing well above grade level will have a similar experience, but with items more in line with his/her advanced skills.

The task force’s understanding of the promise of computer-adaptive testing for the lowest and highest performing students–which I shared–seems to be somewhat at odds with the explanation provided in a new white paper commissioned by SBAC, “Making Good Use of New Assessments: Interpreting and Using Scores From the Smarter Balanced Assessments Consortium,” dated March 2015:

Although Smarter Balanced uses computer adaptive testing, and although test scores will be arrayed on a vertical scale, the tests will not measure the full continuum of achievement along which students actually fall. As noted above, Federal rules require that students be tested with items associated with grade level content and skills and most students will encounter only those items. Based on their performance on the bulk of the test, the CAT will direct some students to items just above or below grade level. However, the test will not measure the actual skill levels of students whose achievement is far above or far below their grade levels, since they will not encounter many items that accurately measure their actual levels of achievement. Thus, while the test will accurately describe the student’s knowledge of grade level and near–grade level content, the test will not be as sensitive a measure of growth for these students.

As an example, an 8th grade math teacher whose students are functioning with 4th grade skills will have to teach requisite computational skills while also seeking to teach algebraic concepts. Given the design parameters associated with Smarter Balanced assessments, gains that students make in these foundational computational skills will not be specifically measured on the tests, because students will not encounter items that would evaluate their growth of these specific skills. The growth metric that could be produced from Smarter Balanced tests will only provide information about the extent to which students’ scores changed within the region of their grade level, not a full measure of their starting and ending skills at two points in time.

In other words, the Smarter Balanced assessments may not be as different from the Iowa Assessments with regard to above- and below-grade-level items as we–or at least I–thought they might be.

Under question ten, the task force declines to choose between adequate funding for foundation support and funding for the assessments. As the sole dissenter, I will speak for myself and say that this isn’t even a close question for me. Adequate funding for instructional programming is a higher priority than funding assessments that are more expensive and time-consuming than necessary to meet assessment requirements.

Under question fourteen, the task force suggests that the additional hours spent on SBAC assessments would have an insignificant impact on instructional time. I’ll note just a few things here: 1) no one is actually talking about adding 1.5 minutes of testing per day, 2) either the full-length interim assessment or the practice test is likely to be given so that students have an opportunity to become familiar with the test format and software plus the performance task items–so you can at least double that extra four hours, and 3) students would still need to take two additional science assessments of unknown duration.
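
A quick back-of-the-envelope calculation shows how an extra four hours turns into a minutes-per-day talking point, and why doubling it matters. The 160-day divisor is my assumption; it is simply the figure that reproduces the 1.5-minutes-per-day framing.

```python
# Back-of-the-envelope: spreading extra testing time across a school year.
# Assumptions (mine): roughly four extra hours of testing, and a 160-day
# divisor, which is what reproduces the cited 1.5-minutes-per-day figure.
extra_hours = 4.0
school_days = 160

per_day = extra_hours * 60 / school_days   # 1.5 minutes per day
with_practice = per_day * 2                # add a full-length practice/interim test

print(f"{per_day:.1f} minutes per day spread over {school_days} days")
print(f"{with_practice:.1f} minutes per day if a full-length practice or interim test is added")
```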

Under question twenty-six, the task force does not provide information on the experience other states have had with the roll-out of new exams. If you are a regular reader, you will know that states already using statewide online assessments are having an easier time rolling out the new assessments than states that aren’t, that states and districts have been spending a lot of money on technology to prepare for online assessments, and that even so they are encountering a variety of technology issues large and small.

This isn’t necessarily an argument against adopting a statewide online assessment, but it is an argument that a move to statewide online assessments should be made with our eyes open, with a comprehensive statewide technology plan aimed at ensuring all schools can build and maintain the technology infrastructure needed for an equitable testing experience for every student, and with adequate funding to pay for it all without instructional cuts.
