Tech readiness: what do we know about Iowa’s readiness for statewide computer-adaptive testing and does it matter?
Little information has been made publicly available about Iowa’s tech readiness for statewide administration of computer-adaptive testing. Whether that is due to a failure to collect the information, a failure of transparency, or some of each is not clear. If anyone can provide answers or more details, comments, as always, are welcome.
One reason information on tech readiness matters is that switching over to statewide, online assessments is a huge undertaking. Wyoming’s experience in 2010 ought to be keeping Iowa proponents of Smarter Balanced Assessments up at night. The short version: online testing went so poorly that the state superintendent of education was voted out of office, the vendor (Pearson) was sued, and the state returned to paper-and-pencil tests.
Other stories of problems with online testing aren’t hard to find. See “Minn. Computer Crash Halts State Math Test” or “Online Testing Suffers Setbacks in Multiple States” (detailing problems with online testing in Indiana, Kentucky, Minnesota, and Oklahoma).
Conversely, Virginia apparently had few problems moving to online testing, after spending $650 million and taking six years to roll it out. [Note that we are now less than one year out from the promised statewide use of the Smarter Balanced Assessments in Iowa.]
Planning for tech readiness obviously requires information about what you have now and what will be required for successful administration of the tests. To that end, PARCC and Smarter Balanced Assessments put together a Tech Readiness Tool to collect information on device indicators (meeting minimum or recommended requirements for testing), device to test-taker indicators, network indicators, and staff and personnel indicators to assist in planning for technology upgrades, test scheduling, and training for test administrators and tech support staff. In addition, SBAC offers a tech readiness calculator to help schools determine how many days of testing they will need to plan for given the number of students, computers, hours of computer availability, and bandwidth.
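SBAC’s actual calculator is not reproduced here, but the underlying arithmetic is straightforward. The sketch below is a hypothetical illustration of that kind of estimate (the function name, the session-length parameter, and the example figures are all assumptions, not SBAC’s formula): students must be tested in rounds limited by the number of available computers, and the number of rounds per day is limited by how many hours the computers are available.

```python
import math

def estimated_testing_days(students, computers, hours_per_day, session_hours):
    """Rough estimate of how many school days a testing window must cover.

    students      -- number of students to be tested
    computers     -- devices meeting minimum requirements
    hours_per_day -- hours per day those devices are available for testing
    session_hours -- hours one student occupies a device (assumed value)
    """
    sessions_needed = math.ceil(students / computers)            # rounds of testing
    sessions_per_day = max(1, int(hours_per_day // session_hours))
    return math.ceil(sessions_needed / sessions_per_day)

# Hypothetical school: 600 students, 120 computers, devices free 6 hours/day,
# 2-hour sessions -> 5 rounds of testing at 3 rounds/day = 2 testing days.
print(estimated_testing_days(600, 120, 6, 2))
```

Even this simplified version makes the planning problem visible: halve the number of working computers and the required testing days double, which is why the device and bandwidth inventory matters before the window is scheduled.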
After completing a tech readiness assessment, Oklahoma determined they could not adequately prepare and chose to opt out of PARCC exams.
If Iowa has used the SBAC Tech Readiness Tool, that information and the results have not been made public.
Iowa is making public efforts focused on improving bandwidth capacity throughout the state. See comments by Director Brad Buck starting at 29:33 on #IAedchat on March 23rd (video below–other comments on Smarter Balanced Assessment field tests and the work of the Assessment Task Force start at 24:21–note also comments from Director Buck that he is unaware of efforts to determine whether Iowa schools have enough computers for testing).
Iowa conducted a SchoolSpeedTest earlier this year to gather information to “plan for future increases in access and support our planning for the next-generation assessments to be launched in 2014-15.”
The Iowa Governor’s STEM Advisory Council is also working on this issue and two broadband bills were filed this session (HF 2329 “Connect Every Iowan Act” and SF 2324 “Statewide Broadband Expansion Act”). While both of these bills still appear to be in committee (House Ways and Means and Senate Appropriations, respectively), I believe bills in either of these committees are not subject to funnel deadlines and could still be taken up this session. However, it seems unlikely that major infrastructure upgrades, if needed to improve access and capacity, would all be completed prior to next March (more on this later).
Of course, tech readiness is more complicated than just bandwidth. Schools must have enough computers, meeting minimum or recommended requirements, to test students during the testing window, and schools must have adequate wiring to run all those computers (see “rural district decided to charge all its wheeled carts of laptop computers overnight, overloading the electrical circuits and shutting off heat in all its buildings.”) In addition, as noted in the stories linked above, vendor server capacity matters on one end of the connection and district server/network capacity on the other. The following illustration is courtesy of AEA 267:
The 2013 Condition of Education Report (p. 109) shows the following information for bandwidth available in Iowa schools in 2012-13:
The STEM Advisory Council is recommending a short-term goal of 0.5 Mb per student, with a long-term goal of 1 Mb per student. Iowa schools appear to be falling well short of these goals–150 schools have 10 Mb or less of bandwidth, with seven having no access to the internet at all–but if we look at bandwidth requirements for streaming video (again courtesy of AEA 267), it is easy to understand the STEM Advisory Council recommendations.
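The per-student goals translate directly into a capacity target for any given school. As a minimal sketch (the enrollment and current-capacity figures below are made up for illustration, not drawn from Iowa data):

```python
def bandwidth_gap_mb(enrollment, current_mb, per_student_mb=0.5):
    """Additional Mb needed to meet a per-student bandwidth goal (0 if already met)."""
    needed = enrollment * per_student_mb
    return max(0.0, needed - current_mb)

# A hypothetical 400-student school on a 10 Mb connection needs
# 400 * 0.5 = 200 Mb to hit the short-term goal, a 190 Mb shortfall:
print(bandwidth_gap_mb(400, 10))        # short-term goal, 0.5 Mb/student
print(bandwidth_gap_mb(400, 10, 1.0))   # long-term goal, 1 Mb/student
```

Against numbers like these, the 150 schools at 10 Mb or less are not marginally short of the goal; they are an order of magnitude away from it.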
For schools with low bandwidth capacity, instructional use of the internet will have to be limited or eliminated during testing. Of course, for schools working with limited hardware, instructional use of computers might have to be limited or eliminated during testing anyway.
Why else does it matter?
We have already seen pushback on the Common Core in Iowa. Imagine how popular it will be if there are major problems with the Common Core assessment rollout. A high-profile, comprehensive tech readiness plan might be reassuring (see New Hampshire’s website, for example) and minimize the likelihood of major problems occurring.
In addition, the DE can’t ask the Iowa Legislature to appropriate funds for tech readiness if they don’t know what is needed, leaving schools to fund needed tech readiness upgrades out of existing school budgets that are already stretched thin. This might be much less of a concern for schools ahead of the curve tech-wise, but ought to be a concern nonetheless. Diverting local funds from the classroom or needed facility maintenance or upgrades will hardly make the Common Core and the accompanying standardized assessments more popular.
A paper-and-pencil option will be offered until 2016-17 for schools not currently tech ready, but this option raises comparability and equity issues.
“With an adaptive test, you see right away what questions a kid needs,” said Lauress L. Wise, a principal scientist with the Monterey, Calif.-based Human Resources Research Organization, which has performed quality assurance and evaluation on testing systems such as the National Assessment of Educational Progress. “With paper and pencil, you’d have to offer a lot more questions—a longer test—to make it comparable to that. If you can’t do that, you won’t be measuring the end points [of achievement] as well.”
[SBAC executive director Joe] Willhoft acknowledged that the paper version of the Smarter Balanced test will be “less precise, with a larger measurement error” at those points in the spectrum.
These assessments will be used to compare Iowa schools to each other, with high stakes attached to the outcomes. As a matter of basic fairness, all Iowa students should be taking these assessments under essentially the same conditions.
Update: How much variability in testing conditions can occur before it affects the validity and reliability of the results? Rick Hess asks three questions in this regard that he says haven’t been answered:
How will we compare the results of students who take the assessment using a variety of different devices? There will be variability in screen sizes, keyboards, and potentially in the visual display. Some students will be using certain kinds of devices for the first time. And many states will be administering tests to some number of students using paper and pencil in 2015, and likely beyond. What do we know about how to account for all this variation in order to produce valid, reliable results?
While there are always questions about consistency of testing conditions, these get super-sized when the stakes climb and variation is non-random. Well, limited access to the required devices means that all the usual questions get accentuated. How will PARCC and SBAC account for vastly different testing conditions? Depending on testing infrastructure, some schools will be able to assess students in their regular classroom while other schools will have to shuffle students around the building, to schools across town, or to independent testing centers. How much does this matter? What do we know about how to track and then account for the impact of such factors on outcomes?
How will we account for the fact that we’re apparently looking at testing windows that will stretch over four or more weeks? Students in schools which administer the test towards the end of the testing window will have had a lot more instructional time than students in schools which test at the beginning. The variation could be 10 percent of the instructional year, or more. How is this going to be tracked and accounted for when comparing teachers, schools, programs, and vendors?
Field test side note: SBAC has released estimated times to complete the assessments based on pilot testing, with the caveat that the tests are not timed, so students may take more or less time to complete them. Smarter Balanced Assessments cover just mathematics and English language arts, so the time to administer the science assessments required by Iowa law would be in addition to these estimated times: