ATF: Process Matters

I have previously written about my concern over how task forces, boards, and commissions operate: are they run in ways designed to obtain relatively independent, thorough, and fair study and deliberation of issues before recommendations are crafted, or are they managed to build consensus around, or buy-in for, recommendations and decisions made ahead of time?

When the press only covers the recommendations–the results–of a task force, as it has in the case of the assessment task force, it isn’t necessarily easy to tell which sort of process was followed. Meeting notes are far from being transcripts and have not, in my opinion, adequately captured the quality, intensity, and extent of our discussions.*

Our perception that we were independent, thorough, and fair in our work is not an adequate substitute for the judgment of the public, for whom we work. If it were, open meetings and open records laws would be unnecessary; we could simply assure you that we had acted properly behind closed doors and report out the results.

Here are some of the discussions and decisions that shaped the final recommendations and that might have benefitted from public scrutiny:

  • discussion of DE conflict of interest (Jan. 2015)
  • decision not to review science assessments (Nov. 2014, Jan. 2015, Feb. 2015)
  • decision to open a new RFI after no vendor followed through with providing information on Smarter Balanced assessments under the initial RFI (June 2015)
  • decision to remove the combined ACT/ACT Aspire from further consideration (July 2015, Sept. 2015)
  • deliberations (Nov. 2015, Dec. 2015)

How did we do with our process? Did we get these decisions right? Were our deliberations thorough and fair? Did we raise and discuss the right issues? Does our process matter to your confidence in our recommendations?

Do we need better coverage of, and attendance at, meetings of state-level policymaking groups? What can be done to encourage coverage, especially as newsrooms are being downsized? Is there a role here for bloggers, or for technology, like the broadcast of school board and city council meetings and the live audio- and video-streaming of the Iowa House and Senate?

*Nor have they always been posted in a timely manner; as of today, meeting notes for some meetings as far back as March are not yet posted.


10 thoughts on “ATF: Process Matters”

  1. Mary

    Thank you for this information. Do you know what fund (e.g. the general fund) school districts use to pay for assessments? I was also curious about what conflicts of interest came up. It does seem to me that there are large sums of money pouring into promoting technology in schools.

    My last question is whether there was any concern expressed about adaptive testing. I ask because some of my children, especially when younger, would rush through standardized tests to finish them, and this behavior can result in students missing easy questions and never getting to answer more difficult ones.

    Again, thanks.

      1. Mary

        Thank you for the link, Matt. To some extent, school districts have leeway to determine how to fund assessments; however, a local school district won’t be able to use just any fund, since some funds are restricted to specific purposes, and adoption of the Smarter Balanced assessments could create and drive other expenses in addition to the purchase of the assessments themselves.

        A couple of examples: 1) to the extent the general fund is tapped to pay for assessment costs, there is less money to spend on teachers, and 2) the PPEL fund can be used to pay for equipment (among other things), such as some musical instruments and/or some technology purchases. There ought to be some consideration given to prioritizing what children truly need; e.g., would children be better off if a school purchased additional computers for testing, or would improving children’s access to musical instruments be better?

        I’d like to see all expenses specifically quantified and prioritized against competing expenditures before any decisions are made.

        Again, thanks for the link.

  2. Karen W Post author

    Mary, thanks for commenting.

    The potential conflict of interest we spoke about was that the DE has signed a memorandum of understanding with SBAC, indicating that Iowa would use the Smarter Balanced assessments statewide beginning in 2014-2015. The assessment task force is supposed to be independent of the DE, yet a DE attorney advised the task force at the November 2013 meeting that we were not permitted by law to review science assessments. [Note: if science assessments had been reviewed, the Smarter Balanced assessments likely would not have been considered as they do not offer a science assessment.] This issue ended up being resolved without relying on the DE attorney’s legal interpretation.

    Interestingly, there was not much concern expressed about adaptive testing. The Iowa Testing Programs did address your concern during the first interview, explaining that one reason they prefer a fixed-form assessment is that, in their experience, students can miss relatively easy questions and still correctly answer relatively difficult ones; the fixed form gives students the chance to show whether they can answer those more difficult questions.

  3. Michael Tilley

    Typically, adaptive testing isn’t all that adaptive. It would involve (something like) a 5% variance.

    Furthermore, adaptive tests tend to be most effective at detecting small variations in ability at the very highest and lowest levels (rather than for the great majority of students). They are no less effective at determining mid-level ability, just no more effective.

    My biggest concern about adaptive testing is that you need a much larger item bank. This requirement may result in (a) the assessment group being unable to meet the demand or (b) the quality being watered down in order to meet the demand. In any case, that is one reason for increased costs: it costs more to make many more items.
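    The item-bank point can be made concrete with a toy sketch of a generic computerized-adaptive-testing loop. This is purely illustrative: the half-point step size, the difficulty scale, and the simulated student are made up, and this is not how Smarter Balanced or any particular vendor selects items.

```python
def run_adaptive_test(item_bank, answers_correctly, num_items=10):
    """Minimal adaptive loop: each item is chosen to match the current
    ability estimate, which is nudged up or down after each response.
    Covering every estimate a student might reach is why the item bank
    must be much larger than a single fixed form."""
    ability = 0.0  # start every student at an average estimate
    used = set()
    for _ in range(num_items):
        # pick the unused item whose difficulty is closest to the estimate
        item = min((i for i in range(len(item_bank)) if i not in used),
                   key=lambda i: abs(item_bank[i] - ability))
        used.add(item)
        # step the estimate up on a correct answer, down on a miss
        ability += 0.5 if answers_correctly(item_bank[item]) else -0.5
    return ability

# Simulated student who answers correctly at difficulty 2.0 or below.
bank = [i * 0.25 for i in range(-12, 13)]   # difficulties -3.0 .. 3.0
estimate = run_adaptive_test(bank, lambda difficulty: difficulty <= 2.0)
```

    Because each student’s path wanders across the difficulty scale, the bank has to hold items at every difficulty level a path might visit, which is why adaptive tests need many more items than a single fixed form.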

  4. Mary

    Thank you for the interesting information.

    I have a lot of concerns about Iowa jumping to a new form of testing, including, as you’ve articulated, the higher cost of testing and the likelihood that adopting it will shift educational spending toward bureaucracy rather than the classroom, where it counts. Further, it doesn’t do much good for Iowa to adopt testing that allegedly does a better job of measuring high and low ability levels if the state and local education associations don’t do anything with the results; I don’t see any additional money being spent on gifted programming in the schools, nor have I seen much discussion of how IEPs and accommodations will be handled. It also seems to me that if Iowa adopts computerized testing, we will see increased use of technology in the classroom, and there should be some serious discussion of how this will impact teaching (and funding) prior to adoption. As a practical matter, I still wonder how this will benefit my children.

    I also find concerning the multimillions of dollars the Gates Foundation is pouring into organizations like the Council of Chief State School Officers.

  5. Karen W Post author

    Mary, thanks for your additional (Jan. 7) comments.

    Schools may have to hire additional IT staff as they add new technology. Those costs probably come from the general fund, reducing the money available for hiring teachers and efforts to reduce or maintain class sizes.

    The ICCSD Revenue Purpose Statement passed in Feb. 2013 contains language allowing the SAVE/SILO monies to be used for the purchase, lease or lease-purchase of technology. So that’s another possible funding source.

    FWIW, I was reminded at least several times that many districts don’t offer strings/orchestra programs and are cutting back world language offerings. I agree with Mary that there is an important values/priorities discussion to be had about what is “better” for Iowa’s children, a discussion we aren’t having now because we don’t quantify costs and talk about school budget trade-offs.

  6. Mary

    Thanks Karen. I’d be interested to hear from my own school district, ICCSD, whether it is willing to commit any SAVE/SILO funds or PPEL funds to additional technology and specifically hear how it would fund any anticipated expenses, including staff, and what the opportunity costs to students are. I’d also like to see ICCSD have that discussion about priorities we didn’t see during last year’s budget cuts. I can see these assessments, if adopted, also driving costs other than technology over time. I appreciate your dissent.

    Do you know what grade these smarter balanced assessments would start at, if adopted, and whether the results would arrive in parents’ hands more quickly than the Iowa Assessments?

    1. Karen W Post author

      Starting in the 2016-2017 school year, by state law, grades three through eleven will be tested. Smarter Balanced assessments start at grade three.

      There is a lot of talk about getting assessment results faster with Smarter Balanced, but we didn’t really talk this through with the vendor. It sounds like computer-scored results could be available to teachers (schools?) either immediately after a student completes the test or after the last student tested at the school completes theirs. This would only be a preliminary (partial) score, though, because it would not include the results of the performance task portions, which almost certainly will need to be scored by humans. I think there may be an expectation of a two-week turnaround on scoring the performance task portion. Then, I think, raw scores have to be converted to scaled scores, which are the scores, in addition to achievement levels, that will be reported to parents.

      I’m not sure what purpose is served by releasing preliminary or partial scores and if they would be raw scores or scaled scores. But I have heard (though I did not confirm it with ITP during the interviews) that the delay in reporting Iowa Assessment results to parents relates to the time it takes the district to prepare the tests to be shipped back for scoring, plus whatever work the district does with the results before releasing them to parents; the actual turnaround time for the scoring of the Iowa Assessments is just a few weeks.

      If Matt wants to jump in here, he certainly knows more about how this works in practice (other districts may have a different experience with turnaround time for the Iowa Assessments than what I have heard). I think the advantage of the computer-based administration is to eliminate the work of preparing paper answer sheets to be returned for scoring and the transportation time. That time and work gets shifted, pretest, to the IT staff, who have to prepare computers for testing: securing the computers for test security purposes, setting up accommodations, setting up earbuds/headphones/external keyboards (if needed for tablets/iPads), making sure all devices are charged, and setting up passwords/logins for all students to be tested.

      I’m not sure that any of this changes the delay between districts receiving the scores and releasing score reports to parents.
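      As a hypothetical sketch of that sequence (the point values and the raw-to-scaled conversion table below are invented for illustration; they are not Smarter Balanced specifications): machine-scored items yield a preliminary raw total, the human-scored performance tasks are added later, and only the combined raw score gets converted to the scaled score reported to parents.

```python
def preliminary_score(machine_points):
    """Partial raw score, available as soon as machine scoring finishes."""
    return sum(machine_points)

def final_scaled_score(machine_points, performance_points, raw_to_scaled):
    """Combine machine- and human-scored portions, then convert the
    combined raw score to the scaled score that is actually reported."""
    raw = sum(machine_points) + sum(performance_points)
    return raw_to_scaled[raw]

# Invented conversion table; real tables come from the test vendor.
table = {raw: 200 + 5 * raw for raw in range(0, 21)}

prelim = preliminary_score([1, 0, 1, 1])                 # machine-scored items
final = final_scaled_score([1, 0, 1, 1], [4, 2], table)  # adds performance tasks
```

      The sketch makes the open question visible: the preliminary number is a raw total on a different scale than the final reported score, which is one reason its usefulness to parents is unclear.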

      1. Matt T.

        Current Iowa Assessment turnaround time:
        Our school usually has a week in which all students take the Iowa Assessments together. Following that week, students absent for one or more assessments are usually caught up as quickly as possible, though this can take a week (or two) depending on the length of absences, extended illnesses, etc. Adequate Yearly Progress (AYP) requirements from No Child Left Behind (NCLB) include holding districts accountable for assessment participation rates, so ensuring that as many student assessments as possible are submitted for scoring becomes important. Finally, all materials are collected and counted before being sent in to Iowa Testing for scoring.

        Our district usually tests the second week of February, and if we are efficient at the district level, we can have results back by the second or third week of March. (There was one year, the first year of the current assessment form, in which Iowa Testing took much longer than usual to score the assessments.) We then do a quick check for accuracy and load the data into our computer system before releasing the individual handouts to students/parents. From start to finish, I think it’s reasonable for us to see a six-week turnaround from students completing the Iowa Assessments. All this may be different for larger districts.

        Karen said, “I think the advantage of the computer-based administration is to eliminate the work of preparing paper answer sheets to be returned for scoring and the transportation time.” This seems accurate to me. Based on other online assessments districts may already be using (e.g., NWEA MAP, FAST, EasyCBM), it is also reasonable to assume there will be some IT prep time. I wonder how the IT prep for the Smarter Balanced assessments would be similar to or different from the prep IT staff are already doing for other online assessments currently in place. Districts that piloted SBAC last school year may be able to provide insight (my district did not participate in the pilot).

        Karen also said, “I’m not sure what purpose is served by releasing preliminary or partial scores and if they would be raw scores or scaled scores.” I agree with this statement.
