Update: the web application showing student proficiency and college and career readiness growth (CCR growth) by school or by district is now available. The “more information” links aren’t working for me, but here’s an alternate link to FAQs and to the DE news update page.
Or at least, so says my Twitter feed:
Details are scarce on the DE website for now, though the January 2015 School Leader Update (page 6) indicates a website will be activated this month that will provide data on academic growth and student proficiency rates for Iowa public schools and districts. These are just two of the nine metrics required to be part of Iowa’s attendance center ranking system (Iowa ACR) by HF 215, the major education reform bill passed in 2013. A full report will not be released until October 2015, and the January 2015 SAI Report indicates that work to determine the relative weightings of the criteria for purposes of calculating final scores is ongoing.
The DE published its Attendance Center Performance Ranking Legislative Report last July, listing and describing the following required nine metrics:
Student proficiency: the DE proposes using the NCLB proficiency calculations based on Iowa Assessments scores, currently in use for federal reporting, for this metric.
Student academic growth: the DE proposes to calculate a growth target for each student, based on the previous year’s score, that would put the student on track to earn a college ready cut score in grade twelve. Lower-scoring students would have larger growth targets (they need to gain more points per year) to reach the college ready cut score; students already earning college ready cut scores would have a growth goal of “the annual increase in observed growth at the 50th percentile for the student’s current grade” (page 11). The metric would be the percent of students in the school building meeting their individual growth targets. This would seem to disadvantage schools that produce a full year’s growth or more for lower-performing students when that growth nonetheless falls short of the (perhaps very large) annual gains required to reach the college ready cut score by grade twelve. It would also seem to fail to distinguish between schools whose lower-performing students make very little growth and schools whose lower-performing students make at least a year’s growth: both could have the same percent meeting the standard, but one arguably is doing a much better job than the other.
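To make the mechanics of the proposal concrete, here is a rough sketch of the described calculation. The function names, scores, and numbers are illustrative only; the DE report describes the idea, not this code.

```python
def growth_target(prev_score, college_ready_cut, years_remaining, median_growth):
    """Points per year a student needs to gain to reach the college ready
    cut score by grade twelve (illustrative sketch of the DE's description).

    Students already at or above the cut score get the 50th-percentile
    observed annual growth for their grade as their target instead.
    """
    if prev_score >= college_ready_cut:
        return median_growth
    return (college_ready_cut - prev_score) / years_remaining

def percent_meeting_targets(students):
    """students: list of (actual_growth, target) pairs for one building.

    The proposed metric is the percent of students meeting their targets.
    """
    met = sum(1 for actual, target in students if actual >= target)
    return 100.0 * met / len(students)
```

Note how the sketch reflects the concern above: a student 60 points below the cut with four years remaining needs 15 points per year, so a school producing a solid 10-point year for that student still counts as “not met,” exactly the same as a school producing 2 points.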
From page 11 of the report: “[T]he work group determined that the proposed model using the trajectory toward post-secondary success was rigorous, attainable, and meaningfully aligned with the State Board of Education’s goal that ‘Individuals will pursue post-secondary education to drive economic success.’” Considering that only 11% of grade eleven students are expected to earn Smarter Balanced college ready cut scores (Level 4) based on field test results, we may have to settle for the idea that two out of three ain’t bad (rigorous and meaningfully aligned, but not realistically attainable). Or hope that the work group has rethought this one in the intervening months.
Graduation rates: the DE proposes to calculate four-year, five-year, six-year, and seven-year graduation rates, using the highest of the rates for ranking purposes.
Attendance rates: student days present divided by student days enrolled.
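These two metrics are straightforward ratios. A minimal sketch of the described calculations (function and variable names are mine, not the DE’s):

```python
def graduation_rate_for_ranking(four_yr, five_yr, six_yr, seven_yr):
    """The DE proposes using whichever cohort graduation rate is highest."""
    return max(four_yr, five_yr, six_yr, seven_yr)

def attendance_rate(days_present, days_enrolled):
    """Student days present divided by student days enrolled."""
    return days_present / days_enrolled
```

Taking the highest of the four cohort rates means a building is never penalized for students who take longer than four years to finish.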
Parent involvement: the DE proposes to survey school staff to collect this information. If parents are not also surveyed, it is hard to see why all schools wouldn’t earn full points on this metric (hint: answer that you strongly agree with/regularly do all of these things). The report discusses surveying parents for a parent engagement metric and a parent satisfaction metric, but later recommends against adding optional metrics. We will have to wait and see on this one.
Employee turnover: number of licensed staff members (including administrators, counselors, etc.) employed the previous year and still working in the building in the current year divided by the total number of licensed staff members working in the building in the current year.
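As described, this is a simple retention ratio; a sketch (names are mine):

```python
def retention_rate(returning_staff, current_staff_total):
    """Licensed staff employed last year and still in the building this year,
    divided by total licensed staff in the building this year
    (illustrative sketch of the ratio described in the report)."""
    return returning_staff / current_staff_total
```

One quirk of the described formula: because the denominator is this year’s head count, a building that adds staff will see its ratio fall even if nobody left.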
Community activities and involvement: the work team was still wrestling with what might be included in this metric; we’ll have to wait and see.
Closing gap score: from page 20, the DE proposed that a single super subgroup “consisting of the students who are identified as one or more of IEP, ELL, and FRL will be evaluated.” “[T]he percent of the single supergroup in the general population will be compared to the percent of that supergroup’s representation among the proficient students.” Then, this gap score would be compared to the previous year’s score to determine whether gaps are changing for the better or for the worse.
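The report doesn’t spell out the arithmetic of the “comparison,” so the sketch below assumes a simple percentage-point difference; that choice, and all the names, are mine:

```python
def gap_score(supergroup_pct_overall, supergroup_pct_of_proficient):
    """Gap between the supergroup's share of all students and its share of
    proficient students (assumes a percentage-point difference; the report
    does not specify difference versus ratio)."""
    return supergroup_pct_overall - supergroup_pct_of_proficient

def gap_trend(current_gap, previous_gap):
    """Compare this year's gap score to last year's: a shrinking gap
    counts as improvement."""
    return "improving" if current_gap < previous_gap else "worsening or flat"
```

For example, if IEP/ELL/FRL students are 40% of a building but only 30% of its proficient students, the gap score would be 10 points, and the building would be judged on whether that figure shrinks year over year.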
College-readiness rates: the DE proposed using Iowa Assessments scores linked to predicted ACT college-readiness cut scores of 22 in reading and 22 in mathematics. This could change depending upon changes to the accountability assessments.
The work team also considered optional indicators including post-graduation data, suspension and expulsion rates, level of student engagement, parent satisfaction, parent engagement, and staff working conditions. As of July 2014, the work team recommended not including these optional indicators in the Iowa ACR but planned to work with stakeholders to consider whether additional indicators should be included. On the upside, there doesn’t seem to be any AP/PSEO participation metric in the rankings, at least not yet. On the downside, it isn’t at all clear how any of this will help schools improve, but at least those of us whose children attend schools at the top of the rankings will have something to celebrate on Facebook. At any rate, it will be interesting to see how much has changed since last July when the full reports are released to the public in October.