Chicago Public Schools quietly changed some growth scores from standardized test results released in August, resulting in a rise in school ratings for seven charter schools and a dip for nine others, the district has acknowledged.

The changes touched growth scores for many of the district’s charter schools but fewer than 20 CPS-operated schools that took the Northwest Evaluation Association’s Measures of Academic Progress in 2013. The district never informed the public of the changes, nor noted them online where the data is kept.

AAPPLE, an activist arm of the Chicago Principals and Administrators Association, pointed out the discrepancies after school ratings were released last week — differences the Sun-Times confirmed in its own analysis.

The ratings marked the first time that all schools were supposed to be evaluated using the same standards — the same tests and measures for all schools, whether run by the district, by a turnaround company or by a charter operator.

Following a meeting Tuesday with CPS officials, Chicago Teachers Union representatives and the head of the principals association, AAPPLE head Troy LaRaviere called for a full investigation by the district’s inspector general.

“In a system based on ‘choice,’ parents and other stakeholders must be provided with accurate indicators of school quality. [CPS’ ratings system] cannot serve this purpose if there are clouds of suspicion about tampering with the data used to determine these ratings,” LaRaviere said in an email.

Furthermore, he said, “the changing of scores happened without any public disclosure.” CPS would not say why the ratings, usually released in October around school report card pickup, were delayed.

In August, CPS released NWEA scores, including growth, which measures the difference between the current test and a previous test to see how a student has progressed. Growth scores factor heavily in a school’s rating, affecting 45 percent of the total score that determines an elementary school’s rating from Level 1+ down to Level 3, and 30 percent for high schools.
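As an illustrative sketch of how those weights might combine (the function, the 0–100 scale, and the treatment of the non-growth components are assumptions for illustration, not CPS’ actual rating formula):

```python
# Illustrative sketch only: CPS' actual rating formula is more complex.
# Only the growth weights (45% elementary, 30% high school) come from
# the article; everything else here is a hypothetical simplification.

def composite_score(growth, other_components, is_high_school=False):
    """Combine a school's growth score with its other rating components.

    Assumes growth and each value in other_components sit on the same
    0-100 scale, and that the remaining weight is split evenly across
    the other components (an assumption, not district policy).
    """
    growth_weight = 0.30 if is_high_school else 0.45
    per_other = (1.0 - growth_weight) / len(other_components)
    return growth * growth_weight + sum(v * per_other for v in other_components)

# Example: an elementary school with a growth score of 80 and two other
# components scoring 60 and 70:
#   80*0.45 + 60*0.275 + 70*0.275 = 71.75
elementary_total = composite_score(80, [60, 70])
```

Because growth carries nearly half an elementary school’s total, even a modest adjustment to a school’s growth score can move its rating across a level boundary.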

CPS accountability chief John Barker said adjustments were made to “equalize” the scores of schools that either took the test in the fall of 2013 instead of the spring, or took a version of the NWEA MAP test that year different from the Common Core-aligned version taken by the bulk of district-operated schools.

So, Barker said, CPS put the differing schools’ scores through the same calculation in order to put every school on the same metric, as the new accountability system intended.

“There’s nothing nefarious here,” Barker said. “In terms of us wanting to get it right, it took time for us to sort through those issues.”

The affected schools were notified, Barker said. He would not say why the public wasn’t informed. He said the problem won’t repeat now that every public school took the same version of the MAP in spring 2014 and will again in spring 2015.

LaRaviere said the district shouldn’t have changed the data but instead should have flagged the ratings of schools that weren’t on the same testing timeline, indicating the difference.

The chief of strategy for Chicago International Charter School, which saw ratings for four of its schools rise, agreed.

“I would have preferred the asterisk. We essentially asked for an asterisk at one point,” Daniel Anello said.

CICS had always used the version of the test that wasn’t aligned to the Common Core throughout the school year to measure its students’ learning, he said, so it saw no reason to switch while that version remained an option.

Anello said when CICS and other charters told CPS they didn’t take the same test, “CPS ultimately opted to do the back calculations.”

He said the district took its overall average for how much kids drop off over the summer, and applied that average to each school.

“If anything, it probably hurt charters mathematically because the charters have typically longer school years,” he said, and therefore a shorter summer in which to forget what they learned.
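A minimal sketch of the kind of adjustment Anello describes (the function name, the flat per-point adjustment, and the example numbers are assumptions for illustration; CPS has not published its actual back calculation):

```python
# Hypothetical sketch of a "back calculation": converting a fall score
# into an estimated spring-equivalent score by adding back the
# district-wide average points students typically lose over the summer.
# This is an assumed simplification, not CPS' published method.

def spring_equivalent(fall_score, avg_summer_loss):
    """Estimate the prior spring's score from a fall score, assuming the
    student lost avg_summer_loss points over the summer."""
    return fall_score + avg_summer_loss

# Example: a school tested in fall 2013 averaging 210; with an assumed
# district-wide average summer loss of 3 points, the estimated
# spring-equivalent score is 213.
adjusted = spring_equivalent(210, 3)
```

Applying one district-wide average to every school is what Anello objects to: a school whose students actually lose fewer points over a shorter summer gets the same flat adjustment as everyone else.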
