Ohio students improved proficiency rates on 18 of 21 major state tests this spring, according to preliminary results released this week by the Ohio Department of Education.
The largest increases were on high school course exams in biology (up 9.6 percentage points) and American government (up 8.8). The only decreases in proficiency rates were on third-grade math (down 3.0 percentage points), third-grade English (down 2.1) and high school English 1 (down 1.5).
ODE officials noted, however, that even on the three tests where proficiency declined this year, rates remained higher than they were two years ago.
“The increases are generally a positive as we continue to have stability in tests and standards,” said Chris Woolard, senior executive director for accountability and continuous improvement for ODE. Woolard said he couldn’t pinpoint specific reasons for the increases.
ODE officials emphasized that the spring test data have not yet been verified by districts and are subject to change as first attempts are sorted out from retakes. The state report card, with test score data for individual schools and districts, will be released around Sept. 15.
Other preliminary test results that showed significant proficiency increases were seventh-grade English (up 7.15 percentage points), eighth-grade English (up 6.7), seventh-grade math (up 5.7) and fourth-grade English (up 4.9).
“These results are a general snapshot of how the state’s doing,” Woolard said. “We can’t use that to infer what any individual district’s results are.”
Brian Roget, ODE’s interim director of curriculum and assessment, said schools received their preliminary results in June and are in the process of putting them in their Education Management Information System to find missing records or scores that need adjustment or appeal.
Roget said that for test questions the state releases publicly, schools will receive student-level “item analysis” data connecting individual questions to the education standards they measure. That allows teachers and schools to see which questions their students struggled with and potentially adjust how they teach that subject matter the following year.