Getting an "A" is good. Getting an "F" is bad.
Unless the grades are for school systems in Ohio in the fall of 2013. For now, they're potentially meaningless.
Local schools didn't fare too well under the state's new grading system, which replaced the rating system that had been in use for several years. Under the previous system, schools were rated with labels such as excellent, continuous improvement or needing improvement. Those ratings, state officials contend, were not specific enough and not as easy to understand as letter grades.
So, new indexes were developed to rate schools on new parameters, not just the number of students who become proficient in certain educational areas. The new system factors out the dependence on attendance and focuses on academic growth from year to year.
"We feel that's easier to understand," John Charlton, spokesman for the Ohio Department of Education, said. "People understand what getting an 'A' means, they understand what an 'F' means, so the ratings are more understandable."
Therein lies the problem. With the system just debuting, Ohio's 655 school districts didn't have time to focus on what the new statistics would mean. As a result, some numbers don't look good, even though those schools had been meeting the state's former rating standards quite well.
State school officials contend the new system raises the bar. But it will be a year or two before the new state report cards give a real measure of how good a job schools are doing educating students.
Charlton, though, said there is no overall rating this year. That won't come until 2015, when state officials figure everyone's had time to adjust to the new system "and focus their efforts on being successful in all areas that are being measured."
State Superintendent Richard Ross agreed comparisons are difficult without an overall grade, which is simply not available at this time.
"(The public) needs to understand that if a school or district gets a lower grade than expected, that doesn't necessarily mean students got a poorer education there than they did the year before," he told the Associated Press recently. "But what it does mean is that the school and district will have to work to meet new, higher expectations."
This year's report cards still grade so-called performance indicators, which measure how many kids meet minimum proficiency levels. "It only measures kids who are proficient," he said.
There's also a graded performance index that measures the achievement of every student, not just whether they reach "proficient," and awards points to the school district. The better students do, the more points their school receives.
And there's a progress section that analyzes data to gauge what kind of job schools are doing. Because it compares up to three years of data, schools aren't penalized if students do badly in any particular year, the theory being that there may be a "great deal of academic growth taking place moving students toward academic success."
And that's all well and good, except we again contend that making schools teach simply to achieve statistical goals fails to account for the individuality of the student, the chemistry between student and teacher, and the size of the district.
Take districts like Youngstown and Warren. They could hire the best teachers in the nation and it would not measurably alter their scores. That has nothing to do with school funding, unions or other familiar issues; it has to do with poverty, joblessness and crime.
The state Department of Education said there were tens of thousands of third-graders who couldn't read at a third-grade level last year, and thousands more who had to take remedial courses upon reaching college.
We're left wondering if that is a matter that will be improved by changing the statistical yardstick by which schools in Ohio are measured. Certainly not in Youngstown and Warren, where improvement won't occur until there is a concerted effort to resolve problems in neighborhoods and families.