"Unfair Comparisons in Nation's Schools." By William L. Bainbridge. New York Times. February 2000.
UNFAIR COMPARISONS IN NATION’S SCHOOLS
by William L. Bainbridge, Ph.D.
A few weeks ago, while I was serving on a panel at a community meeting in Sarasota, we heard arguments supporting the county school board’s application to the State of Florida to become the nation’s first major charter school district. If approved, Sarasota would be freed from many state testing and data-reporting requirements and other rules and regulations. The district would continue to ensure equity compliance, fairness to children with disabilities, bargaining-unit commitments, and transportation availability for students. The implications for accurate comparisons of schools in the district are far-reaching. Sarasota has set a course to be among the first school districts in the nation to shift power from state politicians to schoolteachers and principals.
Although Sarasota County ranks in the 90th percentile nationally, and first in Florida, in family income, the charter proposal is not an attempt to avoid sharing the district’s tax revenues with other Florida districts. Financial equity with other Florida districts is part of the enabling legislation the state has already passed.
Leaders of the district have experienced a good deal of frustration with state testing and the reporting of results. In fact, following the meeting, several people asked why I, a former state education agency cabinet member and superintendent of three local school districts, would support a move that permits a school district to sever its ties with the state. My immediate focus was on the issue of state-required reports and unfair comparisons of schools and school districts.
Those of us on the panel, including the Florida Deputy State School Commissioner, a representative of the US Office of Education, a university provost and the local superintendent, offered little evidence that state education agencies have much to offer "lighthouse" school districts like Sarasota. The burdensome number of reports required by the state alone seems to outweigh the benefits.
Many recent visits to school systems suggest to me that one of the most serious problems confronting education today is providing the public with accurate comparisons of school opportunities for their children. The problem of comparing schools for parents has become more complex due to misinformation and disinformation from a variety of sources, including state agencies and the school districts themselves.
Given such problems, it makes sense that a district like Sarasota would want to escape the limitations that state-mandated reporting structures impose. The states, however, are only part of the problem, as the following examples illustrate:
- John J. Cannell, M.D., demonstrated the unreliability of self-reported school data when he surveyed schools across the country at his own expense. Cannell reported to the national media that "…over 90% of the nation’s schools said their students were scoring above the national average," an obvious mathematical impossibility.
- Many states have designed homemade proficiency tests geared to some mysterious standard developed by committees of educators with little substantive supervision from professional testing experts. A recent fourth-grade proficiency test indicated that only four of Ohio’s more than 600 school systems could boast that over half their students even passed. The irony is that common sense and history tell us that many students in the failing districts will have far higher college entrance examination scores six or seven years from now than the average of those who constructed the test. When students in systems like Bexley, Granville and Upper Arlington have problems with a test, there is probably something seriously wrong with what is being tested or how the test is designed. These students annually rank in the top one percent in the nation on both college entrance examinations and norm-referenced tests.
- Most states compare school districts unfairly using data, such as total corporate and individual tax base per pupil, that have been shown to have virtually no relationship to student learning outcomes. Pediatric neurologists have shown that the formation of synaptic contacts in the human cerebral cortex is greatly enhanced when small children have significant amounts of protein in their diets and stimulation through activities such as music, computers and parental attention. Educators have shown a high correlation between a mother’s education level and her child’s ability to learn in school. Both groups conclude that students from high socio-economic homes have great advantages in doing schoolwork. Nevertheless, few state agencies have attempted to address these issues.
- Some states, including Ohio, are conducting contests to determine which low-performing districts receive state assistance. The "Ohio Reads" program recently included a requirement that administrators in districts with low scores submit creative applications in order to receive help. Many districts with low reading scores were excluded because the selection committees did not like the applications their administrators submitted. Let’s see. Are we saying we should punish students who already have low reading scores because their administrators are not good creative writers? Or that because the biases of their school leaders don’t match those of the selection committee, we should punish the students?
- Although the US has the world’s best system of higher education, the information sources used for selecting colleges or universities don’t set a good standard for public school information. Most of the so-called guides sitting on bookstore shelves have some form of selectivity rating. Interestingly enough, the ratings tend to be clustered between 60 and 99. Since most people think of scales from 0 to 100, ratings of 60 for the academically weakest universities could really mislead parents and students. One of these guides gives a "selectivity rating 99 of 100" to Caltech, Harvard, MIT and Yale, the most selective universities in the world. It raises the question, "If they’re 99, who’s 100?"
- In the interests of "helping" clients, several national real estate web sites are collecting material from school administrative offices and drawing some of the wildest comparisons imaginable. A classic case in New Jersey involved a real estate firm averaging SAT scores with ACT scores. Of course, the two tests use entirely different scales, and the results were absurd. An Associated Press wire story quoting from the Chicago Tribune, with the headline "Net site found to supply false data," reported that Realtor.com had provided "… inaccurate ratings of school systems across the country."
- The federal government’s efforts to help provide school comparisons are seriously flawed. The National Center for Education Statistics has little authority to compel states, school systems and private schools to standardize data. Consequently, NCES has developed little expertise in collecting, auditing and reporting school data for reasonable comparisons. For example, state education agencies in Michigan, Massachusetts and Connecticut have been known to fail to comply with federal requests for data ranging from test scores to teacher salaries. The National Assessment of Educational Progress is voluntary and includes only a sample of districts that is not scientifically selected.
Breaking free of the state system may seem risky, but for the many districts hampered by flaws in this current system, it may be a necessary step. Parents and other consumers need to be consistently vigilant in their demand for accurate school information. If more school districts and parents examine the assumptions behind the data, the results will be more reasonable.
Dr. Bainbridge heads SchoolMatch, a national research firm in Columbus, Ohio, that assists corporations with school data and consulting services.