The SchoolBoard Project
September 27th, 1999
The Editorial Page Editor
"Letters to the Editor"
New Orleans, Louisiana
Following your publication of the article on school performance, I spent some time reviewing the
numbers provided by the State Board of Education, particularly their "School Score" and their rankings.
I have appended to this cover letter my "Letter to the Editor" and some graphs. The graphs
were produced in a spreadsheet program (QuattroPro) from the data on the approximately 260
schools that you published in the newspaper. To the best of my knowledge, the graphs
accurately represent the data you published.
The graphs are relevant because they highlight the extraordinary relationship between "poverty"(1)
and the "scores", and they raise further questions not answered by the State Board.
The graphs I am giving you are titled:
1. Rank vs Poverty-Score: All Area Schools
2. Poverty vs Score: All Area Schools
3. Poverty vs Score: [Schools] With Elem[entary school] Students [- All Schools]
4. Poverty vs Score: Orleans Elem[entary School] Students
5. Poverty vs Score: [All] Orleans Students
6. Poverty vs Score: Non-Orleans students
I am also providing you with a printout of the background data which produced these graphs.
a. Graph 1 states the obvious: that as rank goes down, scores drop. It also
demonstrates that rank and score are related to poverty.
b. Graphs 2-6 relate "Poverty" and "Score"
c. Graphs 2-6 have a linear fit produced by QuattroPro. Produced by a least-squares calculation, the line visualizes the trend. Variations of the actual data away from this line show only that the linear model is not exact [that is, that there is not an exact linear relationship between "poverty" and "score"].
d. Graphs 2-6 are scaled alike; you can hold them up to a light for comparison. Variations in the scaling would minimize or maximize the visual impact of the trend.
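For readers curious about the mechanics, the trendline described in note c can be reproduced by the standard least-squares formulas. The sketch below uses invented (poverty, score) pairs for illustration only; it is not the published school data.

```python
# Minimal sketch of the least-squares linear fit that a spreadsheet's
# trendline feature performs. The data below are hypothetical.

def least_squares_fit(xs, ys):
    """Return slope m and intercept b of the best-fit line y = m*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by the variance of x.
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - m * mean_x
    return m, b

# Hypothetical data: percent of students in poverty vs. school score.
poverty = [10, 25, 40, 55, 70, 85]
score = [110, 95, 88, 72, 60, 48]

m, b = least_squares_fit(poverty, score)
print(f"score ~= {m:.2f} * poverty + {b:.1f}")
```

On these made-up numbers the slope comes out negative, which is exactly the visual pattern the graphs show: as the poverty figure rises, the score falls. The scatter of points around the fitted line is what note c refers to as the model not being exact.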
There are questions, which I address in my "Letter to the Editor" below, that I think you should follow up on. These include:
1. Why hasn't anyone compared these factors [poverty and score] on a student-by-student basis? This calls for a better definition of "poverty", in fact a replacement for it. For each student, a ranking could be obtained by grading such factors as family income; a stable family address; a stable household with one or more parents; the parents' involvement in school, PTA and/or homework; and an evaluation of parental literacy. Add a few more factors suspected to degrade or improve student performance, such as class size and the number of disruptive students in that student's class, and you end up with a pretty broad picture of what actually affects a student's grade. If, with other factors being equal, a student's school site tends to degrade that student's "score", then, and only then, can you safely say that a particular school [with its specific collection of local management and staff] is bad.
2. With the data presented by the State Board, why hasn't anyone even asked: since poverty is so clearly related to school scores, what causes the subtle variations in scores among schools with similar "poverty" levels? Consider these:
a. Some schools have a disproportionate number of students who are either disruptive or special education students
b. Some schools have special admission policies.
c. Some schools, with their principals and teachers, over- or under-emphasize "teaching the test", rather than teaching the student and providing each student the maximum opportunity to learn.
3. Many schools in the ranking, outside of New Orleans, have multiple grade levels.
There is no focus on the fact that the LEAP and IOWA tests, like many
standardized tests given by the school system over the years, are given at specific
grade levels. Each of these tests has a different reliability and scope -- some tests
evaluate rote learning and are susceptible to bias from teaching the test. Schools
do not take the same set of tests, because their grade levels do not match from
school to school. Despite this, extraordinarily, the school scoring system
compares K-6 schools, schools with lower elementary grades only, and schools
with grades K-12 against one another. Teaching failures of some schools in some
grades would be masked by the performance in other grades.
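The per-student ranking proposed in point 1 above could be computed as a simple weighted index. The factor names and weights in this sketch are hypothetical, chosen only to show the mechanics; a real index would require validated measures and weights.

```python
# Sketch of the per-student composite index proposed in point 1.
# Each factor is normalized to a 0-1 scale (higher = more favorable);
# the weights are invented for illustration and sum to 1.0.

WEIGHTS = {
    "family_income":        0.30,
    "stable_address":       0.15,
    "stable_household":     0.15,
    "parental_involvement": 0.20,
    "parental_literacy":    0.10,
    "small_class_size":     0.10,
}

def composite_index(student):
    """Weighted sum of 0-1 factor scores; the result is also on a 0-1 scale."""
    return sum(WEIGHTS[k] * student[k] for k in WEIGHTS)

# Two hypothetical students at the same school, different circumstances.
a = {"family_income": 0.9, "stable_address": 1.0, "stable_household": 1.0,
     "parental_involvement": 0.8, "parental_literacy": 1.0, "small_class_size": 0.5}
b = {"family_income": 0.2, "stable_address": 0.0, "stable_household": 0.5,
     "parental_involvement": 0.3, "parental_literacy": 0.5, "small_class_size": 0.5}

print(composite_index(a), composite_index(b))
```

With an index like this in hand, one could ask the question the letter poses: holding the index constant, does the school site itself change the student's score? Only a gap that survives that comparison could fairly be blamed on the school.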
These questions, and many more like them, are unanswered, and they render the evaluation system
chosen by the State Board biased, unfair, and, most importantly, useless. The Board is
asking the wrong questions.
Personally, I think the T-P, and other papers, should more closely evaluate these factors. Besides
me, there is a slew of professionals [teachers, university faculty, experts in statistics
everywhere] who could point out the failures of this ranking.
I think you need to raise these issues and speak out editorially against the means by which the State Board is grading schools.
s/ John Ruskin
B.S. / M.S., engineering, J.D.
Footnote 1. How the State Board of Education comes up with this figure is not addressed.