The Education Project at Ruskin & Associates

Ruskin & Associates
Solutions for Risk Exposure®

John Ruskin

New Orleans, Louisiana 70118
(504) XXX-XXXX
eMail: JohnRuskin (at) ComplianceOfficer (dot) Com


September 28, 1999

The Gambit

New Orleans, Louisiana

Sir:

Following the publication of the T/P article on school performance, I spent some time reviewing the numbers provided by the State Board of Education, particularly their "School Score" and "Poverty" percentages.

I have appended to this cover letter some graphs and data, along with a "Letter to the Editor". The graphs were produced with a spreadsheet program (QuattroPro) from the data on the approximately 260 schools that you published in the newspaper. To the best of my knowledge, the graphs accurately represent the data you published.

The graphs are relevant because they highlight the extraordinary(1) relationship between "poverty"(2) and the "scores", and they raise further questions that the State Board has not answered.

The graphs I am giving you are titled:

1. Rank vs Poverty-Score: All Area Schools

2. Poverty vs Score: All Area Schools

3. Poverty vs Score: [Schools] With Elem[entary school] Students [- All Schools]

4. Poverty vs Score: Orleans Elem[entary School] Students

5. Poverty vs Score: [All] Orleans Students

6. Poverty vs Score: Non-Orleans students

I am also providing you with a printout of the background data from which these graphs were produced.

Some Notes:

a. Graph 1 states the obvious: that as rank goes down, scores drop. It also demonstrates that rank and score are related to poverty.

b. Graphs 2-6 relate "Poverty" and "Score".

c. Graphs 2-6 include a linear fit produced by QuattroPro. Computed by a least squares calculation, the line visualizes the trend; deviations of the actual data from that line show only that the linear model is not exact [that is, that there is not an exact linear relationship between "poverty" and "score"] (see the sketch after these notes).

d. Graphs 2-6 are scaled alike; you can hold them up to a light for comparison. Variations in the scaling would minimize or exaggerate the visual impact of the trend.
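For readers who want to reproduce the trend line described in note (c), here is a minimal sketch of the same kind of least squares calculation. It is illustrative only: the poverty and score values below are hypothetical placeholders, not the State Board's published figures, and NumPy stands in for the QuattroPro fit.

```python
# A least-squares trend line of the kind described in note (c).
# The poverty/score pairs are hypothetical placeholders, not the
# State Board's actual figures for the ~260 schools.
import numpy as np

poverty = np.array([12.0, 35.0, 48.0, 61.0, 77.0, 90.0])  # percent "free lunch"
score = np.array([92.0, 78.0, 70.0, 61.0, 52.0, 44.0])    # composite school score

# A degree-1 polynomial fit is an ordinary least-squares line: score ~= m * poverty + b
m, b = np.polyfit(poverty, score, 1)
print(f"slope = {m:.2f} score points per poverty point, intercept = {b:.1f}")

# Residuals show how far each school falls from the linear trend (note c):
# they measure only how inexact the linear model is, not school quality.
residuals = score - (m * poverty + b)
print("residuals:", np.round(residuals, 1))
```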

There are questions that I address in my "Letter to the Editor", which follows -- questions I think you should follow up on. These include:

1. Why hasn't anyone compared these factors [poverty and score] on a student-by-student basis, with a better definition of "poverty" -- a replacement for it, in fact? For each student, a ranking could be obtained by grading such factors as family income; a stable family address; a stable household with one or more parents; the parents' involvement in school, PTA and/or homework; and an evaluation of parental literacy. Add a few more factors suspected to degrade or improve student performance, such as class size and the number of disruptive students in that student's class, and you end up with a fairly broad picture of what actually affects a student's grade. If, with other factors being equal, a student's school site still tends to degrade that student's "score", then, and only then, can you safely say that a particular school [with its specific collection of local management and staff] is bad.

2. With the data presented by the State Board, why hasn't anyone even asked: "Since poverty is so clearly related to school scores, what is it that causes the subtle variations in scores among schools with similar 'poverty' levels?" Consider these:

a. Some schools have a disproportionate number of students who are disruptive or who are special education students.

b. Some schools have special admission policies.

c. Some schools, and their principals and teachers, over- or under-emphasize "teaching the test", rather than teaching the student and providing each student the maximum opportunity to learn.

3. Many schools in the ranking outside of New Orleans have multiple grade levels. There is no mention of the fact that the LEAP and IOWA tests, like many standardized tests given by the school system over the years, are given at specific (and different) grade levels. Each of these tests has a different reliability and scope -- some evaluate rote learning that is susceptible to bias from teaching the test. Yet, from school to school in the ranking, the same tests are not being compared, because the grade levels do not match from school to school. Despite this, extraordinarily, the scoring system compares K-6 schools with lower-elementary-only schools and with K-12 schools. Teaching failures in some grades at some schools would be masked by the performance in other grades.

4. What steps have been taken to "improve" scores, particularly in the elementary schools? Ask teachers, principals and school volunteers, and you will discover that extraordinary means are taken in some schools: practice tests with similar or identical questions(3); and teachers who [at the suggestion of the principal or on their own] assist students class-wide(4) by reading questions aloud or "suggesting" that a student recheck a particular question.

These questions, and many more like them, are unanswered, and they render the evaluation system chosen by the State Board biased, unfair and, most importantly, useless. The Board is asking the wrong questions.

Oddly, the race to raise scores, and the untoward means by which staff attain this goal, serve only to take "marginally" poor-performing schools out of the lowest ranking, eliminating the temporary money, support and focus that an "unacceptable" school would otherwise receive.

Personally, I think the media should evaluate these factors more closely. Besides me, a slew of professionals [teachers, university faculty, experts in statistics everywhere] could point out the failures of this ranking.

I think you need to raise these issues and speak out editorially against the means by which the State Board is grading schools, the misuse of standardized tests, and the errant focus on scores to evaluate school performance [in lieu of student performance].

Sincerely,

John Ruskin
B.S. / M.S., engineering, J.D.



(1) Obvious . . . and un-emphasized.

(2) Why the State Board of Education chooses "free lunch" as a measure is not addressed.

(3) Limited vocabulary sets within the lower grade test regimens facilitate pre-test training.

(4) An aberration, perhaps, of the permitted assistance granted to certain students, which, I believe, include special education students.

John Ruskin

New Orleans, Louisiana 70118
(504) XXX-XXXX
eMail: JohnRuskin (at) ComplianceOfficer (dot) Com


October 18, 1999

To the Editor

The Gambit

New Orleans, Louisiana

Re: School "scores" for the New Orleans metropolitan area, provided by the State Board of Education.

I am disturbed by the implications of the data provided by the State Board of Education on school performance. I am also angry that the State Board, prodded by the legislature, has chosen a biased, unfair and pointless tool for evaluating schools.

Two subtle questions loom within the State Board's figures. #1: Does poverty correlate with school "scores"? Yes, sadly, "poverty" correlates with school "scores". It is clear from the data that there is an extraordinary relationship between school "scores" and "poverty".
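[A note for readers who want to put a number on that relationship: a minimal sketch follows, using NumPy to compute a Pearson correlation coefficient. The poverty and score values are hypothetical placeholders, not the State Board's published figures; with the real data for the roughly 260 schools, the same arithmetic would quantify the trend the graphs show.]

```python
# Quantifying the poverty/score relationship as a correlation coefficient.
# The values are hypothetical placeholders, not the State Board's figures.
import numpy as np

poverty = np.array([12.0, 35.0, 48.0, 61.0, 77.0, 90.0])  # percent "free lunch"
score = np.array([92.0, 78.0, 70.0, 61.0, 52.0, 44.0])    # composite school score

r = np.corrcoef(poverty, score)[0, 1]  # Pearson r; near -1 means strongly negative
print(f"Pearson r = {r:.2f}, r^2 = {r * r:.2f}")  # r^2: share of score variance tracked by poverty
```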

#2: Is poverty a cause of poor school performance? This is a misguided question, because the "scores" do not measure school performance. 90% of the "score" is based on standardized tests. Yet, unfortunately, many factors render standardized tests, which are designed to evaluate students, unreliable for the purpose of evaluating schools.

What is "School Performance"? It is rooted in the obligation of a school to provide each individual student the maximum opportunity to learn. The State Board, irresponsibly, equates student performance with school performance. It is irresponsible because schools can not guarantee success, the public should not be trained to expect success from schools faced with the factors which correlate to poor student performance.

To test this thought, I propose a thought experiment. We know that "poverty" correlates with school "score", but can the school itself affect student performance? Take two elementary schools with similar student counts, one "scoring" unacceptably and the other "scoring" admirably. What would happen if we transplanted the two entire schools: the school buildings, the principals, teachers and staff, the equipment and textbooks, the chalk and the school bells? Leave the students, the families and the PTA where they are. If the State Board is right, the "scores" for the two schools will be unchanged. The children in "poverty" will walk to their new neighborhood school and, magically, by the end of the year, their test scores will rise.

This is lunacy.

I suggest two alternative questions, for which, with some effort, the statistics are available and can be properly applied.

#1: Since poverty is related to student scores, evaluate how the elemental issues of poverty affect individual students' performance -- move from correlation to causation. Knowing those factors, work to eliminate them.

#2: Since poverty is so clearly related to school scores, what is it that causes the subtle and not-so-subtle variations in students' scores between schools with similar "poverty" levels, rich or poor? I suggest that we consider these, among others, as possible explanations for the State Board's data: class size; the proportion of disruptive or special education students; special admission policies; or the degree to which principals and teachers over- or under-emphasize "teaching the test". Take advantage of those variations to improve learning opportunity.

Instead, the State Board and its minions will descend on the "unacceptable" performers, creating more plans to follow years of plans and curriculum changes. The bureaucracy and the implementation will be stunted and unfunded. Sadly, as the plan now stands, in a few years, when the schools are closed, or when parents are given the choice to leave these schools, we will face several disasters. If schools are closed, or if parents are vouchered in or out of the school system, we will not have alternatives with space and funding. And, horribly, we will leave 50% of the student population in those poor schools, for reasons ranging from uninterested parents to interested guardians unable to transport their children around the city.

It is good to suggest, and then evaluate, what factors affect student performance and school performance. Oddly, we don't ask those closest to and most responsible for education -- teachers. Ask any inner-city teacher, regardless of skill or interest in the job, what those factors are, and within minutes they would identify them:

Parents who don't read to their children; class size; more than one or two B.D. or L.D. children in the classroom; excessive reporting and testing, including the preparation for it; and parents or guardians who are not interested in their children's education, with or without cause, whether due to their own illiteracy, or because the parents work multiple jobs, or for no reason at all.

Sadly, these problems, so clear to teachers, do not present an easy solution. Their evaluation can't be pounded into a desk at the legislature or on a campaign trail; the public, collectively, is unwilling to commit the resources to a solution; and the solution is not within the school system.

Ideally, the public would fund more classrooms and teachers to reduce class size; it would fund extensive literacy programs; it would fund an expansion of welfare oversight to require parents to learn and to take an interest in their children's learning. The list is longer, but competes with a limited public interest. We suffer the consequences. Our city's children grow up uneducated, and a proportion of them turn to alcohol and drugs. A proportion commit crimes and we pay to build jails; our insurance rates go up; we can't attract industry and our tax base slides.

John Ruskin

New Orleans, Louisiana

This subdomain of ComplianceOfficer.Com is published as a public service by John Ruskin, in the hope that enlarging the scope of discourse on public school issues will lead to broader support. All comments are welcome, regardless of perspective, in keeping with First Amendment principles in a public forum.

Ruskin & Associates — Providing effective operational and management planning for distressed or changing businesses, whether arising from catastrophic loss or fiscal and operational difficulties. This includes loss consulting, legal counsel and adjusting services.
