Larry Booi, ATA President
The Fraser Institute, a Vancouver-based right-wing think tank, released its second annual report on Alberta school rankings on May 5. The report looked at more than 220 high schools in Alberta and ranked them on factors related to diploma exams.
The Fraser Institute's recent "rankings" of high schools are more than simply invalid. They are grossly unfair to the work of staff and students in Alberta's high schools. They are seriously misleading and point parents and others to conclusions that are unwarranted and distorted. Rather than contributing to a pool of useful information, these rankings actually undermine legitimate attempts to gain a clear picture of the education of young people in our high schools. The reasons are obvious when we take the trouble to look beneath the surface of their approach.

1. The basis of comparison for the rankings is extremely narrow and does not reflect society's expectations for the education of high school students.
The rankings are based on a 10-point scale comprising five factors, all of which are derived only from diploma examination courses: diploma exam marks, diploma exam failure rates, diploma completion rates, the difference between school marks and diploma marks, and diploma courses taken per student. All of the other courses in high school (second languages, fine arts, physical education, career and technology studies) are completely ignored. As a result, Edmonton's Victoria School, internationally renowned for its fine arts program, ranks far down in the bottom half of the ratings, at 156th. And three Calgary schools (Jack James, Van Horne and Lord Shaughnessy), which focus on preparing students for the transition to work, placed at the bottom, despite their success in meeting their students' learning needs.
2. The five factors that make up the 10-point scale for the ranking are clearly inter-related and reinforce each other.
A school that scores high on the first factor (diploma exam marks) will almost certainly score high on the second factor (that is, low diploma exam failure rates), as well as on the third factor (diploma completion rates). Positive results related to the factor of "diploma courses taken per student" would also be expected in a high school with a strong emphasis on academic courses. Thus, while the rating is purportedly based on five factors, in reality it is based on five aspects of one factor—diploma examination courses.
3. The rating system incorrectly suggests that the 222 schools mentioned in the report are in a "fair competition," which involves a "level playing field." It is not a competition and the playing field is most certainly not level.
These schools have vast differences in their student populations, in some cases by design. For example, the "number one ranked school," Edmonton's Old Scona School, has a small enrolment entirely composed of students who were selected on the basis of their high academic achievement. Other schools serve large numbers of students who are academically disadvantaged or whose career plans do not involve going to university. Why would we expect these two groups to take equal numbers of diploma courses or to achieve comparable results on diploma exams? But they are all ranked on the same scale.
4. The rankings dramatically distort relatively small differences, with the result that some schools appear to perform far worse than they actually do.
The very act of ranking 222 schools means that 111 schools are going to be "in the bottom half" regardless of their performance. This system sets up some schools for the appearance of failure, no matter what work is done with students. For example, at Edmonton's McNally High School, which received a rank of 29/222, students received an average exam mark of 67.8 percent. Victoria School's students averaged 65.3 percent on their diploma exams, but their school ranked 156/222 (a difference of 127 places in ranking, but only 2.5 percentage points in examination results). At Queen Elizabeth School, the distortion does an even greater disservice to students and teachers: students averaged 58.3 percent on the examinations, but this coincided with a ranking of 214 out of 222. In other words, a school in the "top 30" and one in the "bottom 10" differed by less than 10 percentage points in their average examination marks, and yet they are at opposite ends of the ratings scale.
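To make the arithmetic behind these comparisons explicit, the short Python sketch below uses only the figures cited in this paragraph (ranks out of 222 and average diploma exam marks) to contrast the gap in ranking places with the gap in percentage points. The data layout and function name are simply illustrative, not part of the Fraser Institute's report.

```python
# Figures cited above: rank out of 222 and average diploma exam mark (percent).
cited = {
    "McNally":         {"rank": 29,  "avg_mark": 67.8},
    "Victoria":        {"rank": 156, "avg_mark": 65.3},
    "Queen Elizabeth": {"rank": 214, "avg_mark": 58.3},
}

def compare(a, b):
    """Contrast the gap in ranking places with the gap in exam marks."""
    places = cited[b]["rank"] - cited[a]["rank"]
    points = cited[a]["avg_mark"] - cited[b]["avg_mark"]
    print(f"{a} vs {b}: {places} places apart, {points:.1f} percentage points apart")

compare("McNally", "Victoria")         # 127 places apart, 2.5 percentage points apart
compare("McNally", "Queen Elizabeth")  # 185 places apart, 9.5 percentage points apart
```

The contrast is the point: a gap of 185 ranking places corresponds to a gap of less than 10 percentage points in average exam marks.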
It is also instructive to note that the Fraser Institute uses average test scores, rather than the more meaningful figure of the percentage of students who meet provincial standards. This can only be seen as a deliberate attempt to diminish the results.
Teachers agree that schools should be accountable to the public and that parents and others have the right to information about how well students are doing and how well schools are meeting students' needs. But these are complex questions that require a broad range of information about a considerable array of factors. Above all, there is a clear need to examine issues that go far beyond numbers related to diploma examinations. The Fraser Institute's rankings are an attempt to reduce complicated questions to simple numbers and to foster competition in the mistaken view that this "market method" will lead to school improvement. Rather than contributing to an informed discussion, their misguided approach leads to unfair criticism of schools and teachers, who instead should be supported in their crucial work in educating Alberta's youth.
What's a high school worth?
The Fraser Institute's Second Annual Report Card on Alberta's Schools assigns each school with 25 or more students enrolled in Grade 12 a mark out of 10. This mark represents the Fraser Institute's view of the worth of the school based on so-called academic indicators. There are five components of the score:
- the average mark on all of the diploma examinations written by students in the school, across all diploma examination courses
- the number of diploma examinations that received a failing grade, as a proportion of the total number of course completions
- the difference between the average mark obtained on the diploma examinations and the average school-awarded mark
- the average number of diploma examination courses taken by students enrolled in the school
- the percentage of first-time Grade 12 students who completed graduation requirements that year
To obtain the rating out of 10, the Fraser Institute took the base year results (1995/96) and sorted the schools, on each indicator, from highest to lowest. The top 10 percent of results were assigned a 10; the next 10 percent a nine; and so on. The results from each subsequent year were then assigned a score corresponding to the ranges of values established in the 1995/96 base year. The assigned scores for the five indicators were then averaged to obtain a score out of 10. Given this rating system, it is extremely difficult to improve a school's rating: a school needs to move from one of the 10 ranges into a higher range. For example, if the ranges were equal divisions from 0 to 100, an average of 81 would be assigned a nine out of 10; a 10 would not be awarded unless the average reached 90, and an average of 79 would earn an eight. Small improvements within a range change nothing. This system applies to each of the five indicators.
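The decile mechanics described above can be sketched in a few lines of code. The following Python sketch is illustrative only: the function names, the equal 0-to-100 boundaries used in the demonstration, and the handling of ties are assumptions, and the Fraser Institute's actual data, rounding rules, and treatment of indicators where lower values are better (failure rates, mark differences) are not spelled out in this article.

```python
# Sketch of a decile-style rating like the one described above.
# All names, numbers and boundary choices are illustrative assumptions,
# not the Fraser Institute's actual data or rules.

def decile_cutpoints(base_year_values):
    """Sort the base-year (1995/96) results and return the nine boundaries
    that split them into ten equal-sized ranges."""
    ordered = sorted(base_year_values)
    n = len(ordered)
    return [ordered[n * i // 10] for i in range(1, 10)]

def indicator_score(value, cutpoints):
    """Assign a score out of 10: the top range earns 10, the next 9, and so on.
    (For indicators where lower is better, such as failure rates, the values
    would first be negated or the sort reversed.)"""
    score = 1
    for boundary in cutpoints:
        if value >= boundary:
            score += 1
    return score

def school_rating(indicator_values, cutpoints_by_indicator):
    """Average the five indicator scores to produce the mark out of 10."""
    scores = [indicator_score(v, c)
              for v, c in zip(indicator_values, cutpoints_by_indicator)]
    return sum(scores) / len(scores)

# With equal ranges from 0 to 100, averages of 81 and 89 fall in the same range
# and earn the same score; only crossing 90 moves a school up a range.
even_ranges = [10, 20, 30, 40, 50, 60, 70, 80, 90]
print(indicator_score(81, even_ranges))  # 9
print(indicator_score(89, even_ranges))  # 9
print(indicator_score(90, even_ranges))  # 10
print(indicator_score(79, even_ranges))  # 8
```

The demonstration reproduces the point made above: a school's average can rise substantially within a range without moving its score at all, while a much smaller gain that happens to cross a boundary does.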