
A close look at the numbers

At the last school board meeting, on Feb. 21, Jennifer Ellis presented to the board a critique of the Facilities Report the district released in the fall.

Rosslander Doug McKay has spent many hours analyzing this complex document; he and Jennifer then put together a PowerPoint presentation explaining the issues with the report.

The Facilities Report is a huge, complex, 283-page document that most people won't take the time to wade through, and it helps to have a degree in multivariate analysis to make sense of it.

Luckily, both Jennifer and Doug have significant experience with this type of analysis in the work they do, and have done all the hard work for us!

The following is Jennifer’s summary of the critique:

The Facilities Report analyzes and ranks 17 school configuration options based on 19 subjective criteria, such as minimizing capital costs over the planning horizon and maximizing educational opportunities for students. The top-ranked configuration option, 7C, which includes the closure of RSS by 2013-2014, is proposed in the Report as the “Suggested Facilities Plan.”

The Facilities Report, while containing a large amount of interesting and useful data, incorporates a substantial number of biases and technical issues that make its final ranking of the 17 possible school configuration options questionable. This column highlights some of the key examples of those issues. A more detailed list is available from the Neighbourhoods of Learning Committee.

The selection and weighting of the evaluation criteria themselves are heavily biased towards large schools. For example, maximizing educational opportunities and maximizing programs and services together account for 28 per cent of the final score for each option, while educational outcomes and keeping high school children in their own community are not considered at all.

The 19 criteria evaluated are not independent of each other, and many use the same data, which means that some data is in effect double- or even quintuple-counted.

For example, annual operating expenditure for each school, while an important piece of data, is counted four times in the analysis, which heavily penalizes older schools. Raw capacity data is counted five times. It is desirable to incorporate these types of data, but counting them more than once weights them far beyond what was originally intended.
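To see why the repeated counting matters, consider a minimal sketch in Python; the equal nominal weights are an assumption for illustration, not the Report's actual weighting scheme:

```python
# Illustrative only: equal nominal weights across the 19 criteria are an
# assumption, not the Report's actual weighting scheme.
nominal_weight = 1 / 19  # each of 19 criteria, weighted equally

# Operating expenditure feeds four criteria; raw capacity feeds five.
effective_expenditure_weight = 4 * nominal_weight
effective_capacity_weight = 5 * nominal_weight

print(f"nominal weight per criterion:  {nominal_weight:.1%}")                # ~5.3%
print(f"effective weight, expenditure: {effective_expenditure_weight:.1%}")  # ~21.1%
print(f"effective weight, capacity:    {effective_capacity_weight:.1%}")     # ~26.3%
```

Under those assumptions, a single piece of data ends up driving roughly a quarter of each option's total score.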

The manner in which points were awarded for various criteria is also questionable. For example, high schools with fewer than 39 students per grade received 0 points for educational opportunities, those with 150 students per grade received 6 points, and those with 200 or more students per grade received 10 points.

While fewer than 39 students per grade may not be optimal for educational opportunities, it is not clear that it leaves students with zero educational opportunities. Conversely, it is not clear that 200+ students per grade really creates that many more opportunities than 150 students per grade.

This scoring system again favours the options that include only large schools and disadvantages those that include small high schools. Is there a point of diminishing returns beyond which increasing enrolment no longer results in increased opportunities?
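For illustration, here is a hedged reconstruction of that scoring rule; only the three anchors (under 39, 150, and 200+ students per grade) come from the critique as summarized above, and the straight-line fill between them is an assumption:

```python
def opportunity_score(students_per_grade: float) -> float:
    """Points for 'educational opportunities' (hypothetical reconstruction).

    Anchors from the critique: under 39 per grade -> 0, 150 -> 6, 200+ -> 10.
    The linear ramps between the anchors are an assumption for illustration.
    """
    if students_per_grade < 39:
        return 0.0   # every small high school scores zero
    if students_per_grade >= 200:
        return 10.0  # every very large school scores full marks
    if students_per_grade <= 150:
        return 6.0 * (students_per_grade - 39) / (150 - 39)
    return 6.0 + 4.0 * (students_per_grade - 150) / (200 - 150)

for n in (20, 38, 150, 200, 400):
    print(n, round(opportunity_score(n), 1))
# 20 and 38 both score 0; 200 and 400 both score 10. Every school below
# the threshold is treated as identical, however different in practice.
```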

Likewise, the raw data (which is in various units: dollars, square metres, points) for each criterion must be converted to a common unit or score so it can be summed to arrive at a total score for each option. However, the conversion system used disproportionately exaggerates the differences between the configuration options, making the higher-ranking options score even higher and the lower-ranking options score even lower.

For example, for one criterion, the ‘worst’ option received 5.0 points and the ‘best’ option received 6.9 points, a difference of only 1.9 points. When those points were converted to scores, the 5.0-point option received 0 while the 6.9-point option received 10, so a 38-per-cent difference in points became the full 0-to-10 spread in scores.
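This behaviour is consistent with simple min-max rescaling; that the Report uses exactly this formula is an assumption, since the critique does not quote it, but the sketch below reproduces the numbers above:

```python
def rescale(value: float, worst: float, best: float) -> float:
    """Map a raw value onto a 0-10 score: worst -> 0, best -> 10."""
    return 10.0 * (value - worst) / (best - worst)

raw_points = [5.0, 6.9]  # the example cited above
worst, best = min(raw_points), max(raw_points)
print([rescale(v, worst, best) for v in raw_points])  # [0.0, 10.0]

# The raw values differ by 38 per cent (6.9 / 5.0 = 1.38), yet the scores
# span the entire 0-10 range: the rescaling stretches whatever spread
# exists, however small, to the maximum possible one.
```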

As a result, many options that were not very different from each other in the raw data appear much further apart in their scores.

Finally, there were some questionable data assumptions. For example, the overall “cost savings” for options that close schools include revenues from the sale of the closed facilities, even though the Ministry of Education says those facilities cannot currently be sold, while the cost of the new buses those options would require is not counted at all.

Overall, the Facilities Report provides some interesting information; however, the final numerical ranking of the options is based on questionable assumptions and methodologies.

We encourage you to review the report carefully and see whether you agree with the final ranking of the top four options, all of which recommend closing RSS.

We hope the trustees review the report with these issues in mind when making their decisions on the future of school facilities in this district.