
Opinion | Society | March 12, 2020

How well is a school really performing? We built a lab to find out


We wanted to find out how schools were doing, irrespective of the privilege of their students. So we created a new measure to assess it, and we’re urging the Ministry of Education to pick it up, writes Eric Crampton of the New Zealand Initiative.

If your school has strong NCEA results, is it because it’s performing well, or because it serves a lot of children from more privileged backgrounds?

Schools with identical NCEA outcomes could have wildly different performance if one of them got there through extraordinary efforts to overcome the disadvantages that its students brought with them to class.

Parents, school boards and principals deserve far better measures of how their schools are doing. Principals need to know whether the initiatives they try are working. The Boards of Trustees managing our largely self-governing schools must be able to assess their principal’s performance, and that means knowing how the school’s results really measure up. And parents who vote on their school’s Board of Trustees need better ways of checking how things are going.

Current measures aren’t up to this task. Not only do they hinder effective school governance, they can also result in parents ignoring the excellent school across the road in favour of a worse one further off but with a higher decile ranking.

But we can do so much better.

We know this because we have built the measure and provided model reports for the three schools that asked for them. The reports show how each school performs after removing the effects of things outside its control: a measure of how well it is doing for the community it serves.

And there is no good reason the Ministry of Education could not provide similar reports for every secondary school in the country. All parents need to do is ask.

Let’s explain how the measure works.

For the past two years, The New Zealand Initiative’s analyst Joel Hernandez has been secluded in his own little quarantine facility: the Statistics New Zealand data lab tucked away in a Wellington office tower. There lives the Integrated Data Infrastructure (IDI) – or at least the terminals to access it. The IDI is well-guarded and getting permission to do research work in it is challenging.

Those hurdles are warranted.

The IDI, a beautiful little project instigated by then finance minister Bill English, links together the government’s administrative databases where information about every New Zealander appears, in anonymised form, whenever we interact with the state.

Every grade awarded in every NCEA subject taken by every student going back over a decade is there, along with the school each student attended.

Students can be linked back to their families to build a more comprehensive picture of the student’s background. The education of each student’s parents is there, along with their income, benefit history and police and prison records. Abuse notifications held by Child, Youth and Family are also there – and more.

These anonymised links across the different databases allow for a more thorough picture of the circumstances a student brings to the classroom. A school serving many students in a poorer community for whom English is a second language will face different challenges than a school teaching the children of university lecturers.

The linked data allowed us to use standard linear regression techniques to relate each of those background factors, along with the school the student attended, to each student’s NCEA outcomes.

You can think of the method as showing how well a school is doing compared to schools serving very similar communities, or as a measure of performance that strips away the advantage or disadvantage students bring with them.
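
To make the mechanics concrete, here is a minimal sketch of the idea in Python, using synthetic data and hypothetical column names rather than the Initiative’s actual IDI code (which runs only on the anonymised linked data inside the Stats NZ lab): student background covariates plus a fixed effect for each school, fitted by ordinary least squares.

```python
# A minimal sketch of the adjustment idea, on synthetic data with hypothetical
# column names. This is not the Initiative's actual IDI code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_students, n_schools = 5000, 50

df = pd.DataFrame({
    "school_id": rng.integers(0, n_schools, n_students).astype(str),
    "parent_education_years": rng.normal(13, 2, n_students),
    "household_income": rng.lognormal(11, 0.5, n_students),
    "esol": rng.integers(0, 2, n_students),        # English as a second language
    "benefit_history": rng.integers(0, 2, n_students),
})

# Synthetic outcome: student background drives results, plus a school-specific
# contribution and noise. In the real analysis the outcome is an NCEA measure.
true_school_effect = rng.normal(0, 3, n_schools)
df["ncea_score"] = (
    2.0 * df["parent_education_years"]
    + 0.0001 * df["household_income"]
    - 4.0 * df["esol"]
    - 3.0 * df["benefit_history"]
    + true_school_effect[df["school_id"].astype(int)]
    + rng.normal(0, 5, n_students)
)

# OLS with school fixed effects: the background covariates soak up the advantage
# or disadvantage students bring with them, and C(school_id) captures what is
# left over for each school, i.e. the "adjusted" performance measure.
model = smf.ols(
    "ncea_score ~ parent_education_years + np.log(household_income)"
    " + esol + benefit_history + C(school_id)",
    data=df,
).fit()
print(model.params.filter(like="C(school_id)").head())
```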

We released the first set of our results last year. Unadjusted figures look a lot like standard NCEA league tables. But adjusting for the differences in the communities that schools serve shows most schools performing comparably to each other, with a few outliers at the top and bottom. Substantial numbers of low-decile schools are star performers whose work goes unrewarded by NCEA league tables. Weaker-performing schools are spread across the deciles and a lot of high-decile schools are middling achievers when the advantages they enjoy are accounted for properly.
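
Continuing that hypothetical sketch (it reuses the df and model objects from above, so it is an illustration only), the adjusted picture comes from the estimated school coefficients rather than raw averages, and putting the two orderings side by side shows how a league table can reshuffle once student background is accounted for.

```python
# Continues the sketch above: compare a raw "league table" with the adjusted view.
import pandas as pd

# Unadjusted: mean score per school, as a simple league table would rank them.
raw_table = df.groupby("school_id")["ncea_score"].mean().rename("raw_mean")

# Adjusted: the school fixed effects estimated by the regression, i.e. each
# school's contribution after controlling for the circumstances its students
# bring with them. (The baseline school is absorbed into the intercept, so it
# appears as NaN here.)
adjusted = pd.Series(
    {
        name.split("T.")[1].rstrip("]"): coef
        for name, coef in model.params.items()
        if name.startswith("C(school_id)")
    },
    name="adjusted_effect",
)

comparison = pd.concat([raw_table, adjusted], axis=1)
comparison["raw_rank"] = comparison["raw_mean"].rank(ascending=False)
comparison["adjusted_rank"] = comparison["adjusted_effect"].rank(ascending=False)
print(comparison.sort_values("adjusted_rank").head(10))
```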

But the same restrictions in the IDI that rightly protect individual confidentiality are also applied to schools. When we presented our results then, we were not even allowed to provide a scatterplot showing each school as an anonymous dot. We had to resort to second-best ways of presenting the results.

After the release of the 2019 report, three schools approached us to ask how they were doing. Their willingness to allow their results to leave the lab let us do something a bit more interesting – after extensive discussions with Stats NZ on just how that could be achieved.

For those three schools we produced the kinds of reports the ministry could create for every school. These show the schools’ performance at NCEA Level 1, Level 2, Level 3 and at University Entrance, before and after adjusting for the communities each school serves. They show how performance at each school has evolved over time and how well they do for their different communities.

This week, we released those three school reports to show parents, Boards of Trustees and principals the kind of reporting they should be able to get from the ministry. Our organisation does not have the capacity to produce reports for each school – arranging the permissions with each school would be a job all on its own.

The ministry has about 3,000 staff. We have Joel. But all our code is open to any researcher with access to the IDI and permission to use those linked databases. The ministry can build on our work.

Getting this information to parents, principals and school boards would be transformational. In earlier research, we found that a persistent characteristic of schools drawing poor reviews from the Education Review Office, year after year, was a board unable to hold its principal to account. Such schools are not numerous, but too many principals can shrug off poor performance by blaming the difficult circumstances their school faces, even when they are doing poorly despite those challenges. In other cases, the very real achievements of principals and schools deserving high praise are missed by blunt and misleading NCEA league tables.

Self-governing schools will simply work better when boards have better information on how well a school is faring.

Every year, schools spend countless hours collating and inputting data for the ministry. It’s about time the ministry provided something with a bit more substance in return. But that will not happen unless parents and school boards ask for it. There is a lot of inertia to overcome.

Dr Eric Crampton is chief economist with the New Zealand Initiative
