Misleaguing Tables

International comparisons are ‘bedevilled with difficulties’

(Boris Johnson, 30/04/20)

Throughout this…wait for it…unprecedented crisis, scientists and politicians are fond of following the facts and deifying the data. Decisions are taken, we are told, not from a political or ideological point of view but in an emotionless state of being, coldly analysing the impact on deaths and, for those who survive, on livelihoods.

We look at the charts, tables and graphs displayed each day and sink back gratefully into our chairs at home, comforted that decisions are taken in the light of hard evidence, not political whims. The daily briefing sometimes reminds me of an Open University programme, with its cardboard charts and bespectacled scientists. Not a PowerPoint in sight.

But look closer when the data is used for comparative purposes. This is where things become a bit more opaque.

On the face of it, we could compare the impact of political decision-making by looking at the number of infections and the number of deaths. Surely, it’s a causal link, we say. Those with later, and looser, lockdowns must have suffered a greater number of deaths and infections. All very simple.

Except for the UK, where we’re told to be very careful in making any causal links. Because for us, it doesn’t look good.

The mortality table puts us second to the USA, and therefore strenuous efforts are made to remind us that it is dangerous to compare data between different countries. The culture of countries is so different, don’t you know that? For example, Scandinavian people live in a form of lockdown all the time so it’s easier for them, whereas Spaniards and Italians spend most of their social time on the streets hugging and kissing each other.

Some countries count care-home deaths in their figures, some don’t.

Other countries don’t have the systems to be able to record the data accurately, so it’s impossible to know the true extent of their infections. I could go on.

The conclusion being drawn by most of the politicians and social scientists is that it’s impossible to compare across countries that are so… well…different.

Bingo!

Ah, but what about education? For when it comes to educational systems, we do this quite happily. We compare to our heart’s content, through tables such as PISA (the Programme for International Student Assessment). Not only that, we change national policy as a result. Our obsession with the Far East’s educational success (questionable) has led to all sorts of changes, which have yet to provide any evidence of a causal link to alleged UK successes over recent years.

Has anyone actually checked to see if the PISA or TIMSS (Trends in International Mathematics and Science Study) data is actually useful in comparing country against country? From being the paragon of progress, people are now talking China down. Eyebrows are being raised. Did anyone ever question their PISA data, in the same way they are now questioning their mortality data? Did anyone ever realise that thousands, maybe millions, of children weren’t included in their data? No, instead we adopted policy to back up our own ideological preconceptions. I’m not sure that PISA provides any causal link whatsoever, and the OECD do a lot to remind people of this.

And don’t get me started on UK-wide comparisons. It’s a bit rich for all these liberal metros (Andrew Adonis the latest) to be SO worried now about our most-disadvantaged pupils losing ground, when throughout the last ten years the malodorous performance tables have (to use a current metaphor) wrestled us to the ground and stamped on those of us working in areas of multiple disadvantage.

Every August, with a heavy heart and an angry head, I spend a few days preparing a set of alternative figures, charts and tables just to show how our proportion of SEND pupils, now approaching a third of the school roll, affects the raw figures nefariously. Schools with elite and selective admissions policies have no such problem, nor those that wantonly ‘off-roll’ and exclude SEND pupils. Such nuances are not picked up in ranking tables.

Are we really sure that all areas of the country are the same? Italy and Indonesia are, quite obviously, not the same. Go to Blackpool and then to Blackheath and tell me that they are the same. And yet we compare their schools in the same tables?

Comparative tables serve no purpose but to dissuade any school leader and teacher from working in the bottom quintile of school deprivation. Recruitment becomes difficult and turnover is high. Factors such as these really affect the disadvantage gap, now such a huge concern for our liberal elite in their cossetted lockdown, wittering on about vulnerable children they have never met.

Rather than publish these meaningless and crass tables, how about sharing best practice across countries? How about investing in professional development, which is one thing China, Singapore and Finland definitely do – now there’s a trend worth following.

Er, no. Let’s leave the EU instead. That’s far more constructive.

We’ve seen now, throughout this horrible crisis, how tables, charts and graphs do nothing to establish a causal link or provide accurate food for policy-making. Quite the opposite.

As a comparative tool in education, it’s time we wrestled them to the ground and locked them up.
