“They said our promises of rising standards were overblown, that effects would be, at best, negligible…yet again, the facts on the ground tell a different story.”
Michael Gove, 4 January 2012

The background

The Education Secretary this week accused opponents of his flagship academy schools project of “pushing the same old ideology of failure and mediocrity”.

The government is encouraging thousands of schools to switch to academy status, which means they get their funding straight from the government instead of a local council and have more control over the curriculum, the school day and pay and conditions for teachers.

Ministers are becoming increasingly frustrated with opposition from union leaders and some parents.

But is there any evidence that giving schools more power to run their own affairs improves academic performance?

The analysis

We’ve looked into academies in depth before, and already found that there are few simple answers.

But that didn’t stop Mr Gove making the claim that his promises of higher standards in academies have already come to pass.

He said: “In the 166 sponsored academies with results in both 2010 and 2011, the percentage point increase in pupils achieving five plus A*-C including English and maths was double that of maintained schools.”

That’s true, according to Department for Education (DfE) figures – but it’s a dubious comparison, say sceptics of academies.

Among that small group of “sponsored academies”, the percentage of pupils achieving five or more GCSEs at grades A* to C including English and maths did indeed grow from 40.6 per cent in 2010 to 45.9 per cent in 2011 – an impressive rise of 5.3 percentage points.

And across all “maintained” schools the increase was from 55.2 per cent to 57.8 per cent – a rise of just 2.6 percentage points.
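The arithmetic behind the “double” claim is worth spelling out: it compares percentage-point increases, not increases relative to each group’s starting point. A minimal sketch, using only the DfE figures quoted above, shows both calculations side by side (the function names here are illustrative, not from any official source):

```python
def pp_change(before, after):
    """Change in percentage points: a simple difference."""
    return after - before

def relative_change(before, after):
    """Change relative to the starting figure, as a percentage."""
    return (after - before) / before * 100

# Sponsored academies: 40.6% -> 45.9%; maintained schools: 55.2% -> 57.8%
academies_pp = pp_change(40.6, 45.9)    # 5.3 percentage points
maintained_pp = pp_change(55.2, 57.8)   # 2.6 percentage points

print(round(academies_pp / maintained_pp, 1))        # ~2.0, hence "double"
print(round(relative_change(40.6, 45.9), 1))         # 13.1% relative rise
print(round(relative_change(55.2, 57.8), 1))         # 4.7% relative rise
```

Note that the relative rise for academies (about 13 per cent) is nearly three times that of maintained schools, which illustrates the point made below: starting from a lower base makes any given improvement look larger in relative terms.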

Why choose to compare just those 166 sponsored academies? Part of the reason could be that they are the kind of academy that has tended to replace underperforming schools in deprived areas.

If you compare the exam results of an underperforming school to an average one, you are starting from a lower base, and it may be that the worse things are to begin with, the quicker they improve.

Why not compare like with like by looking at the results from academies with schools that have similar pupil demographics and levels of attainment?

Two studies have tried to do that: one by the National Audit Office (NAO), the other by academics Stephen Machin and James Vernoit of the London School of Economics.

Both studies tentatively conclude that there is some evidence that some academies may be delivering GCSE gains faster than similar schools.

Messrs Machin and Vernoit found “significant improvement” in academies that had been open for at least two years, with an extra 3 per cent of pupils on average getting at least five GCSEs at grades A* to C.

The NAO found a similar pattern, although the difference in the rates of improvement between various kinds of academy and comparable schools was pretty tiny, as the second graph shows.

There are other weaknesses common to all these comparisons.

As the House of Commons public accounts committee noted, there’s no way of knowing whether an academy’s success can be attributed to its newfound autonomy as opposed to other aspects of the school’s change of status.

“Any school that acquires a new building, a new head teacher and many new staff is likely to improve its pupils’ levels of attainment. Such changes can lead to an improvement in morale and behaviour that makes a school almost unrecognisable as the predecessor school, though many of the pupils are the same.

“It is therefore difficult to assess how far improvements in results in academies derive from the Academies programme itself and the features that make academies different from other schools, or from the high level of expenditure involved in opening an academy.”

And as Civitas has pointed out repeatedly, there’s a question mark over whether the pupils being compared are sitting exams of comparable difficulty.

Apart from maths and English, what are the other three GCSEs or equivalents? Could they be less academically demanding subjects like IT, the think-tank asked. Were academies encouraging too many pupils to take “pseudo-vocational” subjects in order to climb the league tables?

There were signs that such fears were justified last summer when analysts from the Times Educational Supplement, among others, noted that academies tended to do worse than other schools on the “English Baccalaureate”, a new measure marking results in core academic subjects: English, two sciences, maths, history or geography and a language.

This was not mentioned by the Department for Education when it issued its official press release about GCSE results.

The verdict

A dose of healthy scepticism appears to be needed whenever ministers seek to use statistics to prove the supposed superiority of the academy model.

On the other hand, it’s not entirely fair to say, as several union leaders did this week, that there is no evidence whatsoever that academies are improving performance.

Much of the academic research done so far has tended to show positive results. That includes the counter-intuitive effect noted by Machin and Vernoit: schools geographically close to successful academies saw their own results improve as well, even though the academies tended to poach the “best” pupils in the area.

What we still don’t know is whether the improvements will continue in the long term, whether they will be felt in academies that were already academically successful, whether the effect has anything to do with increased autonomy, and whether the whole project is cost-effective.

By Patrick Worrall