RAISE reports are obsolete but their legacy casts a long shadow.
Those detailed reports on school performance once ran to over 100 pages and, even after a crash diet, still reached 60.
Later versions had a G stamped on pages believed to be useful for governors. And guess what? Nearly every page was marked with a G. As an LA data analyst, I trained governors on the dark arts of interpreting RAISE data, and it was painful. It wasn’t governors’ fault; it was the report.
The size of RAISE meant we couldn’t see the wood for the trees - but we clung to the belief that the numbers must mean something, so we ploughed on and reached our own conclusions. Governors and senior leaders set priorities and targets on the basis of a report that almost certainly wasn’t telling them what they thought it was.
RAISEonline was scrapped in 2017 and replaced by Ofsted’s Inspection Data Summary Report (IDSR), the first version of which was 22 pages long. Subsequent versions were trimmed to 11 pages in 2018 and a mere 6 pages in 2019. What do we take from this? That, when it comes to school data, we can do more with less.
Pages of numbers have been replaced by criteria-driven statements drawn from a master list and - perhaps most striking of all - there is little or no data on pupil groups. Why? This was best explained in a speech given by Amanda Spielman, HMCI, in 2018:
“Nor do I believe there is merit in trying to look at every individual sub-group of pupils at the school level. It is very important that we monitor the progress of under-performing pupil groups. But often this is best done at a national level, or possibly even a MAT or local authority level, where meaningful trends may be identifiable, rather than at school level where apparent differences are often likely to be statistical noise.”
This is a key issue with school data: the groups are often too small, there is too much overlap between groups, and outliers skew averages. And yet governors still expect data to be broken down in that way. This is the legacy of RAISE.
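To make the small-group point concrete, here is a minimal, purely illustrative sketch (the progress scores are hypothetical, not drawn from any real school): in a group of half a dozen pupils, one extreme result is enough to flip the group average from positive to negative - exactly the kind of statistical noise Spielman describes.

```python
# Illustrative sketch with hypothetical figures: the average progress score of a
# small pupil group swings sharply when a single outlier is included.
group_scores = [0.5, 1.0, -0.5, 0.8, 0.2]   # five pupils, hypothetical progress scores
outlier = -8.0                               # one pupil with an extreme (atypical) score

avg_without = sum(group_scores) / len(group_scores)
avg_with = sum(group_scores + [outlier]) / (len(group_scores) + 1)

print(f"Group average without outlier: {avg_without:+.2f}")  # +0.40
print(f"Group average with outlier:    {avg_with:+.2f}")     # -1.00
```

The same swing would be invisible in a national or local-authority dataset of thousands of pupils, which is why trends for pupil groups are more meaningful at that level than at school level.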
Governors need a firm idea of what’s happening in the school but are not in the classrooms. They therefore rely on data to provide that picture. But what if the data is not reliable?
Beyond the issue of statistical noise there is the additional problem of bias and distortion. We know that the pressures of accountability can compromise the integrity of data: using it not only to monitor and support pupils’ learning but also to report standards to governors and others can cause it to become bent out of shape.
As the final report of the Commission on Assessment without Levels suggests: ‘School leaders should be careful to ensure that the primary purpose of assessment is not distorted by using it for multiple purposes.’
Avoid breaking the data down into numerous sub-groups. Instead, governors should be asking about the effectiveness of the support provided to meet the needs of pupils who are working below expectations or falling behind.
An update from the SENDCO or subject leader at a governors’ meeting is far more insightful than analysis of clumsy data.
Reducing learning to a set of neat numerical proxies is fraught with risk. We are liable to infer that any shift in results reflects the quality of teaching and the curriculum – the school effect – an inference that ignores the multitude of variables beyond the school’s control.
Data is essential - and unavoidable – in schools; we just need to be honest about its limitations.
What governors need to know
- Inspection Data Summary Report (IDSR): Ofsted’s view of school performance; it’s vital that all governors get sight of it.
- Demographics (1 page): a brief summary of the school’s context showing percentages in key groups, attendance figures, and mobility compared to national figures. A breakdown by year group will help explain why some cohorts may perform better than others.
- Results summary (2-3 pages): headline figures (key stage results) for the past 3 years compared to national averages to give an idea of trends.
- Current attainment overview (1-2 pages): percentages of current cohorts working at or above expectations compared to prior attainment, or on track for certain grades compared to targets. Focus on reading, writing and maths.
James Pembroke is a former data analyst in a local authority school improvement team and co-author of Dataproof Your School.