Content of the new RAISE report: The Good, the Bad and the Ugly

The DfE have now released details of the content and format of the new RAISE summary reports, to be published in the Autumn term. As expected, they are going to look considerably different to previous versions; and it’s not just the obvious stuff that’s changed. Of course, there will no longer be pages filled with percentages achieving level 4 and level 5, and progress matrices are consigned to history, but there are also big changes in the way data is put into context, most notably with new national comparators and prior attainment groups for both KS1 and KS2. The new KS2 progress measure uses a similar methodology to existing VA but scores will be in a new format; and colour coding – perhaps the thing we are most fixated on – has changed and now comes with a subtle twist. Some of these changes I like, some I’m not so sure about and some really bother me. Here’s a rundown of the good, the bad and the ugly.

The Good

Comparators
There appears to be a big change to the way the performance of groups is compared. Previously, this has been a mess, with results of key groups compared against national ‘averages’ for all pupils, for the same group, or for the ‘opposite’ group, with no indication as to which is most useful or relevant. The analysis of the performance of disadvantaged pupils was particularly confused, with comparisons made against national comparators for disadvantaged pupils and for all pupils in some parts of the report but, most critically, against non-disadvantaged pupils in the closing the gap section; a section that was – somewhat bizarrely, considering its importance – tacked onto the end of the report. This mess seems to have been addressed in the new reports, so we should now be more aware of the relevant benchmarks for each group. For example: boys, girls, no SEN, non-disadvantaged and EAL groups will be compared against the same group nationally; disadvantaged, FSM and CLA pupils will be compared against the national figures for non-disadvantaged pupils (as per closing the gap); and the SEN support and SEN with statement or EHC plan groups will be compared against national figures for all pupils. OK, I don’t quite get the rationale behind the last bit: either SEN pupils should be compared against SEN pupils nationally, or not compared against anything. At least in the interactive reports, users will be able to switch the comparator to ‘same’ for all groups, allowing a like-for-like comparison.
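To keep those benchmarks straight, the comparator rules as described above boil down to a simple lookup. The group labels here are paraphrased from the text, not actual RAISEonline identifiers:

```python
# Comparator rules as described above, expressed as a lookup table.
# Labels are paraphrased from the blog text, not RAISEonline field names.
COMPARATORS = {
    "boys": "same group nationally",
    "girls": "same group nationally",
    "no SEN": "same group nationally",
    "non-disadvantaged": "same group nationally",
    "EAL": "same group nationally",
    "disadvantaged": "non-disadvantaged pupils nationally",
    "FSM": "non-disadvantaged pupils nationally",
    "CLA": "non-disadvantaged pupils nationally",
    "SEN support": "all pupils nationally",
    "SEN with statement or EHC plan": "all pupils nationally",
}

print(COMPARATORS["disadvantaged"])  # non-disadvantaged pupils nationally
```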

Prior attainment
Big changes here. One of my main criticisms of RAISE is the lack of progress data for KS1. Previously, schools were judged on attainment alone. The new RAISE reports, whilst not providing VA scores for KS1 (KS1 VA being one of the many things I like about FFT dashboards), will put the KS1 results into context by splitting cohorts into prior attainment groups based on EYFS outcomes. So, the KS1 results for those pupils that achieved a good level of development in the foundation stage will be shown, presumably alongside a national comparator. There will also be a further breakdown of KS1 results for those pupils that were emerging, expected or exceeding in the early learning goals for reading, writing and maths. I know there are many who will disagree with this approach, but too many schools are forced into producing this data themselves when KS1 results are low. The lack of KS1 progress data in RAISE has also presented primary schools with a major dilemma: do they go high to ensure ‘good’ KS1 results, or go low and gain more progress at KS2? Hopefully, with KS1 results now placed in the context of prior attainment, this pressure will ease somewhat.

We will also see new subject-specific prior attainment groups for progress measures at KS2. For example, progress in maths will be shown for pupils with low prior attainment in reading, or middle prior attainment in writing. I assume the definitions of these bands are simply low = W or L1, middle = L2, high = L3, which differs from the point-score thresholds used for the overall prior attainment groups based on APS at KS1. This new approach is welcome as it goes a long way to addressing concerns about the new VA measure outlined here. Whilst the main VA model is based on KS1 APS and will therefore result in pupils with contrasting prior attainment in English and maths being grouped together, these new prior attainment groups will allow us to unpick the progress data and isolate the issues.
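As a quick sketch of that banding – and assuming the W/L1 = low, L2 = middle, L3 = high definitions above, which are my reading rather than confirmed DfE thresholds – the mapping is trivial to express:

```python
# A minimal sketch of subject-specific banding. The band boundaries
# (W or L1 = low, L2 = middle, L3 = high) are an assumption, not
# confirmed DfE definitions.

def subject_band(ks1_level: str) -> str:
    """Map a KS1 teacher assessment level to a prior attainment band."""
    level = ks1_level.upper()
    if level in ("W", "1", "L1"):
        return "low"
    if level in ("2", "2C", "2B", "2A", "L2"):
        return "middle"
    if level in ("3", "L3"):
        return "high"
    raise ValueError(f"Unrecognised KS1 level: {ks1_level!r}")

# A pupil with L1 in reading and L3 in maths lands in different bands
# for each subject - exactly the contrast that an overall APS-based
# grouping would blur together.
print(subject_band("L1"))  # low
print(subject_band("3"))   # high
```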

Subject data on a page
All information about each subject (i.e. progress score, average score, %EXS+ and % high score, for the cohort and key groups) will be shown in one table, which is great news because up until now it’s been all over the place. Previously, we’ve had to scroll up and down through about 30 pages of attainment and progress data to get the headlines for KS2, forcing us to create our own templates to compile the key figures. Hopefully now we’ll just need to refer to a handful of key pages, which will be very welcome.

A shorter report?
Reading between the lines here, I’m hoping we’ll have a slimmed down RAISE report this autumn. 60 pages was too much. How about 20 pages? How about just 10 and ditch the dashboard? Please let that happen.

The Bad

Comparing results of SEN pupils against those of all pupils nationally is certainly not great. That should be changed to a like-for-like comparison by default, rather than the onus being on the school to do this via the interactive reports in RAISEonline. Also, writing VA continues to worry me, and it doesn’t look like that will be changing anytime soon. I look forward to seeing the methodology, but I’d rather it was ditched from progress measures. My other bugbear is percentages for small groups, and it looks like that farce is set to continue. I don’t think there are many primary schools where percentages make any sense once you drill down to group level, even when the gap from the national ‘average’ is expressed as a number of pupils. I would prefer analysis of group data to focus on average scores, but even that is flawed in that it can easily be skewed by anomalies. The presentation of data for small cohorts and groups needs some serious thought.
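To put a number on the small-groups problem, here is a trivial calculation of how many percentage points a single pupil is worth at various group sizes:

```python
# Why percentages mislead for small groups: the percentage-point
# value of a single pupil grows rapidly as the group shrinks.

def one_pupil_swing(group_size: int) -> float:
    """Percentage points that one pupil represents in a group."""
    return 100.0 / group_size

for n in (6, 10, 30, 60):
    print(f"group of {n:>2}: one pupil = {one_pupil_swing(n):4.1f} percentage points")

# group of  6: one pupil = 16.7 percentage points
# group of 10: one pupil = 10.0 percentage points
# group of 30: one pupil =  3.3 percentage points
# group of 60: one pupil =  1.7 percentage points
```

In a group of six, a single pupil moves the headline figure by nearly 17 percentage points, which is why a one-pupil ‘gap’ can look like a chasm.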

The Ugly

Sometimes we should be careful what we wish for. I have major concerns with the application and interpretation of significance indicators in RAISE and have called for a more nuanced approach. And now we’ve got one. The first thing to note is that red replaces blue as the colour of ‘bad’. Many evidently aren’t happy about this, but the writing was on the wall once red dots were used in the inspection dashboard and the closing the gap section of the RAISE report. Red is also used to indicate data that is significantly below average in FFT reports. The second thing to note is that the double meaning of the colour coding, introduced in the inspection dashboard, continues: red can either mean data that is significantly below average, or signify a gap from the national average that equates to one or more pupils in percentage terms. The third thing to note is that we now have shades of red and green, defined as follows:

Pale red: indicates that data is significantly below the national average but not in the bottom 10%; or denotes a negative percentage-point difference equivalent to a ‘small’ number of pupils.

Bright red: indicates that data is significantly below the national average and in the bottom 10%; or denotes a negative percentage-point difference equivalent to a ‘larger’ number of pupils.

Pale green: indicates that data is significantly above the national average but not in the top 10%; or denotes a positive percentage-point difference equivalent to a ‘small’ number of pupils.

Bright green: indicates that data is significantly above the national average and in the top 10%; or denotes a positive percentage-point difference equivalent to a ‘larger’ number of pupils.
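Treated as the decision table it is, the significance-based colouring comes down to a few lines of logic. This is an illustrative sketch of the rules as described above, not code from RAISEonline, and it covers only the significance-coded cells (the pupil-number colouring follows a separate, arbitrary threshold):

```python
# Sketch of the significance-based shade rules described above.
# Names are illustrative; this is not RAISEonline's implementation.

def cell_shade(significant: bool, below: bool, in_extreme_decile: bool) -> str:
    """Return the colour for a significance-coded cell.

    significant       -- result differs significantly from the national average
    below             -- direction of the difference (True = below average)
    in_extreme_decile -- in the bottom 10% (if below) or top 10% (if above)
    """
    if not significant:
        return "no colour"
    colour = "red" if below else "green"
    shade = "bright" if in_extreme_decile else "pale"
    return f"{shade} {colour}"

print(cell_shade(True, below=True, in_extreme_decile=False))  # pale red
print(cell_shade(True, below=False, in_extreme_decile=True))  # bright green
```

Note that the bright/pale split hinges on `in_extreme_decile`, a ranking cut-off, not on any further statistical test.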

There is some serious blurring of the rules going on here. A significance test is a threshold: a school’s results are either significant or they are not. Yet this approach will no doubt result in language such as ‘very significant’ and ‘quite significant’ entering the ‘school improvement’ lexicon, despite the bright red and bright green boxes actually being defined by a decile threshold rather than being the result of an additional significance test (e.g. a 99% confidence interval). It’s bad enough that people might talk in terms of degrees of significance; it’s even worse that people will apply the term to data on which no significance test has been applied.

Inevitably, we will hear disadvantaged gaps described as ‘very’ or ‘quite’ significant because they are assigned a bright or pale red or green box, which in other parts of the report indicates statistical significance. Here, however, the colours relate to a gap equivalent to a certain number of pupils, and the thresholds used are entirely arbitrary; they are not the result of a statistical test. So we have colours meaning different things in different sections of the report – some denoting significance and others not – and shades of those colours defined by arbitrary thresholds. There is too much scope for confusion and misinterpretation, and schools will be forced to waste precious time compiling evidence to counter a narrative based on dodgy data.

No change there then.
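For the statistically minded, a toy check makes the binary point concrete. The figures and the plain z-test below are invented for illustration; the DfE’s actual methodology will use its own confidence intervals:

```python
# A toy significance check: the outcome is True or False, never
# 'quite' or 'very'. Figures are invented; not the DfE's methodology.
import math

def significantly_different(school_mean: float, national_mean: float,
                            national_sd: float, cohort: int,
                            z: float = 1.96) -> bool:
    """Two-sided test at roughly the 95% level - a pure threshold."""
    standard_error = national_sd / math.sqrt(cohort)
    return abs(school_mean - national_mean) > z * standard_error

# A cohort of 30 averaging a scaled score of 101 against a national
# average of 103 (sd 6) is not significantly different...
print(significantly_different(101, 103, 6, 30))  # False
# ...but an average of 100 is. The result flips at a threshold;
# there are no degrees of significance.
print(significantly_different(100, 103, 6, 30))  # True
```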


2 thoughts on “Content of the new RAISE report: The Good, the Bad and the Ugly”

  1. Unknown
    on July 21, 2016 at 3:12 pm

    I'm pleased that you have identified the issue of small cohorts. Working within an LA with many rural schools, this is a real issue for many HTs. Trying to explain percentages in a rural school to prospective parents, the LA or indeed Ofsted is not always a good way of making judgements about the quality of the provision. For example, when only 50% of children passed the phonics screening check because we had 6 in a cohort – one child with an EHCP and two summer-birthday boys – it was rather frustrating, and implications of poor teaching were still made. I really hope that at some point the changes you suggest are made.

  2. Unknown
    on July 21, 2016 at 3:12 pm

    This comment has been removed by the author.

