The Data Trap

The most vulnerable schools – those in RI or special measures, or those identified as weak by an LA or sponsor – have the greatest data burden placed upon them. That’s a given. Despite the drive for schools to develop new ways of assessing and tracking pupil progress, these ‘schools causing concern’ have less room to innovate, and are probably less willing to do so. Better not rock the boat, just give them what they want. Even if what’s being asked for is a) going to take a huge amount of time and resources, and b) complete, irrelevant nonsense.


Meanwhile, schools in a more secure position – those that have recently been inspected and had a good outcome – are left to their own devices. They can discuss, research, develop and test new methods of assessment, and find a system that works for them. They are free to innovate and take the time needed to get their approach to assessment without levels right, thus ensuring it is tailored to the needs of their curriculum. That is the way it should be, and the whole point of removing levels in the first place.

The vulnerable schools, on the other hand, often find that they are straitjacketed by the data requirements placed upon them, and that these begin to dictate how they operate. Often, the various parties involved – LA advisors, consultants, HMIs – are requesting different data sets, ramping up the workload even further. The result is an increasingly unnecessary data burden requiring greater system complexity, which is completely at odds with the general drive for simplification. Data risks becoming disconnected from its primary function – to inform teaching and learning – and, at worst, ends up as an exercise in itself.

Rules for one

I recently visited a fantastic primary school in Gloucester to catch up on the development of their new assessment and tracking system. They are designing a hybrid system, taking elements and ideas from two assessment models, and working closely with a tracking system company to ensure that the data can be easily captured and analysed. This is as it should be: design the assessment system that suits your curriculum and then tailor the tracking to fit; and they have the time, resources and confidence to do this. Exciting stuff!

Then I spoke to a headteacher in another county who is in a very different situation. Her school has been identified by the LA as one causing concern. Some of their results were lower than expected and she is now under pressure to prove that pupils are making better than expected progress, and that results will improve this year. Worryingly, she told me that the advisor the LA has assigned to her school made her feel deeply inadequate when it came to her understanding of data. Yet in the ensuing conversation she explained her ideas for assessment and tracking in the new curriculum, and they almost exactly matched the plans of the school I mentioned above. Unfortunately this headteacher didn’t feel confident enough to implement radical changes right now, opting instead to toe the party line. She had the right ideas but was being prevented from developing them by the data demands placed on her school: an example of producing data purely for the sake of accountability rather than to meet the needs of the new curriculum and the pupils in the school. Bureaucracy stifles innovation.

One school I’ve been supporting has been told by their HMI that all pupils need to make at least 2 points of progress this term. Seriously, what is 2 points of progress these days? How does a pupil make a sublevel of progress in a curriculum without levels? The school lets out a collective sigh, shrugs its shoulders and mutters something along the lines of “well, I suppose we’d better give them what they want”. Consequently the school has decided to delay the transition to assessment without levels until they are out of the situation they’re in. The data has therefore become disconnected from its primary function of informing teaching and learning, and instead exists predominantly for the purposes of accountability. I struggle to see how this helps a school improve. 

In another school, a consultant working on behalf of a sponsor issued the Head with a data pack to complete: a behemoth requiring every conceivable shred of data, broken down by subject, cohort and group. It would take ages to complete; days at least, probably weeks. Meanwhile they have a school to run.

And the madness is that the data pack was almost entirely concerned with the percentages making expected and better than expected progress in terms of APS. None of this fitted with the tracking system the school has in place, so the options were to a) adapt the data pack to fit the data (sensible), b) make some data up (not advisable, but essentially what you are doing by producing progress data in APS), or c) rip it up and instead present some meaningful data that actually relates to what is happening in the school (my preferred option).

These situations arise because someone in a position of influence and power is poorly informed about assessment and tracking, and about what a school can and should provide. And these issues disproportionately affect vulnerable schools that really should be concentrating on other things. The big risk is that these schools either delay implementing a new assessment system in order to keep satisfying these inappropriate and unnecessary data demands, or they end up running two systems in parallel: one for teaching and learning, and one for bureaucracy. Neither is acceptable.

What’s reasonable, useful and relevant can be boiled down to 4, maybe 5, indicators: % at/above age-related expectations, % making expected/better than expected progress, and % on track to meet targets (for smaller cohorts, numbers of pupils may be more appropriate than percentages). That’s pretty much all you need, and a year group can be presented on a single sheet, with rows for the different pupil groups. Such generic data is, and always will be, readily available from tracking systems, and should be enough to satisfy any LA advisor, SIP, Ofsted inspector or HMI.
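To make the shape of that single sheet concrete, here is a minimal sketch in Python. The pupil records, field names and group labels are hypothetical stand-ins for whatever your tracking system actually exports; the point is simply one row per group, a column for cohort size, and a percentage column for each headline indicator.

```python
# A minimal sketch of the single-sheet summary described above.
# The records, field names and group labels below are hypothetical;
# substitute whatever your tracking system exports.
from collections import defaultdict

# Each record: the groups a pupil belongs to, plus a yes/no judgement
# for each of the headline indicators.
pupils = [
    {"groups": ["All", "Boys", "Pupil Premium"], "at_expected": True,  "expected_progress": True,  "on_track": True},
    {"groups": ["All", "Girls"],                 "at_expected": False, "expected_progress": True,  "on_track": False},
    {"groups": ["All", "Girls", "SEN"],          "at_expected": True,  "expected_progress": False, "on_track": True},
]

indicators = ["at_expected", "expected_progress", "on_track"]

# Tally cohort size and indicator counts for every group a pupil belongs to.
counts = defaultdict(lambda: {"n": 0, **{i: 0 for i in indicators}})
for pupil in pupils:
    for group in pupil["groups"]:
        counts[group]["n"] += 1
        for i in indicators:
            counts[group][i] += pupil[i]

# One row per group: cohort size plus a percentage for each indicator.
print(f"{'Group':<16}{'N':>4}" + "".join(f"{i:>20}" for i in indicators))
for group, c in counts.items():
    row = "".join(f"{100 * c[i] / c['n']:>19.0f}%" for i in indicators)
    print(f"{group:<16}{c['n']:>4}" + row)
```

For a small cohort or group you would report the raw counts rather than, or alongside, the percentages, as noted above.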

In case anyone needs further convincing, refer to paragraph 191 of the Ofsted handbook and commit it to memory. It could help you keep control of your data:

‘Inspectors will not expect to see a particular assessment system in place and will recognise that schools are still working towards full implementation of their preferred approach. However they will evaluate how well pupils are doing against relevant age-related expectations as set out by the school and the national curriculum (where this applies).’

And if you are ever asked to produce meaningless, obsolete data by someone who really should know better, then slap down a ten pound note on the desk in front of them and ask them to convert it into shillings. It’s pretty much the same thing.

I’ll wrap this blog up with the following thought:

In order to get assessment and tracking right we need to ignore the pressures of accountability and performance management, and pretend Ofsted doesn’t exist. Assessment is for learning; the rest is noise.



One thought on “The Data Trap”

  1. Anonymous
    on January 23, 2015 at 9:55 am

    I have also supported schools in special measures and completely concur. Well-meaning management consultants often make needlessly clumsy and huge data demands, I suspect partly just to prove compliance. These requests often make improvement harder by burdening management teams with unfamiliar indicators which are often inappropriate in context.

    Consultants should not assume that struggling schools always have defective monitoring systems.

    If you are supporting a struggling school, you should keep data requests simple, to the point and relevant to the school’s tracking systems and curriculum.
