The problem with progress: a guide

Six years on from the government announcing its intention to get rid of levels, many schools are still struggling with the issue of how best to measure progress. An increasing number, however, are coming to realise an inconvenient truth: that maybe it just can’t be done and any attempt to do so is a waste of time. What do these measures really tell us? What impact do they have on learning? And are they just abstract concepts generated to keep other people happy?

For schools that have been measuring progress in various ways using various systems for the past 20 years, the idea that progress is not a measurable entity may seem ridiculous, but certain facts do need to be taken into consideration before we plough on regardless. Governors – possibly the key audience for this data – certainly should be aware of developments and current thinking around schools’ use of data to measure progress.

Here’s a rundown of the main points:

Levels are dead

‘Levels of progress’ was a key accountability measure for so long that it became part of the furniture. These were published measures with floor standards attached: pupils had to make two levels between KS1 and KS2, and three levels between KS2 and KS4. The DfE then published the percentage of pupils making this ‘expected progress’. There were a number of problems with this, the main one being that it implied progress was linear and that making the same number of levels from any start point was of equal value. In a primary school, therefore, a pupil progressing from level 2 to level 4 (the supposedly ‘normal’ journey) was counted as having made the same amount of progress as a pupil progressing from level 3 to level 5, from 1 to 3, or from below level 1 to level 2. This was a fallacy: all these journeys are different. They are not comparable, and they are not equally probable.
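The flaw is easy to see if you write the old check down. Here is a minimal sketch (hypothetical code, not the DfE’s actual calculation) of a measure that only counts the number of levels gained:

```python
# Hypothetical sketch of the old 'expected progress' check, which only
# counted the *number* of levels gained between key stages.
def made_expected_progress(start_level: int, end_level: int,
                           required_gain: int = 2) -> bool:
    """True if the pupil gained at least `required_gain` levels."""
    return (end_level - start_level) >= required_gain

# Very different journeys all collapse to the same 'yes':
journeys = [(2, 4), (3, 5), (1, 3)]  # 2->4, 3->5, 1->3
results = [made_expected_progress(s, e) for s, e in journeys]
```

The measure cannot distinguish a high attainer holding pace from a low attainer catching up: it sees only the difference between two numbers.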

The next issue arose when levels were subdivided for the purposes of measuring progress over shorter periods. Levels were split into poorly defined sublevels, which were then assigned points, and before long ‘3 points per year’ and ‘a point per term’ became a common currency in schools, with local authorities and with Ofsted. Schools were having serious conversations about whether a pupil was at 2B or 2B+ despite no one being able to define what the difference actually was. Ever smaller increments were invented to provide a system capable of measuring ever smaller steps of progress. This was an illusion.

In the end, the system of levels became so broken and misappropriated that the DfE took the decision to scrap them. The hope was something more meaningful would take their place.

Levels were reinvented

Sadly, those more meaningful systems didn’t materialise. Almost the instant the announcement was made to remove levels, they were, perhaps inevitably, reinvented. Three-points-per-year systems – essentially rebadged sublevels – dominated, but soon more complex versions appeared claiming to measure even smaller increments: 4, 6, 7, 9, even 10 steps per year. Schools (and suppliers of tracking systems) were simply subdividing existing measures to give the impression of greater accuracy. But inventing more bands does not prove that pupils have made more progress; it just proves that someone has invented more bands.

In reality, the steps used in these systems were subjective, lacked robust criteria, and were prone to bias. It was perhaps inevitable that pupils would be placed in the lowest possible band at the start of the year and moved into a higher band at the end in order to maximise the progress numbers. Data becomes focused on performance management and quickly disconnects from reality.

Progress is not linear

As stated above, this was one of the major failings of levels, or rather of the progress measures based on levels. Pupils do not all progress in the same way through the same content over the same period of time. Progress is messy, but messy does not make for a neat measure, so assumptions are made. Unfortunately, these assumptions – that there is a magic gradient all pupils follow – do not match reality. The measures commonly employed disregard reality in order to fit the limitations of an Excel formula.

The issue is illustrated by the following example. A teacher states that ‘expected progress’ is defined as a pupil maintaining ‘expected standards’ from KS1 to KS2. This sounds logical. The problems arise when we consider pupils who did not meet expected standards at KS1. Is remaining ‘working towards’ also expected progress? Or do we expect such a child to catch up? And what about pupils who were well below, i.e. pre-key stage, at KS1? Do we expect them to remain at pre-key stage or to catch up? For some, remaining at pre-key stage could be considered good progress; for others, that may be less than expected. Another issue relates to pupils who were working above the expected standard at the previous key stage. ‘Expected progress’ for them is to maintain those high standards, but in simple measures, ‘above expected’ progress is off the cards because they have already reached a ceiling. This was an issue with levels and is still an issue now.

This is because…

There is no such thing as expected progress

In 2017, Ofsted released an inspection update that stated:

‘Expected progress’ was a DfE accountability measure until 2015. Inspectors must not use this term when referring to progress for 2016 or current pupils.

This was an important step because it acknowledged the issues and gave schools the confidence to move away from the measures they had been shackled to up until that point. If Ofsted weren’t going to ask for such data, then perhaps schools could stop generating it and do something more useful instead.

But surely teachers have an expectation of progress?

Yes, definitely, but a teacher’s expectation varies from pupil to pupil and is often at odds with the simplistic measures that generate the numbers for governors and the like. Progress is a multi-faceted thing – it’s catching up, overcoming barriers, filling gaps, and deepening understanding – none of which is easily measured on its own, let alone in combination. Measuring it all is an impossible task, one which Ofsted’s National Director of Schools described as a “mug’s game”.

When we talk about progress, we are usually referring to the pupil’s journey through the academic curriculum, but it could also relate to overcoming social and emotional barriers. Again, such things are near impossible to measure, and yet they affect academic progress immensely.

How do you define a step or a point of progress?

In order to measure progress, you need a scale, and if it’s based on teacher assessment, this will inevitably mean reinventing levels in some form. Secondary schools produce ‘flightpaths’ by counting back GCSE grades and primary schools often have some kind of points-based system. But how do you define a point of progress? How does it differ from the ‘2B+’ mistakes of the past?

In many primary schools, each point of progress is simply based on how many learning objectives have been ticked off the list for each child, and these are apportioned equally according to the number of data drops. Basically, if you are measuring progress three times per year – and have 3 points per year – then you expect 33% of the curriculum to be secured by Christmas and 67% by Easter. Unfortunately, this is not how a curriculum is designed or delivered. No school expects all pupils to secure 33% of the year’s curriculum each term in each subject in each year group. It’s a massively simplistic assumption made in order to fit the rules of a system. It can also result in teachers, under pressure to ‘show sufficient progress’, ticking an extra box or two to push pupils over the next threshold and gain another point.

The result is a made-up metric distorted by the pressures of high-stakes accountability.
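The crudeness of the apportionment is clearer when written out. A minimal sketch of the tick-list logic described above (hypothetical code; the objective counts and thresholds are illustrative, not any particular tracking system):

```python
# Hypothetical sketch of a 'points from ticked objectives' system with
# three data drops per year: a point is awarded each time a pupil's
# ticked fraction crosses another third of the year's objectives.
def points_awarded(objectives_ticked: int, total_objectives: int,
                   drops_per_year: int = 3) -> int:
    """Convert a tick-list into 'points' via equal thresholds.

    Integer arithmetic: floor of (fraction ticked * drops per year).
    """
    return (objectives_ticked * drops_per_year) // total_objectives

# With 60 objectives, a pupil with 19 ticks at Christmas scores 0 points;
# one extra tick pushes them over the 33% threshold to 1 point.
christmas_short = points_awarded(19, 60)   # 0 points
christmas_extra = points_awarded(20, 60)   # 1 point
```

Note how a single tick flips the headline number: exactly the incentive, under accountability pressure, to tick the extra box.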

But doesn’t the DfE measure progress?

They do, and those measures are not without their issues. The DfE uses value-added measures, which are near impossible to emulate in a school: they require access to national datasets in order to compare a pupil’s attainment with the average attainment of pupils with the same start point nationally. Unless you have access to a huge amount of standardised test data on an ongoing basis, it’s not worth trying to second-guess the DfE’s measures. And even if you do have access to such data, it’s unlikely to match the final accountability measures. Best not to get distracted.
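For illustration, the core of a value-added measure works roughly like this (a hypothetical sketch with made-up numbers; the DfE’s actual methodology groups pupils by prior attainment across the full national dataset, which is exactly the data schools lack):

```python
from collections import defaultdict
from statistics import mean

def value_added(pupils, national_cohort):
    """Each pupil's outcome minus the national average outcome for
    pupils with the same start point.

    pupils / national_cohort: lists of (start_point, outcome) pairs.
    """
    # Build the national benchmark: average outcome per start point
    outcomes_by_start = defaultdict(list)
    for start, outcome in national_cohort:
        outcomes_by_start[start].append(outcome)
    benchmark = {s: mean(o) for s, o in outcomes_by_start.items()}
    # A pupil's VA score is their outcome relative to that benchmark
    return [outcome - benchmark[start] for start, outcome in pupils]

# Made-up national data: pupils starting at 2 average 102 at outcome;
# pupils starting at 3 average 112.
national = [(2, 100), (2, 104), (3, 110), (3, 114)]
scores = value_added([(2, 105), (3, 110)], national)
# first pupil is 3 above benchmark, second is 2 below
```

The dependence on `national_cohort` is the point: without the full national picture, a school’s in-house version of this calculation is guesswork.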

What about pupils with SEND?

This is a sticking point in many schools: the need to have a measure for those pupils working below the expected standard. Schools tie themselves up in knots attempting to come up with a way of measuring their progress, and all manner of systems have been created for this purpose, usually involving some kind of P-scale-style incremental scale. The problem, again, is the concept of expected progress. For some pupils, you may expect them to close the gap over time and progress to the point where they can access tests. For others, with different needs and start points, good progress may be more modest. In some cases, the expectation may be to slow the rate of decline. Expected progress varies from pupil to pupil and can’t be measured. A far better option is to investigate the effectiveness of provision on a case-by-case basis, and to present individualised (but anonymised) data – important contextual information alongside assessment data over time – to build a picture of learning that informs discussion.

Fortunately…

Ofsted doesn’t look at internal data

The statement on expected progress made by Ofsted in 2017 was followed up in 2018 with a speech given by Amanda Spielman, Ofsted’s HMCI, in which she said:

“we do not expect to see 6 week tracking of pupil progress and vast elaborate spreadsheets. What I want school leaders to discuss with our inspectors is what they expect pupils to know by certain points in their life, and how they know they know it. And crucially, what the school does when it finds out they don’t! These conversations are much more constructive than inventing byzantine number systems which, let’s be honest, can often be meaningless.”

This was a statement of intent. Ofsted would now be discussing pupil progress with teachers rather than looking at data. Narrative instead of numbers. The writing was on the wall for Ofsted’s engagement with internal data, and sure enough, when the new Education Inspection Framework was published, it contained this statement:

Inspectors will not look at non-statutory internal progress and attainment data on section 5 and section 8 inspections of schools.

Perhaps more ominously, this was followed by:

Inspectors will ask schools to explain why they have decided to collect whatever assessment data they collect, what they are drawing from their data and how that informs their curriculum and teaching.

Schools would now have to justify the data they collect in terms of the quality of the information it provides, the impact it has on learning, and the effect it has on teachers’ workload.

Ofsted were going to war against bad data. Why? Because too much time is wasted in pursuit of numbers that have no impact; and because all too often the data is of low quality, and even – where stakes are too high – made up.

What does this mean for schools and governors?

It means that schools have licence to ditch abstract measures that have very little impact on learning, and to overhaul labour-intensive data collection processes. It also means that governors should question the usefulness of the data presented to them, enquire about its impact on learning, and consider the possibility that data is being produced solely for them. Governors can help reduce workload: don’t ask for low-value data you don’t need, and don’t try to steer the school’s assessment policy.

And we need to shift our perception of tracking systems. They should be seen as data libraries for storing a wide range of useful information to build up a picture of learning over time, rather than as tools for measuring progress. It’s our obsession with measuring progress that breaks assessment.

Ultimately, progress should be a narrative, not a number.



9 thoughts on “The problem with progress: a guide”

  1. Leigh
    on July 16, 2020 at 9:35 pm

    That’s the issue in a nutshell! Well done!

    1. James Pembroke
      on July 17, 2020 at 10:23 am

      Thanks Leigh

  2. Gill Leonard
    on July 19, 2020 at 10:17 am

This absolutely sums up the difficulties we have been struggling with for years – in my case as a headteacher and now as Chair of Governors. Thank you

  3. Alan
    on August 18, 2020 at 9:46 am

James, really good article. I have been reading up on the very subject of your article for a number of years now – for instance, Christodoulou’s “Making Good Progress”, which is very much aligned with what you are stating here. However, what I cannot find are examples of approaches to this area which offer an alternative to reinventing levels. Do you have any more specific suggestions on this side of things? Or know of schools adopting alternative and reliable approaches to measuring progress?

    1. James Pembroke
      on August 20, 2020 at 8:24 am

Hi Alan. Pleased to hear you liked the post. The easiest (and most common) approach is to make an assessment that indicates whether students are working securely/keeping pace or not, i.e. are they currently working below, doing OK, or doing really well. Clear and simple, with no hierarchical flight path. This can be backed up with other data such as standardised tests. Happy to have a chat if you like. Email me (address on the website).

  4. Nicola
    on October 21, 2020 at 8:25 am

This is such an inspirational read – I am one of those headteachers with a reinvented points system. We are a low-attaining primary school. We are moving away from progress measures slowly and the emphasis has shifted, but it takes nerve, especially as we had Ofsted in 2019 (admittedly under the old framework) where progress data ‘saved’ us and secured our Good grading. Determined to find a solution that works for us.

    1. James Pembroke
      on October 21, 2020 at 9:38 am

      Thanks Nicola. Pleased to hear that the post was useful. Let me know if I can help at all.

  5. Emma Bircham
    on January 28, 2022 at 5:18 pm

    Hello James,
This is such a sensible article, and having taught for 25 years I cannot tell you the number of conversations I have had about pupils being individuals who don’t progress in a linear or expected way. Do you know of any schools that are using this approach successfully?

    1. James Pembroke
      on February 8, 2022 at 2:08 pm

Hi Emma. Sorry for the late reply; pleased to hear you liked the post. Lots of schools are tracking in more sensible and meaningful ways. I’d be happy to have a chat about it if that would help. Feel free to email me (see website for contact details).


© 2024 Sig+ for School Data. All Rights Reserved.