You may be surprised to hear that I don’t know a great deal about whisky making. OK, maybe not, but I’d wager that if you were in the business of ageing a 12-year-old single malt, you probably wouldn’t expect to notice much difference if you tasted it every 6 weeks. You might think you can, but it’s probably just because you made up your mind before tasting it.
In recent weeks I’ve blogged about the dubious approaches of certain tracking systems towards assessment without levels; how they are attempting to shoehorn the new curriculum into the old methodology of points scores whilst trying to convince us they’ve done something new. I’ve also written about the pressure many primary schools are under – particularly the more vulnerable ones – to continue quantifying small steps of progress in ‘APS’; to ‘prove’ that pupils are making ‘good’ progress over short periods of time even though such measures are meaningless, irrelevant, and at odds with the new curriculum.
My big fear is that many people involved in school improvement are still defining ‘good’ progress in terms of extension in a curriculum that focuses on depth; and that some tracking systems, with their adherence to the old orthodoxy of points – albeit with a bit of window dressing – are not exactly helping schools escape the gravitational pull of levels.
Senior leaders are understandably anxious. When I discuss these issues in schools – my main topic of conversation these days – someone will always say:
“But we have to show progress”
This tells us a lot about the primary role of data in many schools. It exists not so much to inform teachers and senior leaders about pupils’ learning, but rather for the purposes of accountability: a defence shield against scrutiny and judgement. The data is becoming disconnected from the classroom as a consequence and, worse still, may not mean anything at all.
Eventually, after some debate, we move away from “but we have to show progress” and arrive at the inevitable question:
“So, how do we show progress?”
My original answer of “I don’t know”, which was usually met with howls of derision and gnashing of teeth, has now been replaced by the slightly more definitive “I’m not sure we can”. This needs qualifying. I’m not saying we can’t measure progress; I’m saying we can’t measure progress in the way we’ve become accustomed to. A curriculum in which most pupils are expected to learn at broadly the same pace, and where extension into the next year’s content is the exception rather than the rule, limits our options for measuring progress in terms of coverage. The depth of pupils’ understanding – the degree to which they can use and apply what they’ve learnt – is the key now, but it’s not something that is easily measured. And assigning values to such an abstract concept is most likely a fallacy, especially if we begin to try to show changes in depth of understanding over short periods just because “we have to show progress”.
Essentially, we must resist the temptation to quantify something that can’t be quantified just to satisfy demands based on some obsolete notion of progress. When the data is without foundation and is at odds with reality, it is akin to knowingly navigating with an out-of-date map and still expecting to arrive at your destination.
To put it bluntly, don’t make it up.
So, let’s return to the original question: “How do we show progress?”
I’ve touched on tracking systems above, and blogged on the subject extensively. There are clearly issues with certain approaches and schools will no doubt be finding these out for themselves now that we are halfway through the year. I found myself discussing one such approach with a headteacher I met in the street recently and after trying to recall and explain its complexities for 10 minutes or so I had a slight out-of-body experience. To anyone listening in, it must have sounded like we were talking total nonsense.
Surely what we want is simplicity. If levels, and their sublevel and APS offspring, are too complicated, then whatever replaces them needs to be more straightforward. Most systems are settling on an approach involving 3 broad steps of learning, often with further subdivisions. Michael Tidd’s key objectives model sticks to 3 simple categories of ‘below’, ‘meeting’, and ‘exceeding’. Target Tracker’s system involves 6 steps across the year, from ‘below/below+’, through ‘working within/working within+’, to ‘secure/secure+’. OTrack have opted for a 7-point scale which includes 3 for ‘working towards’ and a final category of ‘exceeding’, which sits above ‘mastery’ (which makes sense to me). Focus Education have categories of ‘emerging’, ‘expected’, and ‘exceeding’, each subdivided into 3 steps to give a 9-point scale, the 9th point being reserved for those pupils who have shown mastery of all main objectives and completed all extension activities. Insight Tracking, who aren’t tied to any particular assessment model, have developed a flexible system that accepts pretty much any metric.
Another system with a 6-point scale, similar to that employed by Target Tracker, is Chris Quigley’s Milestones. There is, however, one crucial difference: the Chris Quigley model assesses a pupil’s depth of understanding across a 2-year period, the expectation being that the pupil reaches step 4 by the 2-year milestone, with steps 5 and 6 indicative of mastery.
Such a system, which eschews attempts to quantify small steps of progress in favour of a long-term mastery approach, is daunting to some, especially those who “have to show progress” since the start of term, but that’s what makes it compelling.
So, ask yourself this: who are you assessing for? Does your data actually tell you anything about your pupils’ learning? Or is it for another purpose?
It’s time to entertain the possibility that maybe it’s just not possible to quantify small steps of progress. And if that’s the case, why bother continuing to do so? The sooner we move away from this deeply entrenched progress fallacy, the sooner we can start to assess in a meaningful way, one that actually informs us about pupils’ learning.
This is, after all, why levels were removed in the first place.