On single malt and mastery

You may be surprised to hear that I don’t know a great deal about whisky making. OK, maybe not, but I’d wager that if you were in the business of ageing a 12-year-old single malt, you probably wouldn’t expect to notice much difference if you tasted it every six weeks. You might think you could, but that’s probably just because you made up your mind before testing it.


In recent weeks I’ve blogged about the dubious approaches of certain tracking systems towards assessment without levels; how they are attempting to shoehorn the new curriculum into the old methodology of points scores whilst trying to convince us they’ve done something new. I’ve also written about the pressure many primary schools are under – particularly the more vulnerable ones – to continue quantifying small steps of progress in ‘APS’; to ‘prove’ that pupils are making ‘good’ progress over short periods of time even though such measures are meaningless, irrelevant, and at odds with the new curriculum. 

My big fear is that many people involved in school improvement are still defining ‘good’ progress in terms of extension in a curriculum that focuses on depth; and that some tracking systems, with their adherence to the old orthodoxy of points – albeit with a bit of window dressing – are not exactly helping schools escape the gravitational pull of levels.

Senior leaders are understandably anxious. When I discuss these issues in schools – my main topic of conversation these days – someone will always say:

“But we have to show progress”

This tells us a lot about the primary role of data in many schools. It exists not so much to inform teachers and senior leaders about pupils’ learning as to serve the purposes of accountability – a defence shield against scrutiny and judgement. As a consequence, the data is becoming disconnected from the classroom and, worse still, may not mean anything at all.

Eventually, after some debate, we move away from “but we have to show progress” and arrive at the inevitable question:

“So, how do we show progress?”

My original answer of “I don’t know”, which was usually met with howls of derision and gnashing of teeth, has now been replaced by the slightly more definitive “I’m not sure we can”. This needs qualifying. I’m not saying we can’t measure progress; I’m saying we can’t measure progress in the way to which we’ve become accustomed. A curriculum in which most pupils are expected to learn at broadly the same pace, and where extension into the next year’s content is the exception rather than the rule, limits our options for measuring progress in terms of coverage. The depth of pupils’ understanding – the degree to which they can use and apply what they’ve learnt – is the key now, but it’s not something that is easily measured. And assigning values to such an abstract concept is most likely a fallacy, especially if we begin to try to show changes in depth of understanding over short periods just because “we have to show progress”.

Essentially, we must resist the temptation to quantify something that can’t be quantified just to satisfy demands based on some obsolete notion of progress. When the data is without foundation and is at odds with reality, it is akin to knowingly navigating with an out-of-date map and still expecting to arrive at your destination.

To put it bluntly, don’t make it up.

So, let’s return to the original question: “How do we show progress?”

I’ve touched on tracking systems above, and blogged on the subject extensively. There are clearly issues with certain approaches and schools will no doubt be finding these out for themselves now that we are halfway through the year. I found myself discussing one such approach with a headteacher I met in the street recently and after trying to recall and explain its complexities for 10 minutes or so I had a slight out-of-body experience. To anyone listening in, it must have sounded like we were talking total nonsense. 

Surely what we want is simplicity. If levels, and their sublevel and APS offspring, are too complicated, then whatever replaces them needs to be more straightforward. Most systems are settling on an approach involving 3 broad steps of learning, often with further subdivisions. Michael Tidd’s key objectives model sticks to 3 simple categories of ‘below’, ‘meeting’, and ‘exceeding’. Target Tracker’s system involves 6 steps across the year, from ‘below’/‘below+’, through ‘working within’/‘working within+’, to ‘secure’/‘secure+’. OTrack have opted for a 7-point scale which includes 3 for ‘working towards’ and a final category of ‘exceeding’, which sits above ‘mastery’ (which makes sense to me). Focus Education have categories of ‘emerging’, ‘expected’, and ‘exceeding’, each subdivided into 3 steps to give a 9-point scale, the 9th point being reserved for those pupils who have shown mastery of all main objectives and completed all extension activities. Insight Tracking, who aren’t tied to any particular assessment model, have developed a flexible system that accepts pretty much any metric.

Another system with a 6-point scale, similar to that employed by Target Tracker, is Chris Quigley’s Milestones. There is, however, one crucial difference: the Chris Quigley model assesses a pupil’s depth of understanding across a 2-year period, the expectation being that the pupil reaches step 4 by the 2-year milestone, with points 5 and 6 indicative of mastery.
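To make the arithmetic of these step scales concrete, here is a minimal sketch of how a per-subject depth score might be aggregated. This is purely illustrative and assumes the simplest possible model – the score is just the mean of each objective’s step on a 6-point scale – not a description of any tracker’s actual calculation:

```python
# Illustrative only: assumes a subject's depth score is the mean of
# per-objective steps on a 6-point scale (1 = emerging, 6 = mastery).
# Not taken from any tracking system's documentation.

def depth_index(objective_steps):
    """Average the step (1-6) recorded against each objective in a subject."""
    if not objective_steps:
        raise ValueError("no objectives assessed")
    return sum(objective_steps) / len(objective_steps)

# Ten writing objectives, all at step 3; then one objective moves up a step.
before = [3] * 10
after = [3] * 9 + [4]
print(depth_index(before))  # 3.0
print(depth_index(after))   # 3.1
```

Note how small the movement is: advancing one objective out of ten by a single step shifts the average by just 0.1 – which is precisely why quantifying short-term changes in depth this way tells you so little.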

Such a system, which eschews attempts to quantify small steps of progress in favour of a long-term mastery approach, is daunting to some – especially those who “have to show progress” since the start of term – but that’s what makes it compelling.

So, ask yourself this: who are you assessing for? Does your data actually tell you anything about your pupils’ learning? Or is it for another purpose?

It’s time to entertain the possibility that maybe it’s just not possible to quantify small steps of progress. And if that’s the case, why bother continuing to do so? The sooner we move away from this deeply entrenched progress fallacy, the sooner we can start to assess in a meaningful way, one that actually informs us about pupils’ learning.

This is, after all, why levels were removed in the first place.

Wasn’t it?


9 thoughts on “On single malt and mastery”

  1. Sarah Davey
    on February 25, 2015 at 9:51 am

A great read! I've always interpreted the Chris Quigley approach as enabling you to show advancement (OK, progress!) for a pupil even when they've only deepened their understanding of one or two objectives between assessment points. E.g. their Depth of Learning Index would increase (admittedly by just 0.1) if they made the minimum advancement in just 1 of 10 objectives. Feels like a nice way of measuring coverage and depth of understanding together. I may have misinterpreted, of course, and I don't mean to defend the idea that schools should have to show progress every five minutes! Just interested to know your thoughts.

  2. James Pembroke
    on February 25, 2015 at 11:09 am

    but isn't the average depth of learning index taken across a number of strands/subjects so progressing by 1 out of 10 objectives in one area won't make a huge difference to the overall score when it's averaged across subjects? Maybe I've misinterpreted.

  3. Sarah Davey
    on February 25, 2015 at 11:24 am

    I think you get a Depth of Learning Index for each subject, which is the average of the score for each of that subject's Learning Objectives. Reading had 2 learning objectives and Writing 10. So progressing by 1 out of 10 writing objectives makes a small but detectable difference to the Depth of Learning Index; progressing by 1 in Reading would make more difference. I'm not sure I've fully grasped the approach, and my information may also be out of date… Need to hear from the source!

  4. James Pembroke
    on February 25, 2015 at 11:32 am

    Yep, sounds like I got the wrong end of the stick. Not sure why I said average across subjects. That makes no sense :-0

  5. Anonymous
    on February 25, 2015 at 4:04 pm

Aren't you asking a quite straightforward question: over what time period can learning at primary be usefully measured?

  6. James Pembroke
    on February 25, 2015 at 4:11 pm

    Maybe, but I'm also trying to hammer home the fact that a) our approach to 'measuring' progress has changed, and b) many are getting it wrong.

    and that I don't have the answers.

  7. david
    on February 26, 2015 at 9:01 pm

Hello James – couldn't agree more with everything you said. I was wondering, though – is it written anywhere about being allowed to move children into the band above? You said in exceptional circumstances, but we keep on being told that we mustn't at all. In a school where attainment is very high, we feel we must, but don't feel confident doing so against advice. Do you have an authoritative source for this advice please?

  8. James Pembroke
    on February 26, 2015 at 9:23 pm

    No guidance. Just going on discussions with heads, teachers and others. Think if pupil demonstrates mastery of all learning objectives then surely they need to move on?

  9. david
    on February 26, 2015 at 9:26 pm

    Thanks for the reply!
We agree but keep being told to only broaden, broaden, broaden – ad nauseam.


© 2024 Sig+ for School Data. All Rights Reserved.