The Zero Game

Until this year VA has played second fiddle to the levels of progress measure. This was mainly because there were no floor standards linked to VA. But it was also because – let’s be honest here – not many people really understood it. Everyone understood pupils needing to make two levels of progress (and hopefully three levels of progress) but we struggled with the world of prior attainment groups, estimates, and confidence intervals. But now that levels are gone and all we’re left with is a value added progress measure, we have no choice but to get our heads round it. So, we have read the primary accountability document and seen the lookup table on pp. 16-17; we understand there are 21 prior attainment groups (a reduction in the number of start points used in previous years, due to a change in methodology); that each of these prior attainment groups has an estimate in reading, writing and maths, which represents the national average score for pupils in that group; that these estimates form the benchmark for each pupil; and that exceeding these scores ensures a positive progress score for each child, which will aggregate to a positive progress score overall. We get this now.
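The mechanics described above can be sketched in a few lines: each pupil's progress score is their actual scaled score minus the estimate for their prior attainment group, and the school's overall progress score is the mean of those pupil scores. The estimate values below are made up for illustration – the real figures are in the lookup table in the primary accountability document.

```python
# Sketch of the KS2 value added calculation. The estimates here are
# hypothetical placeholders, not the published figures.

# Hypothetical reading estimates for three prior attainment groups.
READING_ESTIMATES = {10: 98.5, 11: 101.2, 12: 103.9}

def pupil_progress(actual_score, prior_attainment_group, estimates):
    """Pupil progress = actual scaled score minus the group's estimate."""
    return actual_score - estimates[prior_attainment_group]

def school_progress(pupils, estimates):
    """School progress = mean of the pupils' individual progress scores."""
    scores = [pupil_progress(score, group, estimates)
              for score, group in pupils]
    return sum(scores) / len(scores)

# Each pupil: (actual scaled score, prior attainment group)
cohort = [(100, 10), (99, 11), (106, 12)]
print(round(school_progress(cohort, READING_ESTIMATES), 2))  # → 0.47
```

The point of the sketch is that a pupil can beat their estimate (here, +1.5 and +2.1) while another falls short (−2.2), and it is the aggregate that determines whether the school's score is positive.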

And that’s where the trouble started. 
Up until recently, schools were flying blind. With a new curriculum and new tests, unsure of what constituted expected standards, and with no idea of ‘expectations’ of progress, schools just concentrated on teaching and tracking the gaps in pupils’ learning. We even started to question the methods of our tracking systems, with their pseudo-levels and points-based progress measures. Things were looking positive. The future was bright.
But then we saw the checking data, and that lookup table appeared, and I produced my VA calculator, and FFT published their 2016 benchmarked estimates.  Now it seems that many schools are playing a VA game, working out where each pupil needs to get to in order to ensure a positive progress score; comparing benchmarked estimates (that are no doubt too low for next year) against predicted results to model VA in advance, to provide figures to settle nerves and satisfy those scrutinising schools’ performance.
I understand that schools want a positive VA score when the stakes are so high but we have to consider the potential risks to pupils’ learning by focussing on minimum ‘expected’ outcomes. I am particularly concerned to hear that schools are building systems that track towards these estimated outcomes, using teacher assessment or optional tests as a proxy for expected standards, as a predictor of outcome that can then be compared against the end of key stage progress estimate. I think of the ideals of ‘learning without limits’ and the sound principles for the removal of levels, and wonder if anything has really changed. I also wonder if it was wise to publish my VA calculator. All those schools inevitably using it to generate estimates for current cohorts; estimates that are being entered into systems and somehow tracked towards. Am I part of the problem? 
Has a knowledge of progress measures become a risk to children’s learning? 
How about we just put the blinkers on and concentrate on teaching? Look after the pennies and the pounds will take care of themselves. 
Just a thought. 

5 thoughts on “The Zero Game”

  1. Anonymous
    on October 16, 2016 at 9:38 am

    I don't think you need to be as concerned as you might be.
    You're right that it's a very similar process to what happened before in that people are aiming for predetermined end-points, but the design of the new system makes this less problematic than before.
    Firstly, schools are likely to pick up on those children whose current attainment suggests they might fall far short of their estimate. That, to me, is a good thing. Those are exactly the sorts of pupils we want a 'tracking' approach to highlight, to help stop them falling further behind.
    Secondly, in the past if a child was on track to meet their 2 Levels of Progress, there was little incentive (awful but honest) to help them progress further, unless they were in with a chance of making 3LP. Now there is still much merit in further increasing their progress, as every additional scaled score point counts.
    Finally, the fact that there are no clear steps in between the Key Stages tempers some of the old problems. That constant uncertainty will lessen the narrowing of booster groups etc.
    In some ways, with nothing to aim for for most pupils, we'd risk focussing too much on the 100 score – and that has to be a worse thing for the majority of pupils. So maybe you're helping to contribute to a better approach – particularly for those pupils who might otherwise go unnoticed?

  2. James Pembroke
    on October 16, 2016 at 9:53 am

    Thanks Michael. That all makes sense and I get that but alarm bells ring when schools are building tracking systems that clearly focus on hitting these predetermined estimates. I also have an issue with the multitude of ways that schools are attempting to arrive at predictions (many got it horribly wrong last year). So, an aggregated prediction of say 55% on track to meet the expected standard may seem rather concerning in isolation but compared against an estimate of 48% based on prior attainment it suddenly seems fine and may cause the school to breathe a sigh of relief.

    Maybe you're right, maybe I'm worrying unnecessarily, but I'm seeing changes in approach to tracking that bother me.

  3. @eduCardtion
    on October 16, 2016 at 10:21 am

    Morning chaps. Predictably, you both make equally valid and thought-provoking points. James, I, too, am deeply concerned about the 'transparency' of this algorithmic system. I was listening to a head last week talk about her weekend with her husband in front of a VA calculator (she didn't reveal if it was a Pembroke's own) mapping out exactly where each child needed to get to this year in order to obtain a progress score that bettered last year and was intending on this being an annual 'target setting' process. I struggled to hide my distaste.
    A few years ago, after my first visit to Wroxham, I recall writing as part of my summary notes "We have, for too long, focused in the wrong place. On the end level / score / result, rather than the granular steps in our teaching and learning pedagogy that will go toward our desired outcomes"
    At the time, I was pushing our school out of 'target setting' as a process as I felt it was the antithesis of learning without limits and was actually a limiting factor in teachers teaching well and learners being challenged every day. We stand by that now.
    John Tomsett also fuelled me some time ago with a quote along the lines of getting the ethos in the school right and the results will look after themselves. Whilst that, in itself, is sweeping and doesn't take into account the devil behind the detail of 'getting the ethos right', it once again resolves me to our course for this academic year.
    So, James, I have not downloaded and nor will I use your VA calculator even if it is winking at me in the corner. We are holding true to our course. We believe, and I can feel the weight of support behind me from the wonderful staffing team at our school when I say this, that by focusing on the nitty gritty of day to day teaching and learning, of the granular steps to key objective success that our professionals know so well, we will, somehow, be ok.
    In the words of my teen (and constant) hero, Rick Witter from Shed 7: "I could deny, but I'll never realise, I'm just chasing rainbows all the time."
    Perhaps it's better to chase, and feel like you gradually get closer to, a pedagogical ideal than forever chase a moving target?
    Lee Card

  4. James Pembroke
    on October 16, 2016 at 10:38 am

    Thanks. Eloquently put. You should have written my blog. But yes, the point about nitty gritty, granular detail is exactly the point I was trying to make with my 'look after the pennies…' analogy. I'm seeing tracking systems evolving in a scary direction, with schools even attempting to convert TA into a pseudo-scaled score, or assuming standardised scores from GL/NFER etc tests equate to scaled scores without appreciating the difference. All of this done to track towards a scaled score estimate. I get it for Y6 but when it filters down to other cohorts, it's not a good thing.

    There are pros and cons of course: on the plus side, having an aggregated estimate that suggests 45% will meet expected standards is useful in countering the demands of certain people expecting 75%. But it does mean that a school may just settle for anything above 45%.

    Anyway, thanks again for your comment. Appreciated.

  5. dataman
    on October 16, 2016 at 4:43 pm

    Always dangerous to try to use a VA model for target-setting – just look at the difference between the 2015 and 2016 Progress 8 models, which has no doubt just caught out many secondary schools who have found their figure to be quite a bit lower than they had reckoned it would be.

    On the other hand, using VA calculators as a tool to judge how different pupils in the school are progressing *relative to each other* could be useful, as a means for determining where extra support might be needed.

    On the *other* other hand, perhaps we should all just focus on teaching the kids the stuff they need to know. The data will happen regardless, like death and taxes.

