Tracked by the Insecurity Services

Last night @LizzieP131 tweeted this:

[embedded tweet]

Which was followed by this:

[embedded tweet]

In the past week I’ve been told by headteachers using one particular system that their pupils need to achieve 70% of the objectives to be classified as ‘secure’, whilst another tracking system defines secure as having achieved 67% of the objectives (two thirds). The person who informed us of this was critical of schools choosing to adjust this up to 90%, and I’m thinking “hang on! 90% sounds more logical than 67%, surely”.
And then this comes in from @RAS1975:

[embedded tweet]

51%?

Really? 
Achieving half the objectives makes you secure? 
It’s like a race to the bottom.
So, secure can be anything from 51% upwards. And mastery starts at 81%.
I’m sorry, but how the hell can a pupil be deemed to be secure with huge gaps in their learning? And how can a pupil have achieved ‘mastery’ (whatever that means) when they have only achieved 4/5ths of the key objectives for the year?
It makes no sense at all.
This is what happens when we insist on shoehorning pupils into best-fit categories based on arbitrary thresholds: it’s meaningless, it doesn’t work and it’s not even necessary. 
It’s also potentially detrimental to a pupil’s learning. Just imagine what could happen if we persist in categorising pupils as secure despite them still needing to achieve a third of the year’s objectives.
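To make the arbitrariness concrete, here’s a minimal sketch in Python of the kind of threshold-based banding these systems appear to apply. The cut-offs (51% for ‘secure’, 81% for ‘mastery’) are the figures quoted above; the band names, the function and the pupil numbers are my own illustration, not any particular vendor’s actual logic.

```python
# Illustrative only: thresholds taken from the figures quoted above;
# the band names and logic are assumptions, not a vendor's real code.

def band(achieved: int, total: int) -> str:
    """Assign a best-fit band from the percentage of objectives achieved."""
    pct = 100 * achieved / total
    if pct >= 81:
        return "mastery"
    if pct >= 51:
        return "secure"
    return "developing"

# A pupil who has met barely half of the year's objectives...
print(band(18, 35))  # 'secure' at 51.4%, with 17 objectives unmet

# ...gets exactly the same label as a pupil with far fewer gaps.
print(band(28, 35))  # 'secure' at 80.0%, with only 7 objectives unmet
```

Two pupils nearly 30 percentage points apart end up with an identical label, which is precisely the information the teacher needed and the label throws away.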

Ensuring that pupils are not moved on with gaps in their learning is central to the ethos of this new curriculum. Depth beats pace; learning must be embedded, broadened and consolidated. How does this ethos fit with systems that award labels of ‘secure’ despite large gaps being present in pupils’ knowledge and skills?

The more I look at current approaches to assessment without levels, the more frustrated and disillusioned I become. System after system is recreating levels, and we have to watch it happen. They may call them steps or bands, but they are levels by another name, and they repeat the mistakes of the past. Pupils are being assigned to a best-fit category that tells us nothing about what they can and cannot do, and they risk being moved on once deemed secure despite gaps in their learning. That best-fit problem was one of the key reasons for getting rid of levels in the first place.
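And the label actively hides where the gaps are. As a rough illustration (the pupils, objectives and percentages here are invented for the example), two pupils can sit in the same band while needing completely different teaching next:

```python
# Invented data: two pupils each credited with 7 of 12 objectives (58%),
# so any threshold at or below 58% calls both of them 'secure'.
pupil_a = {"place value", "addition", "subtraction", "multiplication",
           "division", "fractions", "measures"}
pupil_b = {"place value", "fractions", "decimals", "percentages",
           "geometry", "statistics", "measures"}

all_objectives = pupil_a | pupil_b | {"algebra"}  # 12 objectives in total

print("Gaps for pupil A:", sorted(all_objectives - pupil_a))
print("Gaps for pupil B:", sorted(all_objectives - pupil_b))
# The band is identical; the gaps barely overlap.
```

Same percentage, same band, and almost no overlap in what the two pupils actually need to learn next.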

So, take a good look at your system. Look beyond all the bells and whistles, the gizmos and gloss, and ask yourself this: does it really work?

And please, please, please, whatever you do, make sure you….


2 thoughts on “Tracked by the Insecurity Services”

  1. xxeasternstarxx
    on June 25, 2015 at 3:10 pm

    Hi – just found your blog and am reading it with enjoyment, thank you. I find this conundrum very interesting. I work for a tracking company and have seen and supported a very wide range of new methods and philosophies of assessment and progress over the last year. In my opinion the best ones are always those which rely on the judgement of a highly qualified professional (the teacher), not those which follow a formula or generalisation.
    We try to make formative assessment, which relies on coverage and depth of learning, central to everything – easy to monitor and very visual – but given the pressure schools feel to predict floor targets and compare cohorts, we also have a stepped process driven by a summative judgement made on these depth-of-learning assessments. Progress has also formed a big part of the latest Ofsted publications – we look at the key stage performance descriptors for this and then leave it to the school how they wish to plot their path between them.
    Summative data certainly has a place in making sure that pupils are moving forwards, but it is by no means the whole picture. In my opinion there is also nothing more irritating than (essentially meaningless) broad-sweep statements such as ‘below’ or ‘secure’ without an explanation of what that actually means at that time.
    These two factors – depth of learning, and pressure to progress to a particular point or beyond – create a tension which in turn feeds insecurities and leads to a search for an ‘answer’, resulting, in some cases, in a reliance on statistics such as the percentages you highlight above. It concerns me that so many schools are being sold a ‘solution’ when there are so many unknowns that there essentially isn’t one yet. By that I don’t mean a tracking solution, as that can be adjusted – more a solution that decides how ‘good’ or ‘secure’ a pupil is for you. If you don’t have the other information alongside it, what does that label actually mean?
    Anyway – as you can tell, I feel very strongly about this. I think that things are improving and that the ‘race’ is slowing down, but I am less sure about some of the interpretations of many of the key points of assessment.
    Right, I could waffle on for hours – I look forward to reading more posts and am happy to talk data at any time (which I appreciate is strange).

  2. James Pembroke
    on June 26, 2015 at 2:59 pm

    Hi. Thanks for your comment – I agree with you entirely. Any chance you could tell me who you work for? I’m interested to know about the approaches taken by various systems. Sounds like yours is on the right track (pardon the pun).

