2019 ASP summary template

2019 version of ASP summary template free to download here.

A few tweaks to last year’s template – it now takes account of the new ASP menus.

Feel free to modify, copy and share. Just credit the source, and please download the file before attempting to complete it (it will open in Word Online; to download, click the three dots at the top right of the browser window).

Calculating ‘Impact Scores’

If you are confused by the ‘impact scores’ concept in table 2 (and who can blame you? I made up the term, by the way), the idea is to find the minimum individual score required to improve an overall progress score from below average (orange or red) to average (yellow), or from average (yellow) to above average (green). The former is the more critical, and often it’s a case of removing just one pupil from the data.

Schools that are below average (orange) will have a negative progress score (e.g. -1.9) and a confidence interval that is entirely negative (e.g. -3.6 to -0.2). If the confidence interval does not include the national average of zero – i.e. it does not cross the zero line – then it is deemed to be significantly below average (as in the example given above).

It would be neat to find out whether removing one pupil would improve our data from below average (orange) to average (yellow). Let’s return to the example above and take the upper limit of the confidence interval (the right-hand number, i.e. -0.2). This tells us how far the progress score is from average – how far the confidence interval is from the zero line (safety!). Essentially, if every pupil’s progress score increased by 0.2, the overall score would be in line with average, but that doesn’t really help.

A better approach is to take that figure of -0.2 and multiply by the number of pupils included in progress measures (clearly stated in ASP). Let’s say that’s 30 pupils:

-0.2 x 30 pupils = -6.

This means that removing just one pupil with an individual progress score below -6 should change the ‘below average’ (orange) indicator to ‘average’ (yellow).

Note: if your progress scores are average (yellow) and you want to determine what it would take to make them above average (green), use the lower limit of the confidence interval (the left-hand figure) instead. The same method applies: multiply that figure by the number of pupils, and removing a pupil with a progress score lower than the result should change the overall score from average to above average.
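The calculation above can be sketched in a few lines of Python (the function name is mine, not something from ASP; the confidence limits and pupil numbers come straight from the ASP reports):

```python
def impact_score(ci_limit, n_pupils):
    """Minimum individual progress score to look for.

    ci_limit: the upper confidence limit (for orange -> yellow)
              or the lower confidence limit (for yellow -> green).
    n_pupils: number of pupils included in the progress measure.

    Removing one pupil whose progress score is below the returned
    value should shift the overall indicator by one band.
    """
    return ci_limit * n_pupils

# Example from the post: upper CI limit of -0.2, 30 pupils
print(impact_score(-0.2, 30))  # -6.0
```

Crude, but it saves working it out by hand for each subject and key stage.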

Calculating ‘difference no. pupils’ figure

Many of the tables in the summary template refer to ‘diff. no. pupils’. This is an attempt to make sense of percentage gaps (i.e. the gap between a school result and the national figure) by converting the gap into a number of pupils. In many cases the gap will be smaller than the percentage value of one pupil. For example, if you have 20 pupils in a cohort – i.e. each pupil accounts for 5% of the cohort – and your school’s result is 4% below national, then the difference is less than one pupil. It is useful for those analysing the data (e.g. governors) to see gaps in these terms.

The easiest way to calculate the ‘diff. no. pupils’ is to take the percentage gap from national and calculate that percentage of the cohort. That will convert the percentage gap into a pupil equivalent figure. Don’t forget to round down. Here are a couple of examples:

  • National result = 78%
  • School result = 72%
  • Gap = -6%
  • Number pupils in cohort = 30
  • -6% of 30 pupils = -1.8

In this example, the gap equates to 1 pupil (after rounding down). In other words, one more pupil achieving the result would bring it in line with national.

  • National result = 65%
  • School result = 82%
  • Gap = 17%
  • Number pupils in cohort = 60
  • 17% of 60 pupils = 10.2

In this example, the gap equates to 10 pupils (after rounding down). Crudely speaking, the school’s result is 10 pupils above national.
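The same arithmetic as a Python sketch (the function name is mine; `math.trunc` rounds towards zero, which matches the ‘round down’ rule whether the gap is positive or negative):

```python
import math

def diff_no_pupils(school_pct, national_pct, cohort_size):
    """Convert a percentage gap from national into a pupil-equivalent figure."""
    gap = school_pct - national_pct        # e.g. 72 - 78 = -6
    pupils = gap / 100 * cohort_size       # -6% of 30 pupils = -1.8
    return math.trunc(pupils)              # round towards zero: -1

# The two examples above:
print(diff_no_pupils(72, 78, 30))  # -1 (one pupil below national)
print(diff_no_pupils(82, 65, 60))  # 10 (ten pupils above national)
```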

Note that if you were calculating this figure for a group (e.g. boys or FSM pupils) you would use the number of pupils in that group, not the number in the whole cohort.

Hope all that makes some kind of sense. If it doesn’t, tweet me and I’ll do my best to explain it again.



James Pembroke

Data Analyst and Advisor with 10 years experience in primary, secondary and post-16 sectors. Follow me on Twitter: @jpembroke

  1. Graham C on November 7, 2019 at 6:05 am

    Hi James,
    You’re doing a brilliant job of cracking the data open wide.
    I just wanted to query if the 3rd example for different.no.pupils should have read ‘17% of 60 pupils’ not 30.
    Overall, article could really help schools to see how far away from average they are.

    • James Pembroke on November 7, 2019 at 7:28 am

      Hi. Good spot, thanks. Now corrected.
