Exam grading: What are the key challenges ahead?

Written by: Tom Middlehurst

Several trends are emerging from the concerns that school leaders have about this summer’s exam grading process. Tom Middlehurst looks at the main sticking points and offers advice and reassurance for secondary school leaders and teachers

In the coming weeks, further guidance and support about this year’s grading process will continue to be released by the Department for Education (DfE), Ofqual and the exam boards – including more exemplifications of grading and the long-awaited detail about the appeals process.

In the meantime, schools and colleges will have submitted their high-level Centre Policy by the end of April and, in some schools, the process of assessment is already beginning.

As school and college leaders have digested the various Ofqual and Joint Council for Qualifications (JCQ) guidance documents over the Easter break (see further information), several trends are emerging from the on-going concerns that school leaders express to us at the Association of School and College Leaders.

The selection of evidence

Perhaps one of the biggest challenges this term will be deciding what evidence to use at a subject level. In most cases, schools’ and colleges’ Centre Policies will not go into this level of detail, but may articulate some broad principles and cross-centre approaches to evidence.

The key thing to remember here is that students should only be assessed on what they have been taught – a fundamental part of the process aimed at addressing differential learning loss between schools and between students due to the pandemic.

A trickier question, then, is the extent to which all content that has been taught should be assessed. When we say “all content”, we do not mean assessing every single concept, piece of knowledge or skill in a subject. Indeed, in a normal exam year, depending on the qualification, only around 50 per cent of the course content might be assessed.

Instead, we are talking about content coverage at a broad topic or paper level. For example, if you have taught three of the five texts normally assessed in GCSE English literature, do you have to assess students on all three? If students have been taught all six exam topics in GCSE geography, do you have to assess all six?

Both the Ofqual and JCQ guidance documents are pretty quiet on this issue, so it is up to individual schools and colleges to make these sorts of granular decisions.

A key change between the first iteration of the Head of Centre Declaration and the published version (JCQ, 2021, p39) is the switch from a requirement for heads to be confident they have covered enough content “for students to progress” to confidence in having enough content coverage “to award a grade”. But beyond that, it really is up to schools and colleges to decide what sufficient content coverage, and a broad enough portfolio, looks like.

A word of warning though: Although there is nothing explicit in the guidance to say that all taught topics should be assessed and used as part of the evidence, it is worth thinking ahead to appeals. One of the grounds for appeal is that the selection of evidence does not represent a reasonable exercise of academic judgement. If a student can demonstrate they were taught a whole topic or paper but were not assessed on it, this would be very good grounds for an appeal of this nature. It is therefore in schools’ and colleges’ interests to assess as broadly as possible.

Common approach to assessment vs individualised evidence

There seems to be some confusion over the extent to which, at a qualification level, schools and colleges should be using a consistent approach to evidence for all students, and how much flexibility there is for a personalised approach.

The guidance is clear that, as far as possible, centres should adopt a common approach to assessment and evidence at a subject level, based on what the cohort has been taught. Where individual students or small groups of students were either not able to access the teaching of this content, or are unavailable to complete the assessment, schools and colleges should factor this into their grading and either discount that evidence or seek an alternative form of evidence.

However, this is not a negotiation between student and centre, nor is it a case of students “picking” their own assessment evidence. Rather, all students need to be aware of what evidence is being used, and have an opportunity to raise any concerns or contextual issues with the school. Some schools are managing this process centrally to reduce classroom teacher workload and ensure a consistent approach is used for each candidate.

Performance vs historical outcomes

A contentious issue last year and this – to what extent should students’ grades be aligned or based on historical outcomes of the centre?

The data shows that, especially for large-entry subjects, very few schools and colleges see significant year-on-year variation in their attainment measures. Before the government’s U-turn last summer, this led to a statistical attempt to standardise results, which was widely seen as deeply flawed.

So, how should schools and colleges use historical data this year? The answer we have to keep returning to is that, this year, students must be graded on their evidence performance in content they have been taught, against the national grade descriptors and exemplifications.

Historical data should figure strongly in a school’s or college’s internal quality assurance, as a benchmarking exercise or high-level check, but must not be a limiting factor to any student achieving a certain grade – providing you have the evidence for this.

Furthermore, given that the nature of assessment is fundamentally different from a normal exam year – and from last year – we may see more students achieving higher grades. Schools and colleges should not be worried about this, providing they have the evidence to support it.

Ultimately, it is about the robust evidence of a student’s performance, underpinned by rigorous quality assurance including the use of historical exam data.

Pressure from students and parents

If you look at the headlines in the non-specialist press, it is easy to understand why many students and parents believe that grades this year are solely determined, and awarded, by teachers. The language of “teacher-assessed grades” does little to counter those views.

We know that this has resulted in some students and parents, however well-meaning, putting unacceptable amounts of pressure on classroom teachers.

This pressure can take many forms, from reminding teachers of conditional university entry offers, to questioning every piece of evidence used, to questioning the academic judgement of the teacher.

Parents and students want to do as much as possible to boost achievement, but it is vital that teachers, and schools and colleges, remain objective in their grading.

Students and parents need to understand that grading is not at an individual teacher’s discretion this year: it is against a national standard; must be evidenced; will be subject to internal quality assurance, including sign-off by at least three members of staff, one of whom must be the head; will be subject to external quality assurance, including random and risk-based sampling of students’ work; and can be appealed this summer.

Furthermore, JCQ has beefed up its guidance from last summer and is explicit that any student, parent or other individual trying to exert pressure on teachers to award a certain grade should be reported to the exam boards, who may investigate as exam malpractice and ultimately refuse to issue a grade at all (JCQ, 2021).

The appeals process

After everything is “done” by June 18, schools and colleges will naturally look ahead to August and the appeals process. The first stage of an appeal is to check for any administrative or procedural errors at a centre level. If none are found, a student may oblige the school to take forward an appeal to the exam board on their behalf.

Appeals will begin from published results days, with schools and colleges having until August 23 to send priority appeals to the exam boards. In reality this means that some staff will need to be working between August 9 and 23, which is likely to mean a change in resourcing for many schools.

Looking at what evidence will be required in the case of an appeal, and preparing that evidence as schools collate assessment evidence before submission, may significantly reduce the workload of staff in August.


We do not think anyone believes that this year’s approach is perfect, but given the parameters we are working within and the political decision to cancel all forms of exams, we are probably in the best place possible. We now owe it to our young people to make a success of this year’s process so that their achievements can be meaningfully recognised.

  • Tom Middlehurst is curriculum and inspection specialist at the Association of School and College Leaders. Read his previous articles for SecEd via http://bit.ly/seced-middlehurst

Further information & resources

  • JCQ: Guidance on the determination of grades for A/AS Levels and GCSEs for Summer 2021, March 2021: https://bit.ly/3dxWKch
  • JCQ: Information and documentation on awarding arrangements for summer 2021: https://bit.ly/3nABiXw
  • Ofqual: Guidance: Information for heads of centre, heads of department and teachers on the submission of teacher assessed grades, March 2021: https://bit.ly/3al5NeD
  • Ofqual: Publications and documentation relating to GCSE, AS and A level qualifications in 2021: https://bit.ly/32tmFLE
