Assessment: How accurate is your data?

Written by: Nathan Oxford | Published:

One of the reasons behind recent improvements in outcomes at senior leader Nathan Oxford's school is a marked improvement in the accuracy of assessment data.

I joined Alfreton Grange Arts College (now David Nieper Academy) in 2013 as assistant headteacher for curriculum and standards. Two weeks into the role, the small secondary school was placed in special measures. The school had been led by four different headteachers in as many years and the instability this created had finally come to a head.

A key issue highlighted in the 2013 Ofsted report was the disparity between forecast data and actual outcomes, which had contributed to underachievement at the school, particularly for boys, Pupil Premium students and those with special educational needs. Large inaccuracies in the data at each assessment point, sometimes of as much as 30 per cent, meant that a number of students did not access key interventions, resulting in inadequate outcomes for both the students and the school.

GCSE results rose in 2014/15, from 27 to 41 per cent of pupils achieving five A* to C grades including English and maths. However, the results were still not consistent and predicted grades did not match outcomes. One subject which had predicted that 55 per cent of pupils would achieve A* to C had an outcome of only 24 per cent.

In 2014, I enrolled on the Future Leaders programme for senior leaders and, as part of this, designed an impact initiative whose aim was simple: I wanted to ensure that our data was accurate at each assessment point and that, where this was not the case, staff were held accountable. Ultimately, this would allow us to implement appropriate and timely interventions.

A focus on data

I began by considering who would need to use the new data system. Any new system would need to be easy to understand and, because we were a small school of 29 staff, usable by our single-subject teachers. I would need to include staff at all levels and, in some cases, change long-held views of how data was perceived and used.

To embed my initiative successfully at the heart of a data-rich system, I made sure that the CPD staff received on data focused on the requirements for accurate forecasting and developed their understanding of how this would enable us to deliver more targeted, accurate and personalised interventions to students. I examined where the gaps in training were and which skills could be transferred from teacher to teacher to maximise our resources.

Historically, the school had been too reliant on imposing reporting systems from above, rather than encouraging all staff to take ownership and accountability for the data. To counter this, I designed a data handbook for all staff with a set of values and aims aligned to the school’s ethos. The handbook detailed the data cycle for the year, identifying six key points for each assessment window. At each of these stages, the roles of senior leaders, middle leaders, teaching staff and support staff were clearly defined, as well as how to record the data and use it to improve outcomes.

Aiming for accuracy

Alongside the handbook, I developed supporting documents which I used to gather evidence and hold people to account throughout the data cycle. I designed an assessment diary for heads of faculty to complete at each assessment point, for each subject and year group, which records the assessment data and asks staff to explain the key actions they have taken to ensure the data is accurate and the impact these have had.

To introduce this system, I planned set agenda items for middle leader meetings and senior leadership line management meetings to identify the key actions required and the obstacles we might face.

From the first assessment point, I used a RAG (red, amber, green) rating to highlight any causes for concern, and we discussed the issues at the line management meetings. This system allowed us to closely monitor the progress of each student, in each subject, at each assessment point.

To further increase the accuracy of our data, I introduced book scrutiny as part of middle leader-teacher meetings to compare the data entered by the teachers with the data in the books.

This has since been extended and middle leaders now sample each subject, per year, across the academy on a regular basis to ensure that data submitted is both accurate and timely. Where possible, this is maintained and completed cross-faculty. For example, data from the media department, although within the technology faculty, is checked by the English department for accuracy against the quality of spelling, grammar and punctuation.

Sharing responsibility across departments meant that staff didn’t feel they were being isolated for scrutiny. Middle leaders also bought into the system as they felt that it would improve outcomes within their subject areas and would lead to further accountability within their faculty.

Identifying underachievers

To identify students who were underachieving, I created an assessment point review template. The template records the data by pupil group (gender, Pupil Premium, SEN and so on), the barriers to learning and progress, what action needs to take place in order to remove the barriers, and the expected outcome of this action. The completed templates are then placed within a teaching and learning folder in the teacher's classroom so they can be checked during a lesson observation or learning walk. It also means that any visitor to the classroom has access to progress data and a detailed overview of the intervention strategies and their action plans.

Effective communication

To facilitate an open and on-going dialogue about the assessment initiatives, I introduced a structured middle leader meeting agenda, distributed half-termly, which includes a set agenda item of “assessment”.

These meetings are a space to discuss assessment and outcomes, behaviour and attendance, and teaching and learning, as well as an opportunity to share good practice relating to key initiatives.

Middle leaders also now use this method of reporting, including a minuted discussion of assessment, with their own teams. When an issue is raised, we increase the frequency of the meetings to ensure that, at all levels, accountability within the system is clear.

As a result, staff have commented that they now feel more supported and that effective communication has been one of the key tools in achieving sustained improvement within faculty areas.

Although they follow a standardised format, these meetings enable middle leaders to develop their leadership skills on a one-to-one basis, rather than using ad-hoc means of communication or the weekly, timetabled faculty meeting.

Being able to have difficult conversations about the importance of accurate data and the book scrutiny work really helped me to manage the challenges which arose during the year, particularly with single-subject teachers, where support had previously been minimal and work scrutiny had not been linked to data forecasting. A number of staff had also copied forward previous assessment point data to inform current attainment and progress.

Where issues arose, management conversations were minuted and support plans were introduced to ensure that the systems introduced were both robust and effective.

Assessing the impact

In summer 2016, the number of students achieving five or more A* to C GCSEs including English and maths rose to 51 per cent. Prior to this, the school's results had fluctuated significantly, and our analysis shows that, although the prior attainment of the cohort had increased, the improved quality assurance of the data used for intervention and targeted monitoring was key to this improvement.

Data is now more accurate, typically within five per cent of the final outcome, and the school now meets the government's current floor standards.

I have introduced a system that holds both teachers and leaders to account when entering data and which consistently asks: “How do you know this data is accurate?” and “What is the evidence for this?”

Looking to the future, although the national curriculum and specification changes have made it difficult, I am confident that we are assessing, tracking and reporting on student “working at” and forecast grades accurately and that this will be reflected in the outcomes this summer.

  • Nathan Oxford is an assistant headteacher at David Nieper Academy in Derbyshire and a graduate of the Future Leaders programme run by Ambition School Leadership.

Ambition School Leadership

Ambition School Leadership is a charity that runs leadership development programmes in England to help school leaders create more impact in schools that serve disadvantaged children and their communities. Visit

