Getting honest with your school's student data

Written by: Giselle Hobbs

In an era of big data, we can easily lose our way navigating the wealth of information we collect. Assistant principal Giselle Hobbs decided to go back to basics with her school’s new approach to data.

Entering data, analysing data, attending data meetings, nodding wisely at charts and graphs. After all of this time and effort, many teachers are left wondering – what’s the point?

In this era of “big data”, it has become de rigueur to collect as many data points as possible, and figure out what to do with them later. We count up, compare our results to the national average, to last year’s results, subject to subject. We split our data into ever-smaller sets, then extrapolate wildly.

By the time we have done all of that, the data is four weeks out of date and we have missed our chance for timely action. In the meantime, those doing the collecting feel hugely overburdened, and rarely see useful results from all the number-crunching.

My school, The Stockwood Park Academy in Luton, was no different to the many others trapped in this cycle. Despite collecting a huge volume of data and spending endless hours analysing it, our 2015 GCSE results in some subjects were both disappointing and surprising.

In our challenging context in the most deprived part of Luton, with an intake that’s bottom quintile for key stage 2 attainment, and top quintile for basically everything else – EAL students, Pupil Premium, boys – getting data right was all the more important.

We needed to redesign our system to make data work for us and, more than that, to change the academy culture around data. In a high-accountability system, it is understandable that staff sometimes don’t feel able to give honest predictions for fear of being “in trouble”. But that means that opportunities to improve are missed. We needed to foster a culture of honesty and openness around data by building a system that no-one felt the need to “game”, and which staff members really found useful.

Last year (2015/16) was my first year as a member of the senior leadership team so, full of idealism, I selected four key principles for our new data system:

  1. Automation and instantaneousness.
  2. Real data only.
  3. Reliability.
  4. Openness.

Automation and instantaneousness

Making data-management quick and easy was the first step in building a data-positive school culture.

Teachers are not (generally) statisticians, so we should not ask them to spend a long time managing data. Instead, data needs to be held in a system which automatically calculates useful measures like progress, the percentage of students achieving key measures and optimal Progress 8 “buckets”. If those calculations refresh every time a teacher updates their markbooks then they are always current, and so can be acted upon.
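To illustrate the kind of calculation that is worth automating, here is a simplified sketch of Attainment 8 “bucket” allocation in Python. This is an illustration only, not the actual Go 4 Schools implementation, and it glosses over details of the official Department for Education rules (for example, the conditions for double-weighting English):

```python
# Simplified sketch of Attainment 8 slot-filling (illustrative only).
# Grades are numeric points; English and maths are double-weighted,
# then the best three EBacc subjects and best three remaining
# qualifications fill the "open" slots.

def attainment8(english, maths, ebacc_grades, other_grades):
    """english/maths: single grade points; ebacc/other: lists of points."""
    ebacc = sorted(ebacc_grades, reverse=True)
    slots_ebacc = ebacc[:3]                # best three EBacc subjects
    remaining = ebacc[3:] + other_grades   # spare EBacc grades can fill open slots
    slots_open = sorted(remaining, reverse=True)[:3]
    return english * 2 + maths * 2 + sum(slots_ebacc) + sum(slots_open)
```

A system that recalculates this every time a markbook changes spares teachers from juggling the slot-filling by hand.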

We use a data-management system (called Go 4 Schools) that, among other things, allows teachers to create seating plans with each student’s data and picture on them in about 30 seconds. In addition, the system can hold full markbooks of real assessments, not just current predicted grades or “working at” grades.

In previous years, our long and detailed data tracking document took hours to complete; middle leaders reported an average of 15 hours per document. Within six months of implementing the new system, 90 per cent of middle leaders surveyed said that a full analysis took less than an hour. Now, everyone uses the system all the time. This means that staff become more proactive about using data to help students.

So – we had a system in which data-management was relatively easy. But what about the data itself?

Real data only

Teachers are used to having to “game the system”, and line managers have come to expect unrealistically perfect linear progress trends. However, students don’t work that way, and we need to retrain our leaders to ask the right questions about data so that our teachers will have the confidence to tell us the truth. At my school we decided to reject perfection and use only “real” data. A key part of that was stopping using “working at” grades.

What is a “working at” grade? Ask five teachers and you will probably get five different answers. Is it the grade which the student would achieve if they completed the course today? Is it the average of the work they are doing? Is it a holistic teacher judgement of their ability? There is no one answer, which means it is very hard to challenge or confirm its accuracy.

In high-accountability systems, data like “working at” grades are susceptible to deflation at the start of the year and inflation at the end of the year, with little continuity between years. Some schools even have a rule that “working at” grades can never go down; they can stay the same or rise. However, learning isn’t filling up a bucket – sometimes students do backslide, and this needs to show up in the data so that we can act.

We no longer use “working at” grades, and we don’t miss them. We have real marks and grades for real assessments. They go up and down across the year, and that’s alright. What matters is that the calculated “rolling average” of all assessments still indicates that they are on track to achieve expectations.

Whereas in 2015 only 35 per cent of staff were confident that their predicted grades were consistent with coursework and mock grades, in 2016 this rose to 100 per cent. Now, 81 per cent of surveyed staff strongly agree that they analyse data effectively, and 90 per cent strongly agree that they could easily analyse their latest assessments to identify and address gaps in achievement.

This could only occur once staff felt that they were analysing real data in order to help them do their jobs, rather than to prepare justifications for Ofsted.


Reliability

Now we are collecting real data, we have to ask ourselves how reliable it actually is. For example, who set and marked that mock paper? What criteria have been applied? Have the papers been moderated? Was it a highly scaffolded mock or a realistic exam-conditions mock? What grade boundaries have been applied?

Once time usually spent calculating percentages had been freed up by automation, we could all step back and have a good look at what the data actually represents. In my frequent meetings with heads of faculty, we made small but meaningful adjustments to markbooks as the year progressed. For example, instead of using the latest mock exam to calculate a rolling average grade, some subjects switched to using the average of the last three mocks. Over time, we tailored markbooks to individual subjects.
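As a rough sketch of the two averaging choices described above, assuming grades are held as simple numeric points (the real markbook calculations are more sophisticated than this):

```python
# Illustrative sketch: two ways to summarise a student's assessments.

def rolling_average(marks):
    """Average of every assessment recorded so far."""
    return sum(marks) / len(marks)

def last_n_average(marks, n=3):
    """Average of only the most recent n assessments (e.g. last three mocks)."""
    recent = marks[-n:]
    return sum(recent) / len(recent)

mocks = [4, 5, 4, 6, 7]  # a student's mock grades across the year
overall = rolling_average(mocks)   # weights early, scaffolded mocks equally
current = last_n_average(mocks)    # reflects recent, more realistic mocks
```

The last-three average responds faster to a student’s current performance, which is why some subjects preferred it once their later mocks were sat under realistic exam conditions.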

So did all of this translate into perfectly reliable data and accurate predictions? Our early entry predictions for English and maths were spot-on: exactly the predicted percentage of students achieved A* to C grades (and English saw a significant year-on-year rise in results).

Our end-of-year results weren’t as impressively accurate in all subjects, though they were excellent in many. There’s still more to do to ensure that the data going into our system is accurate and meaningful, and that leaders in all areas are asking the right questions of their staff.

With that in mind, we have begun a new initiative to improve the reliability of data. Instead of teaching and then moderating work, we are front-loading the process.

Before any teaching takes place, we now meet in subject teams for standardisation. We share the assessment task, the assessment criteria, and our best evidence (past marked papers, coursework, exam board exemplars, materials from training courses) to establish the standard of work required. Once the team has a good shared idea of what is required, we can plan and deliver more precise and effective lessons. Then after teaching and marking, the team meets again briefly to moderate. This has been particularly useful for new courses at GCSE and A level where evidence is currently scarce.

In the end, we are actually collecting less data because we are more secure that the data is meaningful. Now, we only collect data which we are going to analyse and act upon.


Openness

I think that the more teachers can see data, the more teachers will work to improve it. In our automated system, every member of staff can see every mark every student is given, in every subject. This was a little confronting for some, but has been extremely useful. It puts staff in the driver’s seat, able to push forward with questions and actions, rather than waiting for others to send them information.

Heads of faculty quietly compare their results against other faculties and a healthy sense of competition has sprung up. Don’t get me wrong: we are not pitting maths against English, but it is illuminating to see how a student who struggles in your class achieved well elsewhere, and this spurs a number of useful conversations between colleagues.

Leaders of groups such as EBacc and Pupil Premium also have an instantaneous overview of how students are performing at the click of a button.

Our pastoral staff have taken on a much more active role in achievement, as they can keep up-to-date with their students’ data across subjects. Tutors and mentors now play a key role in preparing students for upcoming assessments, and foster peer support within their tutor groups.

Most importantly of all, line managers at all levels can see live data and ask questions about it at any time. This has facilitated a much more open and professionally challenging culture around data.

Much like the switch from “show lessons” to “typicality”, having data permanently on display means that expectations of perfectly beautiful trends must be replaced with more realistic ones; instead of preparing for infrequent and gruelling data meetings with senior leaders, our middle leaders now expect each line meeting to involve some constructive dialogue about their data.

At the end of the day, the aim of openness is not to point fingers and shame staff; it is to direct resources and support to the areas most in need. This attitude needs to come from leaders when we discuss data.

Finally, with my heart in my mouth, I released parent and student log-ins to the system, so they could see their data as it is entered. We began with year 10 and 11 families. I feared an avalanche of parent queries and complaints; instead, I was thrilled by the students’ responses.

Students felt empowered when they could see how each assignment and mock added together to form an overall picture for each course. Many of them brought in updated drafts of coursework and engaged in useful conversations with teachers about their next steps. We’re now rolling this out for all year groups.

Data does not always present a beautiful picture, but by making it more widely available, more people, students and teachers alike, can work to improve the outcomes it represents.


The vast majority of results across the academy improved – some dramatically. For example, English saw the proportion of students achieving A* to C rise by almost 30 percentage points.

We are in a good position to improve further; 100 per cent of line managers now agree that they can access the data for the area they line manage quickly and easily, and 100 per cent agree that they can now challenge/confirm the accuracy of the predictions made for their area. With the old system, fewer than 40 per cent of staff agreed with these statements.

I thought it would be hard to win staff around to this new system. I put in a real effort to create a “burning platform” for change. We offered comprehensive training to ensure that everyone at every level was comfortable with the new tool, and sought feedback along the way. This feedback helped me to figure out when I needed to change tactics. The result has been that staff have taken to the system and use it with enthusiasm. The next step of the process is for staff members to run future CPD for each other; that’s planned for this year.

By itself no system will transform your school; you must change the hearts and minds of those using the system. If every member of staff wants and is able to identify the next learning challenge and go after it proactively, then your data system is working for you, and not the other way around.

  • Giselle Hobbs is assistant principal at The Stockwood Park Academy in Luton.

Future Leaders

Giselle is a participant on the Future Leaders leadership development programme. In November, the Future Leaders Trust is joining forces with Teaching Leaders to form one organisation tackling educational disadvantage through high-quality school leadership. Read more at

