Following a recent Ofsted inspection, The Helena Romanes School was judged ‘good’. Assistant headteacher Neal Foster explains the school’s strategies for using data and supporting staff as it now aims for outstanding.

 

Last year, Ofsted awarded Helena Romanes School a “good” grade. It was the first time in a decade that we had been recognised as a good school and it marked an important milestone in our school improvement journey.

Prior to this point, we were a “requires improvement” school. But does this mean we were plagued by truancy, inadequate teaching practices and poor exam results? Not at all. In my view we have always been a good school; we were simply unable to provide the inspection teams with the detailed information they required about the progress we had made. This meant we needed to change some of our practices in the school.

Out with the old

When I joined the school a few years ago as a maths teacher, I was unable to see the history of achievement of my students to help map their progress. Likewise, I could not tell if girls were doing better than boys, or if our English as an additional language students were exceeding expectations.

As a teacher you have a sixth sense about who is pulling their weight in class and who is not. Nevertheless, you cannot always be expected to spot when a quiet student has fallen slightly behind or when a gifted student could reach further.

A teacher could enter achievement data for their particular class in our school database and it would provide a value-added figure – but that was it. Analysing the progress of students involved a laborious, time-consuming process of extracting and manipulating data in an Excel spreadsheet. 

Out-of-date analysis

We analysed our exam results, but by the time we had completed the process the analysis was out of date and related to students who had already left the school. In short, it arrived too late to act on.

It became clear that we had to come up with a new approach to providing the senior leadership team, heads of faculty and individual teachers with up-to-date student progress data, especially as the intention was not to settle for good but to aim for outstanding.

We felt that better access to data would help us prepare for our next inspection. It would also ensure we could track student progress effectively in real time and step in while there was still time to make a difference.

But as with any big change in working practices, the art of making it succeed lies in knowing where to start. The last thing we wanted was for our teachers to be recording data for data’s sake. We wanted to ensure that any information we entered would help us do our jobs more effectively.

Having completed a trial with a few classes, we deployed the system across the school, giving all teachers access to assessment data and, more importantly, information about the progress of the students they were teaching. This was the first step towards an assessment system that would support teaching practice. However, middle and senior managers still had to navigate several different reports, and manipulate those reports, to answer the questions being asked of them.

Helping hand

So we sought the advice of a team of management information system (MIS) consultants. With their help, we were able to design reports that extracted and manipulated the data into a form that middle and senior managers could use easily. Being given information in a form that answered their questions saved them a great deal of time and effort and allowed them to concentrate on designing intervention strategies to improve progress.

These reports detail how Pupil Premium students are performing against other groups and how girls are doing against boys. They also make it very easy to spot which students or groups may need intervention.
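Purely as an illustration, the kind of group comparison these reports automate can be sketched in a few lines of Python with pandas. The column names, point scores and the flat student export below are assumptions made for the example, not the school’s actual SIMS report definitions.

import pandas as pd

# Hypothetical export: one row per student with cohort flags,
# a target and the latest assessment, both expressed as points.
students = pd.DataFrame({
    "name":           ["A", "B", "C", "D"],
    "gender":         ["F", "M", "F", "M"],
    "pupil_premium":  [True, False, False, True],
    "target_points":  [6, 5, 7, 5],
    "current_points": [5, 5, 7, 3],
})

# Progress relative to target: negative means below target.
students["gap"] = students["current_points"] - students["target_points"]

# Compare groups: Pupil Premium against the rest, girls against boys.
print(students.groupby("pupil_premium")["gap"].mean())
print(students.groupby("gender")["gap"].mean())

# Flag students falling behind for possible intervention.
print(students[students["gap"] < 0][["name", "gap"]])

The point is simply that, once the data is in one place, both the group-level picture and a named list of students below target fall straight out of the same export.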

This information-rich environment led to us developing an assessment cycle designed to cross-reference the roles of different managers, ensuring that no student was missed, whether they required pastoral or academic support.

Our heads of faculty are invited to attend an academic board meeting chaired by a director of learning and a member of the senior leadership team, where they outline the steps they plan to take to address any areas of concern in their subject areas. They are asked to return to a meeting later in the term to report on the outcome of any intervention.

We have also introduced a group that uses the data to look specifically at the performance of our students in English and maths. They identify those who have, for example, four A* to C grades at GCSE including English, but not maths.

We then ensure the maths teachers know who these students are and develop strategies to help them pass their exam. These meetings address all outcomes for students in terms of both attainment and progress.
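Again only as a sketch, the English-but-not-maths check described above reduces to a simple filter over a grades table; the columns and grades here are hypothetical, not the school’s real data.

import pandas as pd

# Hypothetical table of latest GCSE grades or projections.
grades = pd.DataFrame({
    "name":    ["A", "B", "C"],
    "english": ["B", "C", "D"],
    "maths":   ["D", "A", "C"],
})

a_star_to_c = {"A*", "A", "B", "C"}

# Students secure in English but currently short of a C in maths:
# these are the names passed to the maths department for targeted support.
focus = grades[grades["english"].isin(a_star_to_c)
               & ~grades["maths"].isin(a_star_to_c)]
print(focus["name"].tolist())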

Rapid interventions

Clearly, speed is of the essence when it comes to having an impact on a student’s progress. The ability of heads of faculty to identify areas of concern means interventions can take place immediately.

The new approach has also had an impact on a number of other critical areas. Analysis of teacher projections against actual results, together with the moderation built into the assessment cycle, has resulted in a 35 per cent increase in the accuracy of our projections for individual students.

Accuracy is critical: if we get it wrong we can hamper a student’s progress or, worse still, fail to intervene at all.
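A rough sketch of how that accuracy figure can be tracked each cycle is below, assuming accuracy is measured as the proportion of exact matches between projected and achieved grades; both the data and that definition are invented for illustration, and our own moderation is more nuanced.

# Toy data: one projection and one achieved grade per student.
projected = ["B", "C", "A", "C", "D"]
actual    = ["B", "C", "B", "C", "C"]

matches = sum(p == a for p, a in zip(projected, actual))
accuracy = matches / len(actual) * 100
print(f"Projection accuracy: {accuracy:.0f}%")  # 60% in this toy example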

Knowing what questions to ask

The other important step we took was to give teachers the tools to understand what the data was telling them. All teachers have been issued with a single sheet of paper listing the questions they need to ask themselves when looking at the information.

The questions need to be answered as if an Ofsted inspector were in the building. We need to consider how students of different abilities are performing.

For example, how are free school meals students getting on? Are students achieving the right grades? The list also serves to maintain a level of consistency among all teaching staff.

However, while heads of faculty and individual teachers can now access analysed data instantly, we also want to provide our senior leadership team with immediate access to whole-school data so that they can make more informed operational and strategic decisions.

We want them to be able to look at the school’s performance against the measures on which we are judged when Ofsted next drop by.

For everyone at Helena Romanes, good is simply not good enough. The work we are doing to improve the use of data across the school is already having a positive impact on the attainment of our students. 

Six years ago, approximately 50 per cent of students were achieving five A* to C grades at GCSE, including maths and English. Today, that figure is closer to 70 per cent. Following our most recent inspection, Ofsted praised us for our effective use of data and we believe it is playing a critical role in moving us closer to our goal of becoming an outstanding school.

  • Neal Foster is assistant headteacher of The Helena Romanes School, which uses the SIMS School Improvement Programme.

CAPTION: Data driven: Two students at Helena Romanes School in Essex