The arrival of the unvalidated RAISEonline data at the beginning of December prompted me to think about how we analyse our exam results data and, critically, how the exam results are communicated outside the school.
The results a school achieves are likely to be at, or at least very near, the top of prospective parents’ list of factors to consider when selecting a school. Therefore, how we report results to parents and others is critical.
To students sitting exams, and their parents, “results” means that day in August when they collect the piece of paper with their final grades. This is fair enough – clearly no 16- or 18-year-old is going to be too concerned about the overall performance of the school, or how it compares to previous years or to the school’s predicted results.
However, some of the difficulties a school may have in projecting the right message about its results inevitably follow from this concentration on one or two days in August. No newspaper, it seems, can resist the opportunity to publish a picture of delighted girls (it is almost always girls) receiving their results.
GCSE and A level results have become big news and receive massive coverage, both nationally and in the local press. And it is at this time that the local press will print, alongside the myriad of pictures of happy teenagers, the local schools’ “results” based on the usual performance measures (five A* to Cs etc).
The problem we then face is that, as far as coverage of a school’s results in the press goes, that is it. The local paper files away the numerous photographs of excited teenagers and makes a note to repeat the exercise in 12 months. From a school’s point of view we only have this one opportunity to publicise our results and get the message out about our performance. Parents’ and prospective parents’ perceptions of a school’s performance are largely formed in the immediate aftermath of the results days in August each year.
Unfortunately, what appears in the paper in August does not begin to tell the whole story about a school’s performance. All it provides is a very rough and ready indicator. We are trying to communicate a critical message about results without having all the necessary data. What is lacking, of course, in the August results jamboree is any indication of the value a school adds.
This is where RAISE data comes in. Unfortunately, by the time the RAISE data is available, that particular ship has sailed. I can’t envisage a local paper eagerly awaiting the RAISE data to publish in a special pre-Christmas supplement. In addition, it is not as easy to draw simple conclusions from the RAISE data – it is much easier to tell a story based on one percentage figure per school.
The RAISE data is critical in providing the necessary information to allow a school to assess its overall performance properly and identify exactly which areas performed well and which did not. While a school can undertake some analysis of results based on expected performance, it is only once the RAISE data is available that a school can obtain the national perspective. For example, what appeared to be a significant drop in English results this year, may, once RAISE data is examined, actually have been an improved performance. However, the fact that the RAISE data is not available until December makes the already challenging task of communication to the wider community even harder.
Although RAISE data is clearly intended primarily for schools’ internal consumption, I think it is important that schools try to communicate some of the subtleties contained within the data to parents, in order to give them as complete a picture as possible of a school’s overall performance.
Diary of a headteacher is written anonymously and in rotation by three practising headteachers from schools across the country.