Imagine the much-respected inspectors doing a survey of car handbrakes and reporting that 30 per cent were inadequate.
Keep imagining, as another survey by specialist inspectors found that 25 per cent of all gearboxes were ineffective. A further group of inspectors found that a third of the car engines they observed were unsatisfactory. Imagine what you would think if you were then told that general car inspectors had found more than 75 per cent of all cars were good or better.
If the specialists are highlighting significant problems, it seems hard to believe that the overall picture is good. There is either something wrong with the data, the inspectors or the process. We would probably be very tentative about driving a car described as good for fear that some serious inadequacies lurked beneath the surface – even though the individual car had the official quality mark.
That, though, is imagination, not reality. In schools, in 2013, specialist inspectors called HMI conducted a survey of PSHE and reported that 40 per cent of schools required improvement or were unsatisfactory. This followed another specialist team of HMIs recording that music provision was inadequate in 20 per cent of schools. They saw the gap between the best and the worst becoming “wider still”.
In other subject disciplines, specialist HMIs have found worrying signs. In art, craft and design, 50 per cent of schools were graded below the acceptable level. In science, teaching was not good enough in a quarter of schools. In PE, the proportion of schools not good enough was 25 per cent. Consistently, these much-respected HMIs report that the proportion of underperforming schools in the sample they inspect is a cause for concern.
How does this level of concern square with the chief inspector’s much-vaunted figure of 78 per cent of schools being good or better when inspected by the generalist Ofsted teams?
Sometimes, when I am with friendly heads, I do my best Christopher Robin voice and whisper this contradiction quietly. The room goes eerily silent. Eventually someone proffers the view that the general Ofsted teams inspecting schools are looking generally, whereas the HMI subject surveys are looking in depth. It goes quieter still when the room realises what this means: that many so-called good schools are good on the surface, rather than in depth.
We know why it happens. High-stakes accountability drives performance management, with league tables and inspection outcomes becoming the significant reason for being. Dedicated specialist teachers long for the opportunity to become immersed in their discipline: exploring the poem, investigating the historical sources in depth, carrying out long-term analysis in chemistry, building a real business, extending their expertise in gymnastics, or developing the use of tools and equipment.
Instead the pressure to “get on” with delivery and coverage drives teaching rather than learning, as youngsters are taken through the slalom of activity that will get the required “outcome” in the exam.
And results do matter; schools often open up to the local press on “results day” to have the whooping and hugging celebrated as students find out their grades. The results could be sent by email but this now traditional ceremony has become vital as a way for teachers and pupils to join together “after the game”.
Pupils rightly recognise that their hard work at trying to pass exams has paid off. Teachers rightly recognise that they have done their best at helping their pupils pass the exams. Both parties rightly recognise that they were in it together and are indebted to each other in different ways, often in spite of the pressure each put on the other at times during the past couple of years.
Sadly, though, numerous young people rejoice in never again having to “do” certain subjects and many teachers acknowledge that they did not fill the pupil with the joy of the subject discipline, but succeeded in simply taking them through a course of study with the right outcome. There is no disgrace in that, but many are left wondering whether they came into teaching simply to boost league table scores.
Lots of pupils later question why they did multiple GCSEs when they only seem to be asked if they got five. But these statistics matter because the data leads the inspection and the inspection outcome has such impact upon the work of the school and on jobs and careers.
It is a hard one to discuss. Given 78 per cent of us work in so-called good or better schools, why would we want to create waves and challenge the process? Having got the accolade and reduced the pressure on their school, who would complain?
Headline data, progress measures and average point scores become vital signs. Hence we are told that schools are getting better; yet the nagging feeling remains that all is not well – especially if we look under the surface.
Given the growing capacity of Ofsted to question data and find evidence, it is surprising that the contradiction between its analysis of school quality overall and its specialist inspection of component parts has not been called into question.
It is as though the quality of a car is judged on its odometer reading, the clarity of its windscreen, what the instruments on the dashboard say, and whether the accelerator works. What’s under the bonnet is another matter.
Further information
The 21st Century Learning Alliance is a forum with representation from practitioners and industry that debates difficult issues to help stimulate improvement and change. Visit www.21stcenturylearningalliance.org or follow on Twitter @Learning_21C
Mick Waters is a professor of education at Wolverhampton University and member of 21st Century Learning Alliance. His latest book, Thinking Allowed on Schooling, was published in April.