
Safeguarding work, and wider pastoral work in general, can sometimes feel like the poor relation in education when it comes to evidencing need and impact through data.
While spreadsheets are the domain of many of our curriculum leader colleagues, I don’t believe such systems can reflect the breadth of the quality work designated safeguarding leads (DSLs) and their teams do to keep the children in our settings safe. That work might be the endless phone calls to another agency to get the information needed to assess whether a child can return home that evening, the discussions with parents in financial difficulty on how best to support their child, or the staff training on specific emerging safeguarding issues.
All the work of a DSL and their team is crucial and is sometimes quite literally life-saving. The problem comes, however, when we need to evidence our successes or sometimes evidence need in order to get external support.
While figures and statistics can show a child has improved their learning journey over time, as safeguarding professionals it can be hard to neatly capture the improvement an emotionally neglected child has made in self-belief through the early help support put in place by a dedicated and skilled learning assistant, for example.
Even where we do have more “traditional” data – perhaps the number of children for whom child protection referrals have been made, or the number on a Child In Need Plan – this is still limited in its usefulness. Depending on your setting’s context, you might have higher or lower numbers, and of course a poorly executed “plan” does not improve the lived experience of a child in any case.
Quantitative data
Quantitative data is the more traditional way we evidence work: it assigns a numerical value to what we observe. We can use it in safeguarding, and it can be useful.
For example, we can give our percentage of students receiving free school meals and those receiving any type of Pupil Premium support, which can help give crucial context to some of the challenges our setting might face.
We might use local health prevalence data to look at percentages of children who are obese, to discuss potential types of parental neglect, or to demonstrate prevalence of substance misuse in a community to help develop a relevant safeguarding response in the curriculum offer.
The potential problem is that sometimes quantitative data can be used in a misleading or incomplete way.
For example, imagine we confidently reported a high number of cases referred to our local Multi Agency Safeguarding Hub (MASH) team. This data could be used to demonstrate “excellent” practice – staff confidently noticing signs and symptoms of abuse and neglect, or a culture where children feel safe enough to disclose.
However, as with any data, it can hide a multitude of issues. If in practice many of these referrals have been deemed not to have “met the threshold”, then the practice isn’t excellent – far from it. It demonstrates a clear need for support to ensure consistency and accuracy among the safeguarding team: either DSLs need training to complete referral forms in a more complete and clear way, or there is a lack of understanding of the local area threshold document.
Qualitative data
Qualitative data is the non-numerical data we have in our settings. In terms of safeguarding this is where we generally hold our more meaningful data. The problem is that the information we hold about children in this regard is often not as tangible.
How do we measure improvement of this type in a way that is meaningful for inter-agency colleagues? It’s the age-old question: how do I know that what I meant is what you understood?
One thing to avoid is the use of “educational jargon” as this won’t necessarily be useful to colleagues in a different sector. Use plain English wherever possible.
Ask for clarity in reports if something is unclear – otherwise our data isn’t as rich as it could be. Where plain English alone cannot convey the concern, use phrases from the local threshold document to justify it. This makes it harder for colleagues in the MASH to reject the referral: the concern is clearly stated, and we are repeating their own language back to demonstrate the level of concern for a child (the MASH works closely with the Local Safeguarding Children Partnership, which creates the threshold layers of need).
Also, be as detailed as possible in the parts of the referral form that ask for specifics – dates, times, witnesses – and use the words of the child where possible.
In terms of describing a noticed change in a child, again be specific. If a child’s communication and emotional regulation has improved, then detail how and why. What evidence is there? Do you have an example of something that they used to find challenging but have recently accomplished? Be descriptive and precise – especially if we are detailing a decline in skills and attitudes. That is rich data.
If you are looking at the safeguarding culture and ethos in a school, what qualitative data have you gathered? Have you led student focus groups, and if so, have you worked with a variety of groups of students – mixed ages, mixed academic ability groups, mixed ethnicity, with louder and quieter students?
Have you asked children to document how safe they feel in your setting? One example would be to ask children to RAG-rate their school setting (where red feels unsafe, amber feels sometimes safe, and green feels safe) around how physically and emotionally safe different areas feel, followed by a discussion of why certain areas feel more or less safe.
A reason often given for areas feeling less safe during unstructured time is that staff are not “on duty” in the correct area promptly, allowing incidents of bullying to occur while an area is unsupervised.
Using the right data
Both quantitative and qualitative data have their place and can contribute value. The problem comes when the wrong one is used to try to justify action or inaction.
The key is understanding what information is needed to best support a child and which format is going to help you to get any inter-agency support that is needed.
Quantitative data can be useful to show a child’s academic progress has been limited, for example, due to a known or suspected safeguarding issue. Qualitative data around the same issue would, however, give a potential explanation for types of behaviour demonstrated and potential support that could be effective.
Finally, you may find it useful to list your sources of information about a child’s wellbeing and progress – possibly using a staff meeting, so that all colleagues consider the types of information they can use to evidence concerns about a child.
- Jo Perrin is an experienced education adviser at Services For Education and interim lead of the team of School Support Education Advisers. She has previously held roles as a designated safeguarding lead and pastoral lead in the education sector and has a wealth of experience in teaching PSHE and expertise in childhood trauma from her time as a foster carer. SFE offers training, audits and support for DSLs and school staff. Visit www.servicesforeducation.co.uk/safeguarding