Monitoring: Keeping up-to-speed

Written by: Chris Parr | Published:
Sponsored

The risks and threats to student wellbeing and safety are changing all the time, including the vocabulary associated with those risks. Chris Parr speaks to one digital monitoring company about how they keep up

On the face of it, digital safeguarding in schools seems very straightforward: monitor what is said or viewed by pupils online and when something questionable is flagged up, deal with it accordingly.

Indeed, most schools will have policies and processes in place designed to respond quickly and appropriately when red flags are raised. After all, recently updated Department for Education (DfE) statutory guidance, Keeping Children Safe in Education, states explicitly that it is essential that children at school are safeguarded from potentially harmful and inappropriate online material.

The use of technology has become a “significant component of many safeguarding issues”, the DfE says. According to the guidance, this includes cases of sexual exploitation, radicalisation, and sexual predation.

“Technology often provides the platform that facilitates harm,” the DfE document states. “An effective approach to online safety empowers a school to protect and educate the whole-school community in their use of technology and establishes mechanisms to identify, intervene in, and escalate any incident where appropriate.”

However, keeping track of what pupils are doing online and what their actions and interactions might mean from a safeguarding perspective is no mean feat. How, for example, can a teacher or safeguarding lead keep up-to-date with all the latest euphemisms for a particular drug? How can they monitor every email sent between pupils, the sites they visit, videos they watch, documents they download from USBs, social media exchanges and so on for possible threats or red flags that might indicate something is amiss?

This is one of the problems that eSafe, a safeguarding company that provides digital monitoring solutions to schools, has been addressing since 2009. And it is not an easy task.

“Schools are expected to safeguard against a huge range of different things these days,” explained Mark Donkersley, eSafe’s CEO. “They include possible cases of child abuse, radicalisation, early sexualisation, self-harm, female genital mutilation as well as mental health issues.

“A lot of schools will feel that they have very effective processes in place, but no-one can realistically expect them to be experts in all of these areas.”

An insight into this problem can be found within eSafe’s team of behaviour analysts. These specialists are the ones who look into all of the warning signs flagged up by the eSafe system through its monitoring of online and offline digital activity within a school, even including words typed by pupils that are never saved as documents or sent as emails.

These analysts are charged with assessing the red flags that are raised and determining whether or not any further action is required. It is their knowledge that determines whether a seemingly innocuous exchange is indicative of something more serious – or, indeed, if an ostensibly worrying incident is actually nothing to be overly concerned about.

Typically, analysts are educated to degree level or higher in a relevant discipline, such as psychology or criminology. They are often multilingual, and have practical experience of working with children and young people.

Penny is one of the analysts (her name has been changed to protect her identity): “Before I became an analyst, I worked in social care and education for more than 13 years in a variety of different roles, working directly with children and young adults with emotional and behavioural problems,” she said. “That’s given me a real understanding of young people – their behaviour and how they respond in different situations.”

The behaviour analysts come from a range of different backgrounds, and include former teachers, mental health and social care specialists, as well as others with experience of working with vulnerable young people. They receive ongoing training to recognise the most subtle markers of safeguarding risks across the range of behaviours and to ensure the correct processes are followed when potentially dangerous behaviour is flagged.

“When we do receive notification of an incident, we are able to determine its severity level,” Penny explained. “While software on its own might be able to detect a possible threat, the analyst can put it into context to help the school take appropriate action.”

It is this human aspect of digital monitoring that can prove vital in catching problems early. There is still, however, a key role for artificial intelligence. While the technology cannot determine the level of risk, it looks for terms, contextual content and patterns of activity that indicate a potential issue with a young person’s welfare and wellbeing.

The big challenge here, though, is ensuring that the database of words and phrases that the technology is looking for is kept up-to-date.

“With certain kinds of behaviours we are looking to pick up on, there is clearly an inclination to try and disguise them,” Mr Donkersley continued. “It is human nature. Those involved in selling or buying drugs, for example, are constantly coming up with new euphemisms to disguise what they are doing.”

He gives the example of the phrase “a wonderful bath” (which refers to synthetic cannabis), as a term that has been used to avoid detection. “We have to stay on top of these developments,” he added.

Ensuring that the technology is up-to-date is the job of eSafe’s InsightLab. Terms and phrases associated with behaviours and conditions as diverse as mental health, substance abuse, grooming, extremism, bullying and suicide risk are dynamic and ever-changing. They can become obsolete very rapidly, and to prevent the monitoring function becoming redundant it is critical that the detection technology is maintained with current markers.

Ongoing research, input from schools, and information from agencies such as the police and others involved in child safeguarding all help the InsightLab to constantly update its database of potentially indicative language.

For example, the popular online game Doki Doki Literature Club (DDLC) is considered a risk to young people who are emotionally vulnerable or who have existing mental health concerns. When DDLC first emerged in September 2017, the InsightLab moved quickly to establish the phrases associated with the game that could indicate safeguarding dangers.

In one case eSafe picked up, a user typed the word “daijobu” multiple times into a script during a coding class. While this Japanese term literally translates as “I’m okay”, DDLC players use it to mean the opposite. The example indicates a user who has gone “off task” and is using the lesson to “download” their thoughts and feelings. It is a sign of negative behaviour and is potentially linked to mental health problems.

“The InsightLab is a team that is interrogating trends and themes that are occurring within the digital environment in order to understand what’s happening out there with young people,” explained Josh, a researcher in the team.

“What essentially happens is that a particular word or phrase is identified as a marker of a particular safeguarding risk and we work to understand all of the structures around it. It isn’t enough to just know the main words and phrases, we also look at the breadcrumbs – all the language that is associated with it that might help us to identify possible risks.”
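The marker-and-breadcrumbs approach Josh describes can be pictured in miniature. The sketch below is purely illustrative and is not eSafe’s actual system: the marker database, risk categories and function names are all assumptions, and the two sample phrases are simply the ones quoted earlier in this article. The point it demonstrates is the division of labour described above: software surfaces matches, while judging severity is left to a human analyst.

```python
# A minimal, hypothetical sketch of marker-based flagging.
# MARKERS and its categories are illustrative assumptions, not real detection rules.
MARKERS = {
    "wonderful bath": "substance abuse",  # euphemism quoted in this article
    "daijobu": "mental health",           # DDLC-related term quoted in this article
}

def flag_for_review(text: str) -> list[dict]:
    """Return raw phrase matches for human review; no severity judgement is made here."""
    lowered = text.lower()
    hits = []
    for phrase, category in MARKERS.items():
        count = lowered.count(phrase)
        if count:
            hits.append({"phrase": phrase, "category": category, "count": count})
    return hits

# Repeated use of a marker is surfaced, not judged, by the software.
print(flag_for_review("daijobu daijobu daijobu"))
```

A real system would of course go far beyond literal string matching — the “breadcrumbs” Josh mentions imply context-aware models rather than a flat phrase list — but the hand-off from automated detection to human assessment is the same.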
It is this blend of intelligent software detection and nuanced human interpretation that is key to identifying the warning signs effectively, Mr Donkersley added.

He continued: “We use machine learning and artificial intelligence to bring potential concerns to the surface, but it is then down to our behaviour analysts to apply their in-depth knowledge and actually determine whether further action is needed.
“That human part of the process is very hard to automate.”

  • Chris Parr is a freelance education journalist.

Sponsored Content

This article has been published by SecEd with sponsorship from eSafe. It was written and produced to a brief agreed in advance with eSafe.

