Social media companies have 'spent too long ducking responsibility for the content they host'

Social media companies have for too long “ducked their responsibilities” for the content they host online and must be more “proactive” in protecting the children whose lives are now dominated by their services.

School leaders and the wider education community have welcomed the government’s plans, unveiled last week, to place social media firms under a legal requirement to protect their users.

The Online Harms White Paper aims to address issues such as inciting violence, violent content, encouraging suicide, disinformation, cyber-bullying and children accessing inappropriate material.

It sets out plans to create an independent regulator and a mandatory “duty of care” for social media firms, and promises “tough penalties” for firms that do not comply with the requirements.

The government states: “This will include a mandatory ‘duty of care’, which will require companies to take reasonable steps to keep their users safe and tackle illegal and harmful activity on their services. The regulator will have effective enforcement tools, and we are consulting on powers to issue substantial fines, block access to sites and potentially to impose liability on individual members of senior management.”

A 12-week consultation has now been launched on the plans, which have been welcomed by both the NSPCC and the children’s commissioner for England, Anne Longfield.

NSPCC CEO Peter Wanless said: “For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content. So it is high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so.”

Ms Longfield added: “The social media companies have spent too long ducking responsibility for the content they host online and ensuring those using their apps are the appropriate age. The introduction of a statutory duty of care is very welcome and something I have long been calling for. It should now be implemented as quickly as possible so that all platforms, no matter their size, are held accountable.

“Any new regulator must have bite. Companies who fail in their responsibilities must face both significant financial penalties and a duty to publicly apologise for their actions and set out how they will prevent mistakes happening in the future.

“The internet wasn’t designed with children in mind, but they are among its biggest users. Social media platforms dominate aspects of their lives in a way that could never have been imagined 30 years ago. With this power must come responsibility – and it can’t come soon enough.”

Paul Whiteman, general secretary of the National Association of Head Teachers, also backed the statutory duty of care proposal. He added: “Social media companies need to be more proactive. They need to be on the ball looking for material and have a clearer line on what is and isn’t acceptable, particularly where children and young people are concerned.

“Social media providers should take down not only illegal content but also legal material that could be harmful. These companies need to ask themselves: ‘Could this content cause harm to children or young people?’ If the answer is yes, then the content needs to come down.”
