Social media: Harvesting young people's data and attention for profit

The dangers of social media are widespread and its negative influence on our young people cannot be denied. Daniel Kebede says enough is enough

The pernicious effects of social media on our children have gone on for far too long. It is becoming an increasing concern for parents and teachers alike.

Smartphones have accelerated access to harmful online content and the algorithms of web giants like Instagram and TikTok ensure that it reaches people faster than before. These companies enrich themselves at the cost of the majority. They are exacerbating the mental health crisis of our young people without any sense of social responsibility.

It cannot be disputed that excess screen time distracts from learning and has a negative impact on children's education. Young minds are highly vulnerable to habit-forming behaviours that can lead to depression, anxiety and self-harm. It is no coincidence that the rise of social media over the past 20 years parallels a growth in poor mental health among young people.

The tokenistic efforts of social media companies to protect children are clearly not fit for purpose. As teachers, we are not naïve to the fact that children find ways to get around age restrictions. Peer pressure is immense.

Recent polling conducted on behalf of the National Education Union by More in Common with the New Britain Project showed that 83% of adults agreed it was too easy for young people to get around the age limit of 13; the same proportion are worried about children's exposure to harmful content online.

I am not surprised. The dangers are widespread. To take just one example: a couple of years ago the children’s commissioner for England warned that Twitter – as it was known then – had become the most common place that young people see pornography (41%), followed by dedicated pornography sites (37%), Instagram (33%) and then Snapchat (32%). The average age of first exposure was 13.

The hold of social media has, for many people, sidelined the mainstream press from their lives. People turn to Instagram, TikTok and Facebook for entertainment – but it is also where they are informed.

For many it may be their only regular source of news. Combine that with the forced appearance of trending videos and promoted posts in people's feeds, and is it any wonder that the spokespeople for misogyny, racism and conspiracy theories are in the ascendancy?

Children must be empowered to go online safely. We cannot trust social media giants to dictate the content that they see. Schools should also be supported to teach children digital literacy and critical thinking. The ongoing Curriculum and Assessment Review is a fantastic opportunity to embed this learning in all schools.

As a father and a teacher, this issue concerns me gravely. I identify with parents worrying about what their child is viewing online. And it is hard for us to know exactly what they are consuming when it is all hidden behind an algorithm that even experts cannot decipher.

In our polling, 86% of parents believe that raising the digital age of consent to 16 is now essential. Social media companies should be required to take responsibility for the safety of our children. If they fail to do so, they should be made to pay for the harm they are causing, with Ofcom granted additional powers to fine them.

Driven by greed, these billion-dollar behemoths show a disregard for the wellbeing of our young people and have created a digital empire that preys upon their vulnerabilities, harvesting their data and attention for profit.

We have already seen the positive action that the Australian government has taken in banning social media for all under-16s. There needs to be stricter enforcement so that young people cannot get around restrictions and more research into how they are accessing harmful content.

If these platforms have the capability to target users with content in order to keep them on the platform and please their advertisers, then they can just as easily identify dangerous content and act on it.

Yet it suits their profit margin to do nothing. If they are not capable of self-regulation, then government should regulate to force them to take responsibility for protecting our children.