
3 ways we tackle harmful social media content

Teaching students how social media algorithms work can change how they engage with what they’re shown, writes Tracey Neale
2nd March 2026, 5:00am


Almost every child in a secondary school owns a smartphone, according to Ofcom. This gives them access to a rich world of knowledge, but can also expose them to the extreme views of a minority - whether in videos or the comments underneath.

Preventing young people from seeing harmful content - including online misogyny linked to figures such as Andrew Tate - is increasingly difficult.

Discussions about adopting a social media ban for under-16s in the UK are underway, and this could help to stem the tide. But how can we protect the current cohort from damaging influences, such as the new wave of digital gaming communities that incubate and champion misogynist attitudes?

Simply telling students not to listen to or watch such content is one approach but, in my experience, it is rarely effective. What we have found more impactful is teaching students how these platforms and their users actually work, and encouraging them to question what they are being shown.

1. Reveal the algorithm’s true purpose

When I first started talking to students about social media algorithms, I realised many had little knowledge of how they work. Several believed their feed showed what was popular or what everyone else was watching, rather than recognising how personalised it is.

I explained how platforms deliberately promote content that’s specific to each user, in order to keep them scrolling; the only aim is to keep them online to view more ads.
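To make this concrete for students, it can help to show the basic logic in a few lines of code. The sketch below is purely illustrative and assumes a much-simplified model (real platforms use complex machine-learning systems, and the tags, weights and function names here are invented for the example): it ranks candidate videos by how long a user has previously watched similar topics, so whatever a user lingers on gets shown more, regardless of its message.

```python
# Illustrative sketch only: real recommendation systems are far more complex.
# The idea: score each candidate video by predicted engagement for one user,
# then show the highest-scoring videos first.

def rank_feed(candidates, user_history):
    """Order videos by similarity to what this user already watched longest."""
    def predicted_engagement(video):
        # Weight each of the video's topics by the seconds this user
        # previously spent watching that topic.
        return sum(user_history.get(tag, 0) for tag in video["tags"])
    return sorted(candidates, key=predicted_engagement, reverse=True)

# Hypothetical history: seconds spent on each topic so far.
history = {"fitness": 300, "finance": 120, "misogyny": 40}

videos = [
    {"id": "a", "tags": ["cooking"]},
    {"id": "b", "tags": ["fitness", "misogyny"]},  # borderline content
    {"id": "c", "tags": ["fitness"]},
]

feed = rank_feed(videos, history)
print([v["id"] for v in feed])  # → ['b', 'c', 'a']
```

Note what the sketch makes visible: the borderline video ranks first, not because the user sought it out, but because it piggybacks on a topic they already watch. Nothing in the scoring asks whether the content is harmful; it only asks what will keep the user watching.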

One UCL study usually gets them talking: researchers created a number of fake TikTok profiles mimicking young male users and found that, after five days of typical use, the algorithm was presenting four times as many videos with misogynistic content as it had initially.

When students see how quickly content can shift, they are really surprised.

We also show how influencers can slowly draw them in: what starts as self-improvement, fitness or finance tips can gradually lead to content that frames women as manipulative, inferior or responsible for men’s failures.

Once students understand the mechanics, they become more sceptical, starting to reflect on how their own feeds have changed.

2. Train teachers to talk confidently

In talking about this, we realised that many teachers lacked awareness of how algorithmic processes work and how they can become harmful. This led us to focus on digital literacy.

Indeed, research such as the Safer Scrolling report recommends a whole-school approach to a healthy digital diet.

For us, this has meant providing staff training so colleagues feel confident explaining algorithms or knowing the meaning behind an emoji.

When teachers can explain the possible significance of a hand gesture, such as one used by Tate and supporters, or the “red pill” emoji, often associated with misogynistic beliefs, I see students begin to unpick unhelpful opinions and question them.

3. Bust the myths

Training teachers to address misconceptions is another facet of tackling the issue. One thing we hear from students is the belief that many women make false allegations of rape.

Rather than dismissing these comments, we ask teachers to look at evidence with their students. In this case, we would look at research for the Home Office suggesting that only 4 per cent of cases of sexual violence reported to the UK police are found or suspected to be false.

I ask students to consider why the online perception feels so different from the data, and this gives them the confidence to challenge online rhetoric using evidence.

There is a lot that schools can do to help students understand more about the platforms they use, and how these provide airtime for extremists to feed their views to young people.

By opening their eyes, we can help students distinguish between well-meaning advice and dangerous dogma.

Tracey Neale is an assistant headteacher at Ysgol Gyfun Cwm Rhymni in Caerphilly, Wales

