Best Practice

Empowering every student: Creating inclusive classroom resources with AI

It is vital that our classroom resources reflect the diversity of our communities, but as more teachers turn to AI, this goal is under threat. Teacher Ben Merritt looks at some AI pitfalls and recommends tools for creating inclusive classroom resources
Image: Adobe Stock

Imagine yourself in a classroom. You feel disconnected from the images, the stories, the subject content, the explanations, the videos. You feel invisible.

What if, through the power of AI, we could create a learning environment where every student feels seen, validated and inspired?

This is more than a technological advancement – this is a revolution in connecting with students and empowering them to be what they see.

But are we fully harnessing generative AI to drive this immense potential? And are AI platforms effective in making the most of this transformation?

 

AI mirrors its sources, not the world’s true diversity

I typed “An image of a strong, powerful climber climbing skilfully up a mountain” into Bing Image Creator, which uses the DALL-E model from OpenAI.

The first four results matched my description perfectly – only every climber was a young, white male. The same with the next four results. And the next. In fact, all 60 images in my daily free allowance depicted young, white males.

An AI tool’s output is only as good as its sources. Those sources often reflect the biases ingrained in society (Buolamwini & Gebru, 2018), and Bing Image Creator’s datasets are made up largely of westernised, publicly available images and stock image libraries (see Rosset, 2020).

 

Google’s attempt backfired

Aiming to rectify these biases, the morally virtuous Google trained Gemini’s image generator to “diversify depictions of all images with people to include descent and gender” (see Milmo & Hern, 2024).

This over-simplification, coupled with a woeful lack of testing, meant that Gemini became a laughing stock of the GenAI world as it dutifully and unironically churned out images of the US Founding Fathers, Second World War German soldiers, and even Vikings depicted as people of colour, many of them women (see Shamim, 2024).

At best, these images were historically inaccurate. At worst, images of black people wearing swastika-emblazoned Nazi uniforms were incredibly offensive.

 

It's on us

It would appear that AI image creators either fail to represent the world’s true diversity or fail when they try to rectify the problem.

One thing I have learnt from my school’s journey to achieving our next Anti-Racist School Award (from the Centre for Race, Education and Decoloniality) is that it is not enough to be passively “not racist”; rather, we need to be part of the change we want to see.

Regardless of your thoughts about whether AI image creators should have to manipulate their datasets, the fact remains that the moral obligation falls on the user to redress that balance.

We have a duty to ensure that we do not amplify pre-existing societal prejudices. We must actively and intentionally make our classrooms inclusive.

 

Force the issue, redress the balance

Seeing themselves represented in educational content positively affects students' self-esteem, academic performance, and sense of belonging (Ladson-Billings, 1995).

By harnessing GenAI to help us to create front-of-class resources, we have a golden opportunity to restore equity.

So, changing my earlier prompt slightly to “An image of a strong, powerful female Ugandan climber climbing skilfully up a mountain” resulted in a much more impactful picture that suddenly resonated with more students.

But of course we don’t stop at ethnicity. Consider other protected characteristics, such as disability, gender, sexuality and age, and strive to create a more diverse – and more accurate – picture of the world our young learners are growing up in.
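For colleagues who want to make this prompt-refinement habit systematic, the idea can be sketched in a few lines of Python. This is purely illustrative – the template and helper function are my own, not part of Bing Image Creator or any other tool – but it shows how one base prompt can be filled with explicit characteristics before being pasted into an image generator:

```python
# Illustrative sketch: weave explicit characteristics into a base image prompt
# so the generator cannot fall back on its default, homogeneous depiction.
from string import Template

# The subject slot ($person) is where we make diversity explicit.
BASE = Template(
    "An image of a strong, powerful $person climbing skilfully up a mountain"
)

def describe(noun: str, *qualifiers: str) -> str:
    """Prepend explicit qualifiers (gender, nationality, disability, age...)
    to the subject of the prompt. With no qualifiers, the generic noun
    is returned unchanged."""
    return " ".join([*qualifiers, noun])

generic = BASE.substitute(person=describe("climber"))
specific = BASE.substitute(person=describe("climber", "female", "Ugandan"))
```

The same template can then be reused across lessons, swapping in different qualifiers each time, so that over a scheme of work the full range of protected characteristics is represented rather than just one.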

 

Beyond images

As well as Bing Image Creator, I have had success creating diverse images using Adobe Firefly, Flux and Playground (see further information for links).

I would also include images created and then curated by other GenAI tools, such as the presentation creators Gamma and Curipod.

Diversifying our curriculum by providing more specific prompts for image creation is an easy win, and there are plenty more opportunities to represent a range of protected characteristics more accurately.

When generating anything text-based using, for example, ChatGPT, Copilot (which is built on the same OpenAI models as ChatGPT), or Gemini, we can prompt it to use modern Turkish names, or a common Sudanese family tradition, or a typical Peruvian dish, or a popular Buddhist celebration.

If you are yet to explore AI-generated songs, I urge you to try out tools such as Suno and Udio. What student does not love a bespoke song to help them remember key mathematical formulae, or the monarchs of the UK, or the present tense endings of “-er” verbs in French?

A key advantage of these AI song-creation tools is that the outcome does not need to be flawless to be effective. Indeed, a naff song can enhance its value as a memorable and engaging learning tool. As with all things GenAI-related, the output is limited only by your imagination, so experiment with songs in the style of reggae or bhangra, or Afrobeat.

One of my go-to AI tools is HeyGen. Among other things, HeyGen allows you to upload a video of yourself speaking, which it can translate into any one of 29 different languages, seamlessly synchronising your lip movements while preserving your original voice. The potential for fostering inclusion with this tool is immense.

Imagine how much more welcomed and valued EAL families will feel when your “Welcome to our school” video goes out to them in their native tongue.

 

Navigating the pitfalls

We already know that GenAI relies on problematic and unrepresentative datasets: AI systems are trained on large corpora that can embed the very biases we seek to overcome, resulting in content that unintentionally reinforces stereotypes.

Research from the Brookings Institution (Turner Lee et al, 2019) and Harvard (Sweeney, 2013) has shown that GenAI models often replicate biases present in their training data, leading to outputs that can be misleading or harmful.

Be cautious of tokenism. Adding diverse elements to AI-generated content without context or depth can result in superficial representation. Tokenism can inadvertently reinforce stereotypes rather than challenge them. Instead, strive for meaningful and thorough integration of diverse characters and perspectives to enrich the material authentically.

And, as Google’s Gemini has already spectacularly proven, GenAI's attempt to diversify content can sometimes result in inaccuracies and even culturally insensitive images.

All of this shows that we must never assume that the content we get back from GenAI is accurate – always do your due diligence when presented with new information. As the experts in the classroom, we must retain the professional oversight of everything generated by AI (see Luckin et al, 2016). Without this we can find ourselves with content that misses the mark in terms of inclusivity and relevance.

With great potential to revolutionise education comes great responsibility. AI complements, rather than replaces, the nuanced human touch. For now.

 

Educate with integrity

When we take the time to create an inclusive classroom, we are proactively and collaboratively building a school culture that is proud to show that it values diversity and inclusion.

As well as making students from diverse backgrounds feel seen and valued, using more inclusive resources increases the cultural capital of everyone in the room by providing a more realistic representation of the world (Nieto, 2010).

Moreover, these inclusive resources can ignite valuable discussions in the classroom. When students see themselves and their cultures represented in the materials, they are more likely to engage and share their experiences, leading to richer, more meaningful conversations. These discussions not only enhance learning but also foster empathy and understanding among students from diverse backgrounds.

While AI is a powerful tool, it is evidently up to us as educators to guide its use. We cannot rely on AI alone to create the inclusive, diverse resources our students deserve.

By using these tools and adapting how we use them, we are not simply piquing the interest of a few extra students, rather we are representing the world more accurately for all.

It is difficult to overstate how impactful a school’s resources, lessons and subject content are. For nearly 1,000 hours per year, impressionable young people are presented with information that educators have carefully chosen to impart.

The power of AI lies not in the technology itself, but in how we choose to use it. It is a huge responsibility.

 

Further information & resources

 

References

  • Buolamwini & Gebru: Gender shades: Intersectional accuracy disparities in commercial gender classification, Proceedings of Machine Learning Research (81), 2018: https://tinyurl.com/ujnzz2sb 
  • Ladson-Billings: Toward a theory of culturally relevant pedagogy, American Educational Research Journal (32), 1995: https://buff.ly/4dDioZf 
  • Luckin et al: Intelligence unleashed: An argument for AI in education, Pearson, 2016: https://buff.ly/4dWfWwA 
  • Milmo & Hern: ‘We definitely messed up’: Why did Google AI tool make offensive historical images? The Guardian, 2024: https://buff.ly/49OHhPF 
  • Nieto: The Light in Their Eyes: Creating multicultural learning communities, Teachers College Press, 2010: https://buff.ly/3Mjm5Y5 
  • Rosset: Turing-NLG: A 17-billion-parameter language model by Microsoft, Microsoft Research Blog, 2020: https://buff.ly/2HdEvaU 
  • Shamim: Why Google’s AI tool was slammed for showing images of people of colour, Al Jazeera, 2024: https://buff.ly/4g2jaR1 
  • Sweeney: Discrimination in online ad delivery, Social Science Research Network, 2013: https://buff.ly/3X43B2r 
  • Turner Lee, Resnick & Barton: Algorithmic bias detection and mitigation: Best practices and policies to reduce consumer harms, 2019: https://buff.ly/4cGln1I