Best Practice

Ed-tech in your school: Following the evidence

Cranbrook Education Campus has put evidence at the heart of its approach to how it procures and uses education technology. Head of campus Stephen Farmer offers his advice to other schools.

In the words of astronomer Carl Sagan, “extraordinary claims require extraordinary evidence”. When it comes to educational technologies, this is a mantra all school leaders should play on repeat. The primary consideration for schools purchasing ed-tech must be securing robust evidence.

In a normal week, I am approached by ed-tech suppliers on an almost daily basis. The variety of what is on offer is quite incredible. Rewind to the early weeks of the coronavirus shutdown and the cascade of free technologies on offer was as impressive as it was overwhelming.

The crisis has been a turning point for ed-tech. It has demonstrated the transformative role ed-tech can play and has increased the adoption of technologies at an unprecedented pace.

However, I am worried that with so much free tech on offer, the importance of assessing what ed-tech is right for learners has gone out the window. In the race to implement solutions to deal with such an extraordinary challenge, I can understand why the usual checks and balances might have been overshadowed.

As a self-proclaimed tech-lover, I am enthused by innovation and was keen to engage with technology in our response to the crisis. However, I was wary of jumping on new technologies that could not meet our evidence threshold. Both then and now, I am wary of any supplier unable to offer evidence of impact. After all, free does not equal useful, and futuristic does not mean impactful.

Having opened in 2015, Cranbrook has been able to incorporate technology from the start. Our approach has always been driven by evidence of demonstrable impact. Although enthused about the promise of technology, I am unapologetically sceptical about the claims some ed-techs make about their products.

In the words of Nesta, when it comes to ed-tech, “cycles of hype and disappointment are common features of the field” (Batty et al, 2019).

Late last year, a survey by Sparx – an ed-tech supplier with whom we work – found that more than half of teachers and school leaders do not trust claims made by ed-tech companies (Sparx, 2020).

I understand how hard it can be for schools to hack through the hype. We do not have the time or resource to undertake systematic, academic-level reviews of every solution that might answer our particular problem. Even when we seek the evidence, as Nesta asserts, “evidence is often missing, irrelevant, or hard to understand” (Batty et al, 2019).

Balancing a strategy that embraces innovation while also demanding the highest standards of evidence is challenging and there are no hard rules. However, there are frameworks that help us assess impact in a more systematic way. Here, in summary form, is our framework for schools seeking ed-tech evidence.

Demand more from ed-techs

Let’s be clear – ed-techs must take the lead in ensuring robust evidence is easily available to schools. Initiatives such as Edtech Impact and the newly formed EdTech Evidence Group (see further information) are challenging providers to make the availability and transparency of evidence a priority.

This is a positive trend, but there is work to be done. There is a role for school leaders too: encouraging teachers to ask the right questions, challenging suppliers, and setting an ed-tech strategy that places impact above all other considerations.

The basic questions we ask to start with when speaking to ed-tech suppliers include:

  • What is the impact?
  • How do you know?
  • Who can we speak to in a school about your product and its impact?
  • What staff training is included?
  • And – of course – how much is it?

Understand the value of the evidence you are looking at

According to the British Educational Suppliers Association, schools rely most heavily on peer recommendation to choose ed-tech products (BESA, 2017).

Certainly, one of the first steps we take when considering a new technology is to speak to other schools using it, and in turn we will be honest in our feedback to our peers when they ask about our experiences.

I am aware, however, that while recommendation is powerful, we should not rely on it alone. From in-house research and data analysis to systematic reviews, there is a mixture of quantitative and qualitative indicators of evidence we should be considering. That is, if these indicators exist; if they do not, that may be evidence in itself.

Consider co-design

It might be that a product is at an early stage of production and, as such, in-class research and analysis has not yet been undertaken. This is where you might consider whether your school can have a role in that research.

Such a collaborative project should not be undertaken lightly but can deliver real rewards. The benefit of a product that has been developed in collaboration with a school or wider trust is that the technology is not tested in isolation. In-school development gives the opportunity to test the pedagogy and implementation as well.

We have direct experience of this. For five years, we have worked with Sparx on the development of its maths learning platform. We have been instrumental in the evidence-gathering process and, as such, have had invaluable insight into how important such an approach is.

Being part of product development has allowed us to gather invaluable feedback from pupils, parents/carers and staff. Implementing standardised tests at the end of each year allowed us to properly assess the platform's effect and build robust evidence of impact.

This approach has forced us to adapt how we work with other ed-tech companies to understand their methodologies and approaches to demonstrating impact. Our key focus is evidence of impact. We are not going to invest precious time, money and resources in a product that does not have clear, demonstrable impact.

Set your objectives first

While we must demand more of ed-techs themselves, as school leaders we too have a duty to measure the impact of our strategies. When it comes to ed-tech, how do you measure impact in your own classrooms? Stepping back further, if you do not know what you are seeking to achieve, how do you measure its success?

For example, at Cranbrook 48 per cent of our students have SEND and 44 per cent of our secondary students are eligible for Pupil Premium. One of the strategic aims we have for technology is to help close the attainment gap for disadvantaged learners and ensure they do not slip through the net. However, in doing so we are also mindful of the impact on teacher workload. Another key objective for us is to reduce the administrative burden on our teachers. We therefore must measure impact on teachers as well as on students.

Being able to measure impact post-implementation is as important as gathering evidence beforehand. It is key that any technology you implement has the mechanisms to help measure its own success.

Post-corona return to evidence

Effective technology is at the heart of Cranbrook Education Campus’ founding strategy. However, technology is not always our go-to solution for the challenges we encounter – only where impact can be evidenced.

Post-corona, I have no doubt that schools will be keen to capitalise on the technological advances they were forced to undertake. No matter how attractive a technology was in those days of crisis, I would encourage all schools to now take a step back from reactive relief and return to the evidence.

  • Stephen Farmer is head of campus at Cranbrook Education Campus in Exeter.

Further information & resources