A lot of technology in schools does not perform as claimed or is inappropriate for the needs of teachers and students. Is this finally about to change? Professor Rose Luckin reflects on the direction of the government’s ed-tech strategy

The government’s announcement of the first ed-tech strategy earlier this year (DfE, 2019) began to pave the way for a new approach to technology in schools. It seemed that, for the first time, a secretary of state understood its relevance and the role it could play in aiding teaching and learning.

For those of us working in this burgeoning sector, the strategy – which was finally launched in the spring with the promise of funding worth £10 million – felt doubly significant.

In the first instance, the Department for Education (DfE) was sending out an important message that the entrepreneurs developing educational technology needed to up their game and ensure that their products and services were robust, valid and fit for purpose. It was an acknowledgement that ed-tech was, and is, here to stay.

But more importantly, perhaps, there was an expectation that schools and educators would engage with the process and have a role in shaping how this new educational landscape should look.

Since our launch in 2017, UCL EDUCATE has worked with more than 200 educational technology companies, whose products range from artificial intelligence chatbots able to provide real-time support to learners, to assessment tools that can evaluate progress over time, to software that can address teaching and learning problems in virtually every subject of the curriculum and for every age group.

We believe that industry, academia and schools need to collaborate on developing high-quality technology that has a positive impact on its users.

To that end, we are developing EDUCATE for Schools as a resource to help teachers to share their practices with ed-tech in the classroom – helping to dispel some of the myths and fears about using ed-tech while pointing teachers and schools in the direction of what is more effective and purposeful.

From the outset we aimed to bring together technologists and educationalists within an educational research community to help companies develop ed-tech that is research-informed, effective and fit for purpose. As our cohorts progressed through the programme it became apparent that what was also needed was the input of schools into the ed-tech creation and development process – a view now clearly also embraced by the government.

A study published by the British Educational Suppliers Association (BESA) two years ago revealed that only 44 per cent of primary and 31 per cent of secondary schools in England reported that the educational technology they had implemented helped them to achieve what they set out to do (BESA, 2017).

The findings suggest that much of the technology being used in schools either does not perform the way the designers claim or is not appropriate for the schools’ needs. Our EDUCATE for Schools resource will support schools to be more discerning about the ed-tech that they choose to adopt.

As a research-based programme within a research institution, evidence forms the foundation of what UCL EDUCATE does. When we work with ed-tech companies, one of the first things we do is to teach them how to identify the different types of evidence that might influence how they design their product or service.

Similarly, it is important for teachers to understand the available evidence and to make informed decisions about what might be suitable for their needs. This might include anecdotal evidence, which is often passed on by word of mouth among colleagues or from another school, but which can be subjective.

Descriptive evidence, on the other hand, can be qualitative or quantitative and is useful in providing data about the characteristics of users of a product or the environment in which it is being used. This could help schools to identify technology that has worked in schools with similar student populations or contexts.

Correlational evidence, meanwhile, establishes that two (measurable) variables are related, but not that one necessarily causes the other. For example, time spent practising a skill is associated with improved outcomes, such as when learning to play the piano. This type of evidence is very common in educational research.

It is much harder to show that a particular resource or pedagogy actually causes a particular outcome – something that experimental research seeks to establish through randomised controlled trials. Causality is very difficult to establish in social science research because it is so hard to isolate an intervention as the only variable that could have produced the outcome – not least because an intervention can rarely be implemented in exactly the same way in hundreds of classrooms.

Given that there are so many different types of evidence, a mixed-methods approach is most likely to be needed to enable informed decisions, and this is as true of ed-tech use in schools as anywhere else.

The first thing we advise schools to do, therefore, is to conduct a needs assessment, set alongside the school development plan, to identify whether technology is, indeed, what would address their needs or challenge – because it might not be.

Such an exercise best involves all relevant staff (including the technology lead), alongside senior leaders, and ought to focus on the school’s objectives rather than the technology itself.

Once a school has decided its needs, we advise undertaking an inventory of the technology that is already in use or is taking up space on the school server, alongside occasional surveys of staff to find out what capacity they have and how willing they are to adapt to new technology. This information will determine how much professional development might be needed before piloting and adapting ed-tech.

Armed with this information, schools should be in a better position to find the best product for their needs. A little research is needed to establish whether a product does what it claims to do and what conditions are required for it to be most effective. Look carefully at any existing evidence relating to the impact of the products and be critical about its robustness. Has the school been involved in any pilots of its own?

Trying to determine whether the findings presented in a company’s marketing materials are both reliable and relevant is not easy. Some ed-tech companies allow schools to test out their product before committing to buying, which requires the school to plan an effective pilot, with a sample of participating teachers and students. Our EDUCATE for Schools resource provides such a framework.

Within UCL EDUCATE, there is an expectation that the companies we work with carry out their own research into the efficacy of their products. Little Bridge, for example, a global peer-to-peer learning community for primary-age children learning English, now holds millions of data points and documented feedback from users around the world, which allows it to demonstrate that children who are more “social” on its platform achieve better learning outcomes.

Another company, Across Cultures, which works with EAL learners, used internal and external assessment data to track pupil progress, and can now demonstrate that learners made progress in reading comprehension and decoding age.

Findings such as these give the companies leverage in attracting interest from teachers and schools and set them apart from the “snake oil salesmen” they are often warned about.

In devising EDUCATE for Schools, our research team worked with the Hammersmith Academy in west London. Head Gary Kynaston said the loss of the British Educational Communications and Technology Agency (Becta) in 2010 had left a vacuum, with schools needing help and support in making decisions around technology use. He said that ed-tech was “a minefield and a whole different language” to that which schools were used to, so teachers needed a tool that would help them to ask the right questions about which technology to use.

He added: “A lot of schools have technology but it’s just sitting there and we’re not getting the best out of it.” At the same time, he warned against over-reliance on ed-tech or seeing it as a “panacea, a silver bullet that just doesn’t exist”.

“It is important to take a step back and consider carefully what technology to use and what to spend money on,” he said.

We hope the new secretary of state, Gavin Williamson – or indeed his successor should things change after the election on December 12 – will continue the commitment to fit for purpose, effective ed-tech initiated by the DfE earlier this year.

  • Rose Luckin is professor of learner centred design at the UCL Knowledge Lab in London, and director of the UCL EDUCATE programme.

Further information & resources