Case study: Developing a revision cycle for pupils

Written by: Iain Wilson

Wilson’s Revision Cycle was developed after a trial period involving students and staff at Hindley High School. Iain Wilson explains the research, evidence and process behind this work

In November 2016, I wrote in SecEd about ways to get a team on board with a new initiative, offering advice for both during and after the implementation of any notable change (see further reading).

One of my most important pieces of advice was to make sure that any initiative is personal, useful and, where possible, a clear time-saver. Crucially, we must also share the outcomes – no matter what.

Using this approach, we have built a team at Hindley High School that is working towards a common goal and is keen to share its research nationally and internationally.

It was this ethos that led to recent research on Wilson’s Revision Cycle winning the Chartered College of Teaching’s 2018 key stage 4 and 5 research poster competition (the winning poster is available as a download supplement).

In this article, I will focus on what Wilson’s Revision Cycle is, how it has been developed, implemented and evaluated, the impact, and the pitfalls and challenges we faced.

The revision cycle, illustrated in the poster download, is an experiential learning method designed to improve a pupil’s self-regulation of work while embedding concepts and knowledge in an ordered way in long-term memory, rather than relying on short-term memory.

It is a tool that I developed over 10 years and refined during Ambition School Leadership’s Teaching Leaders Programme, designed to improve the self-regulatory and divergent nature of learning by using a “chunking” method (see the poster download for more detail).

The techniques are based on a revisiting, or interleaved, format of learning, providing opportunities for learners to reflect on their learning, identify misconceptions and weaknesses, and be supported in overcoming them.

Development and implementation

First, it is key to recognise that it is all about the students – so we should be involving them from the outset. Building the revision cycle took time and experimentation. We had to look at things from the students’ perspective, rather than our own experiences, and we drew upon the success of an interleaved metacognitive approach to learning that already existed in many classrooms.

Involving students in the process can be a leap of faith – however, it is one of the most crucial steps in making this a personal approach.

In the early development of the revision cycle we needed to determine which revision techniques were effective in enabling students to retain more information over time, and therefore to achieve more highly than comparable students in assessments. This always involved student voice, and frequently involved one-to-one feedback sessions with individual pupils.

Crucially, it was more about listening to the pupils than it was about directing them. We had to allow them to explain the techniques they had used and, importantly, how they had used them.

With a small case study of classes, we monitored sections of assessments alongside the revision techniques each class used, drawing out the techniques that students found effective and promoting them through the expertise of staff, while eliminating those that consistently produced negative outcomes.
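
As an illustration of this kind of monitoring, here is a minimal sketch, assuming hypothetical records of each student’s chosen technique and the change in their mark on the monitored assessment section (the technique names and figures are invented for illustration, not data from the study):

```python
# Illustrative sketch only - not the school's actual tracking system.
# Average the mark change per revision technique to see which techniques
# to promote through staff expertise and which to eliminate.

from collections import defaultdict
from statistics import mean

# Hypothetical (technique, change in section mark) records per student.
records = [
    ("self-quizzing", +7), ("self-quizzing", +5), ("flashcards", +4),
    ("flashcards", +6), ("re-reading", -1), ("re-reading", 0),
]

by_technique = defaultdict(list)
for technique, delta in records:
    by_technique[technique].append(delta)

for technique, deltas in sorted(by_technique.items()):
    print(f"{technique}: mean change {mean(deltas):+.1f} "
          f"across {len(deltas)} students")
```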

It is important that staff understand which data to monitor and how to assess the success of techniques – and this will be unique to the initiative you run.

As a leader in this process, I found that using the research on metacognitive approaches was also key to ensuring staff buy-in and the project’s success. My advice is to become an expert – absorb as much as you can of the research around the area and then figure out how it applies to your context.

No research can be transplanted directly; context must be taken into account – the context of the school, the community it serves, and the students and staff with whom you are going to run the initiative.

Use the advice available from the Education Endowment Foundation’s Teaching and Learning Toolkit (dig into the original research pieces, not just the meta-analyses).

Moderation

Within our team, we worked together to understand what we expected from our students. One of the key components of the trial of the revision cycle was moderation of the work that students produced.

We evaluated the effort that students put into their chosen revision technique and advised each student on how effective that technique was for them. We gave students choice, guided them towards techniques that were effective for the individual, and monitored the improvement in internal assessments.

We did encounter some hiccups along the way. Data input from staff into the tracking systems, and appropriate interleaving (rather than massing) of revision, were both areas where the project leader needed to intervene.

Both required careful conversations, and both issues seemed to centre on staff misunderstanding of research processes and the evidence base behind them.

First, talking with staff about what a robust piece of research requires helped them to see the need for reliable data to demonstrate impact, whether positive or negative. Once this understanding was secure, and staff also understood the possible impact on their future practice and on pupil outcomes, it became more personal and important for them to enter the data accurately and in a timely manner.

Second, conversations with staff on the theory of interleaving over massing approaches to learning and revision clarified and reinforced the reasoning behind the project.
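
To make the interleaving-versus-massing distinction concrete, the sketch below contrasts the two scheduling patterns (the topic names and the round-robin rotation are illustrative assumptions, not the cycle’s actual scheduling):

```python
# Illustrative contrast between massed and interleaved revision schedules.
from itertools import cycle, islice

topics = ["atomic structure", "bonding", "moles"]
revisits = 3  # number of revision sessions per topic

# Massed: complete every session on one topic before moving on.
massed = [t for t in topics for _ in range(revisits)]

# Interleaved: rotate through the topics so each revisit is spaced out.
interleaved = list(islice(cycle(topics), len(topics) * revisits))

print("Massed:     ", massed)
print("Interleaved:", interleaved)
```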

Impact?

When I last wrote on this subject I stated that “the jury is still out” when it comes to the impact of this revision cycle. Sharing the outcome of a trial is hugely important – whether the impact on outcomes has been positive or negative.

In the case of Wilson’s Revision Cycle, there was a significant positive outcome with the 2017 cohort. We used a matching approach (Coe et al, 2013) to evaluate impact and found that the average difference in UMS (uniform mark scheme) was +37.71 per student for those meeting or exceeding teacher expectations on the cycle, and -0.96 for those performing under teacher expectation on the task.
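
Matching, in one simple form, pairs each pupil in the intervention group with a comparison pupil of similar prior attainment and averages the outcome differences across the pairs. The sketch below illustrates that idea with invented figures; it is not the exact procedure from the EEF’s DIY Evaluation Guide:

```python
# A minimal sketch of matched comparison, with hypothetical data.
# Records are (prior_attainment, outcome_ums) per pupil.
intervention = [(52, 71), (61, 83), (47, 66)]
comparison = [(50, 60), (60, 70), (45, 58), (55, 65)]

pool = list(comparison)
diffs = []
for prior, outcome in intervention:
    # Greedy nearest-neighbour match on prior attainment, without replacement.
    match = min(pool, key=lambda p: abs(p[0] - prior))
    pool.remove(match)
    diffs.append(outcome - match[1])

print(f"Mean UMS difference across {len(diffs)} matched pairs: "
      f"{sum(diffs) / len(diffs):+.2f}")
```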

There was also a significant increase in the overall average UMS, of +13.9.

The effect size for the cohort was 0.264, and for those meeting or exceeding expectations for effort it was 0.932 – almost a complete year of schooling (for a deeper understanding of effect size, I recommend Coe’s 2002 paper, It’s the Effect Size, Stupid).
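
For readers new to the measure, Coe’s paper defines effect size as the standardised difference between two group means – in outline (the group labels below are generic, not the study’s own):

```latex
\[
\text{effect size} =
  \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{comparison}}}
       {\text{pooled standard deviation}}
\]
```

On this scale, an effect size close to 1.0 is commonly read as roughly a year’s progress, which is the sense in which 0.932 approaches a complete year of schooling.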

Next steps

Through the approaches we have taken, we are now looking at other research projects and the analysis of those projects, with a further three within the department and two beyond it currently ongoing.

If this approach to new initiatives is applied correctly and staff are involved fully, they will begin to take ownership and leadership of projects themselves, and evidence will accrue showing the steps we are taking to improve student outcomes.

As we do this and share more within the educational community, we can improve outcomes for all students: sharing what works and the evidence and theory behind it, adapting it, evaluating the localised, context-driven impact, and then doing more of what works, with a deeper understanding of why strategies are successful and why they are not.

  • Iain Wilson is lead practitioner for metacognition, self-regulation and resilience at Hindley High School in Wigan, a former member of the European Chemistry Thematic Network and an Ambition School Leadership Teaching Leaders alumnus. As well as winning the Chartered College of Teaching’s research poster competition, the work on Wilson’s Revision Cycle was acknowledged by the EEF for the quality of the research question, the depth of engagement with research and the approach to evaluation. Follow him on Twitter @Linainiwos

References & reading

  • Middle leadership: Getting your team on board, Iain Wilson, SecEd, November 2016: http://bit.ly/2pwrz7e
  • The DIY Evaluation Guide, Coe, Kline, Neville & Coleman, EEF, 2013.
  • It’s the Effect Size, Stupid: What effect size is and why it is important, Robert Coe, 2002 (Annual Conference of the British Educational Research Association): http://bit.ly/2QUhgqa
  • Wilson’s Revision Cycle, Iain Wilson, Chartered College of Teaching Annual Conference, 2018: http://bit.ly/2I8w7Jv

