Has your CPD had an impact?


Ensuring your CPD strategies are effective and make a difference is vital. Sarah Coskeran offers some practical advice for evaluating the impact of three common types of CPD.

Evaluating the impact of CPD activities not only allows you to gauge whether your professional learning is having a positive effect on student outcomes in your classroom – it can also help to identify the particular elements that can be sustained and shared across the school to encourage widespread impact in the longer term. There are some key principles to bear in mind:

  • Evaluating the impact of CPD on teacher practice is important, but focus primarily on student outcomes and the effect your practice has on these.

  • Evaluation starts before any activity takes place and continues as long as you maintain that particular change to your practice – helping you understand the impact of your changes and focus your efforts on sustaining these.

  • Using a variety of measures will offer a clear picture of what impact has been achieved and why: quantitative data could suggest changes in students’ attainment, for example, while qualitative data might allow you to understand which changes to practice facilitated these improvements.

We have chosen three example CPD activities and suggested some simple tools that you could use to consider the impact of your professional learning. 

Direct impact can be difficult to prove, but these tools will give an idea of the difference your CPD is making in the classroom. No single approach is definitive: these examples can be swapped around and adapted as best suits your own CPD. 

1. Evaluating the impact of external courses

One-day conferences or courses can, on their own, be limited in terms of transformational impact on your practice and student outcomes. However, the content provided can offer the starting point for sustained change and impact. Using some appropriate evaluation can help you target and maintain this. 

Student survey

Identify the group most likely to benefit from the content of the course or event. Put together a short survey for these students that relates to the particular areas the course will cover, to identify their learning needs in the relevant domains. 

Your survey can include open-ended or closed questions – or both. Before a course on feedback, for example, an open-ended question such as “What type of feedback do you find most useful for improving your learning and why?” might reveal nuances in different students’ approaches. 

Alternatively, for a classroom management skills course, a closed question will gather useful quantifiable data about the number of times students feel their learning has been disrupted in the past three lessons. Study the findings in the lead-up to the course and keep them at the forefront of your mind on the day itself.

After the course, having taken time to implement your learning from the day, revisit the survey you gave to students.

If your questions matched the course content well and are relevant to the changes you have subsequently made, ask your students to complete the survey again, tweaking the questions where necessary.

Compare the two data sets to identify any changes that have taken place, and consider the extent to which this matches your expectations.

If your survey does not reveal which changes to your practice have had the biggest impact on students, you might consider supplementing the survey with some short pupil interviews. As you sustain the changes to your practice over time, repeat the survey/interview process at intervals, to ensure your direct, positive impact is well-targeted at the desired areas.

Past papers and pupil recordings

If your course relates to a particular exam or qualification, use a past paper to take a baseline measure of learning needs. Ask students to complete the past paper in part or in whole, or – for greater insight – select some case study students to talk through how they might go about answering the questions. 

Use this to target your professional learning: if you are attending an A level English exam board course, for example, the question-level data from these past papers will help you identify where to focus your improvement. Repeat this process after the course to track students’ progress in not only attainment, but also attitudes and approaches to questions.

Attainment data

Supplement your qualitative findings with attainment data – either from standardised or in-class tests – to complete the picture.

2. Attending twilight or INSET sessions

The professional learning you engage in during an in-school training session is particularly easy to relate to your classroom context. Make the most of this opportunity by considering how you will evaluate the impact of the learning from the day.

Case study pupils

As early as possible, find out the topic of the upcoming INSET and discuss this with your line manager. Once you understand how the chosen area relates both to your classroom and whole-school improvement, select three case study pupils who are most likely to benefit from the session content. Take time to reflect: this may include students who do not instantly spring to mind.

Consider their attainment, behaviour and your professional understanding of their learning to build up a “profile” for each case study pupil in relation to the chosen area. Meet with colleagues who also teach these students to gather their reflections and input.

Consider approaching the pupils for a short, informal interview, which you can record and use to gently probe their attitudes to learning. These profiles will form your “baseline measure” ahead of the INSET session. 

During the session, keep these students firmly in mind and relate your learning back to the elements uncovered by your “profile-building”. 

After the training, put together an action plan of the changes you are going to introduce. As you do so, predict the impact of these changes on your three case study pupils. Make a note of these predictions for future reference.

At intervals, repeat the profile-building process for each case study pupil and compare against the initial profiles and the predictions you made after the training session. What impact has your professional learning had? Does this match the predictions you made? If not, why not? If yes, how can you successfully sustain this impact? What wider implications can you anticipate, extrapolating from your findings?

Use these questions to keep measuring, maintaining and maximising impact.

3. Coaching

Coaching can be an extremely powerful tool for professional development. Evaluating the impact of a coaching relationship will help staff understand how they can continue supporting one another most effectively.

Peer observation 

Observation is used in schools across the country to “evaluate” staff performance. However, using observations to grade practice against Ofsted levels is of limited value in terms of meaningfully developing and supporting staff. Such an approach can be particularly damaging to a coaching relationship, which should be based on trust, communication and support rather than judgement.

Yet when used well – and free from Ofsted gradings – peer observation can be an extremely powerful tool of reflection and evaluation.

Use peer observation from the very beginning of a coaching relationship, but not to comment on practice. A coach can provide a useful “second pair of eyes” to identify student learning needs within the coachee’s classroom.

These observations can then inform a discussion around which elements of practice could be developed to address these needs. As the coaching relationship develops, maintain this “student focus” in peer observations and triangulate it with data from other sources to track the impact of changes to practice on students.

Video technology can be a really useful tool in this process. Focus the camera on particular students and re-watch the footage together to track students’ reactions to new changes in practice. Some systems allow for real-time observation and in-ear coaching – use this to share instant feedback on which elements of classroom practice appear to best support the observed students. 

  • Sarah Coskeran is GoodCPDGuide programme manager at the Teacher Development Trust, an independent charity for teachers’ professional development.

Further information
Visit www.tdtrust.org to find out more about the Teacher Development Trust’s work, including the National Teacher Enquiry Network (see http://tdtrust.org/nten/) and the GoodCPDGuide (see http://goodcpdguide.com/).

