Case study: Creating a post-levels solution

Written by: Martin Smith
Post-levels: Students at Darrick Wood School, which has developed its own post-levels assessment solution

Many schools struggled to know how to replace curriculum levels. Darrick Wood School was among those to design a new system from scratch. Martin Smith reflects on this process and effective assessment at key stage 3

From September 2014, the old system of attainment levels was removed in a bid to allow teachers greater flexibility in the way that they plan and assess pupils’ learning to meet the end of key stage expectations within the new national curriculum.

However, the new curriculum still requires an assessment system to check what pupils have learned and whether they are on track to meet expectations (and to report regularly to parents).

As such, the demise of key stage 3 attainment levels was greeted with considerable resistance from schools and created a real sense of panic among many teachers, who were effectively told to go away and develop their own assessment methods without guidance.

Schools were effectively left with three options on the table: stick with levels for as long as possible, prepare to purchase an “off the shelf” system that was being built by a third party, or devise an entirely new system from scratch.

At Darrick Wood School, driven by our forward-thinking local authority, we formed an initial working party with other schools to exchange ideas and come up with a solution – this set the wheels in motion for the creation of our own system.

Our system is a simple grid for each subject and a progressive set of attainment targets that present challenge at all levels of ability throughout key stage 3.

The grids are broken down into a template of nine steps across four, five or six different subject strands. This level of detail means pupils can make fine levels of progress and teachers can create incremental, personalised targets based on assessment.

So, based on my experience, this is the advice I would give to other schools that are still trying to create or perfect their own solution.

The first key question to consider is whether you wish to follow an in-house approach to building a personalised model or adopt materials and systems from a third-party provider. Adoption has many advantages: it is quick, available and, to some extent, proven.

If time is at a premium and a system is required almost immediately then there is really no substitute. The infrastructure will still need to be built, but with the right people and the guidance contained in this material it should not be too arduous a task.

However, there are clearly many advantages to a DIY model. When done correctly there is ownership and a determination for it to succeed which permeates a large cohort, eventually reaching most staff.

With such a distinct change in direction for lower school assessment there is an immediate addition to the workload – not least because of the depth and extent of the new style assessment, but also simply because it is new. With ownership, though, comes a willingness to make things happen, a readiness to take on that additional workload, and an enthusiasm in the end product.

The second key question is your choice of development partner in school. As I started very much with a blank canvas, considerable time and a very broad – if somewhat challenging – remit, I looked around very carefully for the right partner.

After some consideration, I approached a physics teacher who had some responsibility within the department for key stage 3.

Crucially, he was well-respected by the teaching body. This is where the “public relations” exercise starts; with the right person on-board you will be well on the way to success and a smooth implementation of your system.

Within days of presenting our first attempt at the system to the school’s steering group, I had further meetings with two members of staff who were part of the group and who represented geography and PE.

Very rapidly I saw the benefits of this fortuitous expansion to my development team, and as such I would recommend it as a model for consideration.

While not wishing to exclude anyone from the process, it is extremely useful to have a small team in the initial stages, particularly one with representatives from a core subject, an academic foundation subject and a practical foundation subject.

The third key issue is what you want from your steering group. All major projects need a group like this to maintain focus and temper input in the light of the big picture. Ultimately, the steering group has accountability and it goes some way to ensuring the overall success of a project, particularly when its members are representatives from each of the subjects delivered in key stage 3.

Their positive input in departmental meetings was essential to strengthen the link between each area and the whole-school plan.

Other things to consider in a group like this are the balance between experience and youthful vibrancy, as well as the need for a level of understanding of the technical aspects of the process. The inclusion of staff with the skills to translate the ideas into reality is a major asset.

I went with a large group – my youngest member was 24 and oldest 58, with everything from three to 35 years’ teaching experience – but worked closely with my three development partners in between meetings to have practical examples with which to guide the rest of the group through successive stages in the process.

The fourth key question is around target-setting. It seemed very obvious to me from the outset that the task in front of us was vast. To design and implement a totally new system of monitoring attainment and progress in key stage 3 – one that was workable, accessible to all stakeholders, feasible and accurate – was, frankly, daunting. I knew I had to roll out my plans stage by stage.

Knowing just how much detail to give, how much guidance to provide and how much to challenge was a tough task given the variety of the teams. This was always going to be a 12-month project initially, but the "end-game" of the project and the processes by which we would secure its validation were somewhat hazy even to me.

Understanding the need for workable and achievable stages in meeting your targets should, you would assume, be blatantly obvious to someone who has designed a project like ours where this is the basis of the system. In reality, the breakdown of the somewhat unknown is considerably harder than the division of a programme of study, and I had to adapt and be flexible in my plans as we went along.

The fifth key question is around balance, variation and consistency. Any assessment system ever implemented has suffered from some degree of inconsistency between subjects. It is an incredibly difficult – near impossible – task to maintain perfect parity between subjects with quite distinct content and, in many cases, totally different styles of measuring attainment.

With this in mind, it has been a massive task to try to maintain parity and show consistency between all the subjects we offer in key stage 3, some of which were not even assessed individually under the old national curriculum levels system and so were not as used to the more formal requirements of reported teacher assessment. While it is almost impossible to eradicate completely, variation and inconsistency should be continuously challenged until they are reduced to a level which is acceptable to your school at its current stage of implementation.

If you are planning to introduce a system to all years in key stage 3 at the same time, the task is made exponentially harder by the need to design subject tests for assessing both a year 8 and a year 9 baseline, the need to include aptitude tests which adapt to the age of the pupil at time of sitting, and the increasing timespan from the point of origin for the key stage 2 data. It was for these reasons that we made the decision to have a three-year rollout period.

  • Martin Smith is assistant headteacher at Darrick Wood School in Orpington, Kent.

