Assessing computing: Practical approaches

Written by: Terry Freedman | Published:

Computing is slowly becoming an established part of the curriculum, but many schools are still finding their feet – especially when it comes to assessment. Expert Terry Freedman discusses some practical approaches to effective assessment in computing

How do you assess computing? This is a question that has exercised a great many people for years – decades, in fact.

The programme of study for computing states the following in its introduction: “The core of computing is computer science, in which pupils are taught the principles of information and computation, how digital systems work and how to put this knowledge to use through programming.

“Building on this knowledge and understanding, pupils are equipped to use information technology to create programs, systems and a range of content.”

In making this statement, the Department for Education (DfE) has gone old-school. What that really boils down to is a declaration that there are particular things that pupils must know, understand and be able to do.

This, then, is where we should start, because the whole purpose of assessment is to find out what the student knows, understands and can do. In practical terms we can think in terms of three broad areas:

  • Lower order thinking skills (or LOTS).
  • Higher order thinking skills (or HOTS).
  • Skills.

This roughly corresponds to “know, understand and can do” – though it is not entirely satisfactory. LOTS and HOTS are implicitly based on Bloom’s Taxonomy as usually interpreted – but the original “Taxonomy of Educational Objectives” did not present the six learning domains (knowledge, comprehension, application, analysis, synthesis and evaluation) as a hierarchy as such. Knowledge was even subdivided into a number of different types of knowledge.

As an example, knowledge, or remembering, as the later revised version of Bloom’s has it, is regarded by many teachers as being the lowest rung of the ladder. However, a moment’s reflection will reveal that it is, in fact, a prerequisite for doing anything else.

Take the skill of debugging, for instance. If you think about it, you cannot effectively debug a program unless you have a mental model of how the program is supposed to behave.

In theory, of course, you could try various tweaks at random, and perhaps eventually hit upon one that works, but it is hardly efficient. And if you called technical support and they used that approach, it would hardly inspire your confidence in them.
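The “mental model” point can be made concrete in class. Here is a minimal sketch in Python (the function and the planted bug are invented for illustration): a pupil who knows what the function is supposed to return will spot the fault quickly, while a pupil without that model can only tweak at random.

```python
# A deliberately buggy function for pupils to debug.
# The bug: range(len(numbers) - 1) skips the last item.
def average(numbers):
    total = 0
    for i in range(len(numbers) - 1):   # should be range(len(numbers))
        total += numbers[i]
    return total / len(numbers)

# A pupil with a mental model knows what to expect:
print(average([2, 4, 6]))   # expected 4.0, but this prints 2.0
```

Asking “what should this print, and what does it actually print?” is itself a quick assessment of both knowledge and understanding.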

Getting back to LOTS and HOTS, they are both interdependent and independent. You need some so-called LOTS in order to practise so-called HOTS, such as in the example of debugging just given.

They are also independent: you can have knowledge without skills, or skills without understanding what you are doing.

For these reasons, it is important in practice to assess not only students’ knowledge, understanding and skills separately, but also together, such as in the form of scenarios.

So much for theory. How can you assess these different dimensions in your classroom?

Assessing LOTS

Ways of assessing LOTS (knowledge, basic understanding and simple problem-solving) include the following:

  • Oral questioning in class.
  • Tests using a student response system or an app that provides similar functionality. A well-regarded one is Socrative (see further information for all links).
  • Matching exercises (see later).
  • Word puzzles and crosswords (see later).
  • If you use Google Drive, you can set up self-marking tests. See the list of resources at the end of this article for a link to some instructions.
  • If you prefer Excel, then see my link to instructions on how to create a self-marking spreadsheet test.
  • Another approach is to get the students to tell you what they know by showing you how they have solved a problem, or other work they have done. Some useful aids here are Explain Everything and Showbie.
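If you are comfortable with a little Python, the self-marking idea behind those spreadsheet tests can also be sketched in a few lines (the questions and the function name here are invented examples, not part of any of the tools mentioned above):

```python
# A minimal self-marking test: each entry pairs a question with its answer.
questions = [
    ("Which keyword starts a loop over a list in Python?", "for"),
    ("Which symbol begins a comment in Python?", "#"),
]

def mark(responses):
    """Return the number of correct responses (case-insensitive)."""
    score = 0
    for (question, answer), response in zip(questions, responses):
        if response.strip().lower() == answer.lower():
            score += 1
    return score

print(mark(["for", "//"]))   # prints 1: the second response is wrong
```

The marking rule (here a case-insensitive exact match) is the part worth discussing with students, since it decides which answers count as “right”.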

Among these approaches, matching exercises can be quite useful. That is where you have a list of items in a column on the left and a list of items on the right, and the student has to create pairs.

You may have seen this sort of thing done in primary schools, but don’t let that put you off, because you can create quite sophisticated lists. For example, you could have an item like “Give user several choices” on the left, and “IF ELIF ELSE” on the right. A good freeware program is Hot Potatoes, which contains six different kinds of puzzle.
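That particular pairing refers to Python’s selection construct. A short illustration of what “give user several choices” looks like in code (the menu wording and function name are invented):

```python
# "Give user several choices" on the left pairs with if/elif/else on the right.
def describe(choice):
    if choice == "1":
        return "You chose the quiz"
    elif choice == "2":
        return "You chose the puzzle"
    else:
        return "You chose the game"

# In class, choice would typically come from input("Choose 1, 2 or 3: ")
print(describe("2"))   # prints "You chose the puzzle"
```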

I would also advocate the use of word puzzles and crosswords. Again, before you protest that this is all very low level, it really doesn’t have to be. One of my favourite approaches is to create a word puzzle, but using crossword-style clues rather than the words themselves. While you could save yourself a lot of bother simply by creating a test, the word puzzle approach tends to be more engaging for some students.

A good option is Discovery’s Puzzle Maker, though if your school doesn’t subscribe to Discovery Education you cannot save the puzzles you create. Ten different kinds of puzzle are available on the site.

Another option is the Word Puzzle Generator. This doesn’t allow you to save the puzzle either, and the only puzzle available is the word search, but it offers more sophisticated options.

Assessing HOTS

Ways of assessing HOTS (problem-solving, especially where the problem is open-ended) include the following – bearing in mind that in this area it is more time-consuming both to set up the assessment and to mark it, although you could employ peer-assessment to some extent.

  • Have students work on a problem and create an artefact as a basis for talking you through their solution. For example, have them create a wiki, a Slideshare or a video.
  • A challenging form of video creation is to use Vine, because you have only six seconds to tell your story, and then the video starts again. How about asking students to demonstrate the concept of an algorithm by creating a Vine video? There are 22 ideas for using Vine available, listed under “Vine ideas” in the resources section.
  • Another approach is the WebQuest. The format is looking a bit dated now, but it is in fact a very good tool. It involves creating an interactive document for resources, and a challenging task for students. The students’ job is to look at the problem, obtain useful information from web-based sources, and come up with a solution. This could be in the form of a written report, a video or, in the case of computing students, a computer program.

You will need a way of assessing how students are doing if you go down the open-ended problem route, and a reasonably good approach here is to use rubrics.

A rubric is a grid containing a series of ready-made judgements, and a corresponding grade. In these post-Levels days you may wish to dispense with the grades – and you definitely should dispense with them if you intend to use a rubric as a tool in peer-assessment (because students don’t tend to give their friends a low grade, and in any case you want them to focus on the quality of the work, not a number).

For example, a judgement for a website creation exercise may include: “Colours seem to be random, serving no particular purpose. Links are not marked as such, and they do not include a title tag.”

An excellent place to start is the (American) Rubistar rubric creation website. You will need to register, but it is worth doing so. There are some templates there which you can customise, but as the American curriculum is so different from our own, and computing isn’t included in the list of subjects, it is probably easier to just start from scratch.

Finally, don’t abandon plain old questioning. Questioning in class can be very challenging if you utilise three tools in particular. One is “wait time”, i.e. waiting until someone answers your question rather than providing the answer yourself after a few seconds. Another is what I call “persistent questioning”. Rather than accepting the first answer someone gives, ask “why?” or “please explain that” – and keep on doing so.

Yet another is to ask questions that cannot be easily answered, if they can be answered at all. The Questioning Toolkit is excellent for providing some ideas here.

Assessing Skills

This involves setting a task and then finding a way of recording what the student has done. Some of the apps already mentioned can be employed, such as Explain Everything.

Another approach is to have the students video what they are doing as they are doing it. However, contrary to usual practice, it is better not to ask them to include a running commentary, because it is challenging to carry out a task and give a meaningful explanation at the same time. A better approach is to use stimulated recall. This involves looking at the work, or the video of the work, with the student, and asking him or her to explain why they did X or why they did not do Y.

Recording the evidence

Useful tools for recording achievement are:

  • The SIMS assessment app if your school subscribes to SIMS. This can be adapted for your own assessment approach.
  • Groupcall’s Emerge add-on, if your school uses Groupcall and SIMS. In the context of assessment it provides a handy front end to the SIMS assessment module.
  • The Progression Pathways app is useful for those who have adopted the Progression Pathways approach, but it is subscription-based.

Referenced resources

