Best Practice

PISA: Beyond the league tables and the headlines

The main findings and rankings of the international PISA research make newspaper headlines every three years, but there is so much more to learn if we take the time to look at the results in more depth. Steen Videbaek offers us his analysis

Last month saw the release of the 2018 results from the Programme for International Student Assessment. PISA is a large-scale international assessment involving 600,000 pupils from 79 jurisdictions across the OECD and beyond, including 13,668 pupils from 459 schools across the UK (OECD, 2019).

Understandably, the initial PISA media attention focused on the headline numbers – whether they went up or down, where they ranked and how they compared to the OECD average.

But beyond these headlines lies a rich information source waiting to be explored.

Reading between the lines – achievement gaps and changes over time

Take reading, the major focus of the 2018 cycle. A cursory look at the average scores shows that England, Scotland and Northern Ireland all scored significantly higher than the OECD average, with Wales performing similarly to the OECD average.

Viewing the results over time shows that only Scotland has changed significantly since 2015 (though only to return to the level achieved in 2012). All other reading scores have (statistically) flatlined, even going back as far as 2006.

Behind the average reading scores is important information about how different students are performing. It is interesting to look at three areas – achievement, disadvantage and gender.

The achievement gap, which compares the achievement of the top and bottom 10 per cent, was largest in England and smallest in Scotland.

The disadvantage gap, which compares the achievement of the least and most (socio-economically) disadvantaged pupils, was smaller in Scotland, Wales and Northern Ireland than the OECD average. England was similar to the OECD average.

There was also a gender gap. In all countries in the UK, girls significantly outperformed boys in reading, a result mirrored across the OECD.

The reading proficiency levels data provides a further breakdown of each country’s performance. For example, England had 12 per cent of pupils working at the higher proficiency levels (Levels 5 and 6) and 17 per cent of pupils working at the lower proficiency levels (below Level 2). These proportions have not changed significantly since 2015. The results are also broken down by cognitive process. Overall, pupils in England performed better in “evaluating and reflecting” and “locating information” than in “understanding”.

Negative attitudes to reading

Often surprising, and sometimes concerning, contextual information is also presented. The pupils’ responses to questions about their reading activities and their attitudes to reading are sure to pique the interest of anyone interested in reading and literacy.

For example, when asked about reading engagement, more than half of pupils in England agreed or strongly agreed with the statement “I read only if I have to”. Similarly, almost a third said: “For me reading is a waste of time.”

The responses also provide a glimpse into the classroom from a student perspective. For example, a third of pupils in England reported that in most or every English lesson there is “noise and disorder”, with 30 per cent saying that “students don’t listen to what the teacher says”.

There are also fascinating insights into pupils’ reading practices. Pupils were asked if they read certain types of material at least several times a month. Interestingly, reading comic books is less popular in England compared with the OECD average (eight per cent versus 15 per cent). Also, the popularity of magazines and newspapers has dropped in England, a trend seen across the OECD.

In 2009, newspapers and magazines were each read by 60 per cent of pupils in England. By 2018, these figures had dropped to 18 per cent and 10 per cent respectively.

External follow-up research on these types of questions will often examine the relationship between high PISA reading scores and the types of text pupils read. Of course, any conclusions from this kind of further analysis need to be treated with great care because of the difficulty of distinguishing correlation from causation. For example, do certain types of text make pupils better readers, or do pupils who are better at reading choose to read different types of text?

Another interesting, albeit slightly quirky, reading question presents a scenario where the pupil has received an unsolicited email saying they have won a smartphone. It then asks pupils about the appropriateness of different strategies. The results are reassuring and slightly worrying at the same time.

The good news is that pupils in England were more likely than their OECD counterparts to respond appropriately. Conversely, the poor strategies “click on the link to fill out the form as soon as possible” and “answer the email and ask for more information about the smartphone” were still rated somewhat highly, at around 2 to 2.5 on a 1-to-6 scale (1 being “not appropriate” and 6 being “very appropriate”).

These types of results are ripe for further analysis.

Concerns about wellbeing

PISA is not just about the core domains of reading, mathematics and science. It also includes one-off innovative domains (for example, PISA 2018 explored global competence), as well as student wellbeing and contextual questionnaires that provide interesting insights into the lives of students.

For example, when asked about their experiences with bullying, seven per cent of pupils in England reported that they had “been threatened by other students” (a few times a month to once a week). One in 10 reported that “other students spread nasty rumours” about them.

Pupils were also asked about their satisfaction with their life, to what extent their life has meaning, and how often they felt a range of positive and negative feelings. In England, 93 per cent of pupils felt happy sometimes or always. However, just 56 per cent agreed or strongly agreed that their life had clear meaning or purpose, and a higher proportion reported sometimes or always feeling worried, miserable or sad compared to the OECD average.

What's next?

Over the coming months, researchers will unpick the wealth of information that PISA provides. Equality of outcomes will be examined and high (and low) performing school systems analysed. All this will hopefully drive a rich, evidence-based debate that goes beyond the headline figures.

And after that: PISA 2021!

PISA happens every three years, so another one is just around the corner. There are a number of important changes in store for PISA 2021.

Mathematics is the major focus and PISA 2021 will use a new framework that focuses on mathematical literacy – using maths to solve problems in a variety of real-world contexts.

PISA 2021 will also introduce creative thinking as an innovative domain, which will attempt to measure students’ ability to “think outside the box”.

  • Steen Videbaek is senior economist at the National Foundation for Educational Research.

Further information & research

  • PISA is the OECD’s Programme for International Student Assessment. PISA measures 15-year-olds’ ability to use their reading, mathematics and science knowledge and skills to meet real-life challenges. For the 2018 results, which were published in December, visit www.oecd.org/pisa/
  • NFER is running PISA 2021 in Scotland. So, if you are a secondary school in Scotland and would like the opportunity for your pupils to take part in this global study, visit www.nfer.ac.uk/international/international-comparisons/pisa/for-schools/

NFER Research Insights

This article was published as part of SecEd’s NFER Research Insights series. A free PDF of the latest Research Insights best practice and advisory articles can be downloaded from the Knowledge Bank section of the SecEd website: http://www.sec-ed.co.uk/knowledge-bank/