Is PISA irrelevant because students don’t try?
The relative importance of effort and knowledge
A recent story in The Educator Australia by Brett Henebery caught my eye. It referred to a new analysis of data from the Programme for International Student Assessment (PISA). PISA assesses the reading, maths and science performance of 15-year-old students from a range of countries every three years. The headline figure was that 73% of Australian students "indicated that they would have invested more effort if the PISA test counted towards their marks."
To Trevor Cobbold of the pressure group Save Our Schools, this was a 'remarkable revelation'.
“These results suggest that PISA is not the accurate, reliable, and valid measure of educational quality it claims,” Cobbold concluded.
Cobbold went on to link this finding to NAPLAN, the series of standardised English and maths assessments taken by Australian students in Years 3, 5, 7 and 9. These assessments can be high-stakes for schools but are not qualifications that students take with them when they leave and so are intrinsically low-stakes for these students unless the adults around them heap on the pressure. So, you can see where he is going with this. Cobbold and Save Our Schools have a long track record of opposing standardised testing and he thinks this figure supports their case.
It was left to Dr Sue Thomson of ACER, the organisation that compiled the report the figures are drawn from, to point out that teachers would hardly find this a 'remarkable revelation' and to inject some basic science. Thomson noted that if engagement with the test was very low, we would expect large sections to be left blank by students. However, "The data shows that there were very few students who did not respond to all questions."
As a teacher, I use low-stakes quizzing all of the time. For instance, students in my maths classes spend a significant proportion of the lesson completing problems on their mini-whiteboards which they hold up for me to review. The idea that I cannot gain any valid or reliable information from formative assessment of this kind because the students are not investing as much effort as they would if it counted towards their marks is bizarre.
Significantly, you cannot have it both ways. You cannot argue that standardised tests are evil because they heap pressure on students and then go on to claim their results are invalid because students don't put in enough effort. Which is it?
If you read the actual report from ACER, it contains key information that is missing from the news article: the statistics for other countries. What can we possibly make of the 73% figure unless we know how it compares with other countries?
In fact, the figure for Australia is virtually the same as the 74% of students from Singapore who would have invested more effort in PISA if it counted towards their grades. Singapore is different from Australia in many ways that could affect its performance in PISA, but not this one. Whatever it is that is causing Singaporean students to significantly outperform Australian students, it does not appear to be effort on the PISA test.
Interestingly, Finland's figure is 70%, i.e. between the OECD average of 68% and Australia's figure of 73%. If the difference between Australia and Finland is statistically significant, which I doubt, it suggests Finnish students put in more effort than Australian ones. I only note this because I have heard it argued that Finland's precipitous decline in performance on PISA since 2006 is due to a concurrent drop in effort on the tests, a suggestion I have attempted to address before. Although this new data does not show how Finnish students' effort has changed over time, the fact that in 2018 it was consistent with other major economies gives cause to doubt this explanation.
Effort cannot make up for a lack of knowledge. Motivation cannot make you know things you otherwise would not have known. Conversely, when you know the right answer to a question, why would you not offer it, even if you are not particularly fussed about the results of the assessment? I suspect some students give silly and perverse answers for the fun of it, but it is the average across large numbers of students that matters in studies of this kind.
If you are concerned about the validity of PISA, there are better areas to target. PISA seems to have been designed under the misapprehension that knowledge is not really that important and we can conceive of reading, maths and science as a set of ‘literacies’. This makes the questions wordy and confected.
Nevertheless, regardless of the design of PISA questions, the results are consistent with findings from across a range of different forms of education research. Despite a lot of noise about poorly defined ‘memorisation strategies’, the clearest signal in the data is that fashionable inquiry learning approaches are associated with worse performance.
Effort matters, but knowledge, and therefore the most effective methods for imparting that knowledge, appear to matter more.