Improving Australia's standardised assessments

Are the proposed changes to NAPLAN a step in the right direction?


The National Assessment Programme - Literacy and Numeracy (NAPLAN) is a set of standardised assessments sat by Australian students in Years 3, 5, 7 and 9. The tests were cancelled in 2020 due to the COVID pandemic but have been reinstated this year, and this has prompted a moment of reflection. Adam Carey in The Age and Rebecca Urban in The Australian are now reporting changes to the programme that have been agreed by state and federal education ministers.

Many interest groups oppose NAPLAN, not least the Australian Education Union, whose contribution to the report in The Age is both worrying and unintentionally amusing.

However, if we set aside those who are implacably and ideologically opposed to standardised testing and take a more rational view, are the proposed changes positive or negative?

Certainly, moving the tests to the start of the school year and reducing the time between sitting them and reporting the results from three months to two weeks are both positive changes. It’s not particularly helpful for schools to know what their students were capable of three months ago, even if the comparisons to similar schools and the progress measures remain valid. The start of the year is neater, but the end of the year would be even better because the results would give more direct feedback to the teacher who had taught the students in the run-up to the assessment.

What of the plan to allow schools to opt in to assessments of digital literacy, scientific literacy and critical and creative thinking? Well, this is probably a waste of time and effort. Critical and creative thinking are not separate generic skills but are made up of lots of subject-specific bits of knowledge and skill. As such, these assessments will likely track performance in the literacy and numeracy strands of NAPLAN - as has happened with PISA’s attempts to assess similar constructs - but maybe it’s worth finding out. ‘Scientific literacy’ is what you assess when you don’t believe in the value of scientific knowledge. It’s what the science strand of PISA tests and it tends to involve wordy questions built around graphs and diagrams. It is therefore no surprise that it tracks literacy and numeracy pretty well too. A genuine assessment of scientific knowledge and understanding would be of far more use.

I have to admit that I hadn’t realised that ‘digital literacy’ was still in fashion. I suspect it involves something like spotting fake news. If so, it is likely to track reading comprehension. If it is more a test of digital skills - using a word processor and so on - then it’s retro and kitsch.

What else?

There are two obvious and overdue improvements needed to NAPLAN. Firstly, the numeracy assessments need more non-calculator questions in order to encourage basic mathematical fluency. The number of such questions was drastically cut at Years 7 and 9 in 2017, and these cuts must be reversed.

Secondly, the reading and writing tests must remove their inbuilt privilege bonus. Instead of being set in random contexts - contexts that therefore benefit students who have travelled, visited museums, had long chats with their parents at dinner and so on - they should be set in the context of the previous year’s Australian Curriculum. Yes, the Australian Curriculum is woefully thin, but there is still enough in there from which to select a few reading and writing contexts. And using these contexts would give schools some actual agency over the results (see e.g. Louisiana’s approach).

This does not mean, as has been suggested to me, that NAPLAN would have to inform schools in advance of the precise contexts used and that this would then distort the curriculum. I don’t know why anyone would interpret this call in this way but, for some reason, they do. Schools just need to know that contexts will be sampled from the previous year’s curriculum in its entirety, which would have the added bonus of encouraging schools to teach this curriculum and not just focus on endless reading comprehension drills.

I have also come to the view that NAPLAN should include a short wellbeing survey, asking students about their experiences at school, including adverse classroom behaviour, bullying and so on. Ideally, this would be complemented by surveys of parents and teachers. However, I don’t think such survey results should be reported in the same way as NAPLAN results. Instead, the data could be used to address research questions: Is there a trade-off between academic performance and wellbeing, as some have suggested (I predict not)? Which schools are best at managing classroom behaviour and what do they have in common?

Given the dire state of classroom behaviour in Australia compared with other countries, and its likely impact on student wellbeing, answers to this last question are urgently needed.