# Peter Sullivan and the draft mathematics curriculum

### Why I agree with Sullivan (up to a point)

I was recently interviewed by *The Guardian* for a piece on stalled revisions to the Australian Curriculum: Mathematics. The story so far: when the draft revisions were published last year, they were deeply concerning, to the extent that I was involved in writing an open letter, signed by hundreds of teachers, mathematicians, researchers and parents, in which we expressed these concerns. Key objectives, such as learning times tables or solving linear equations, were going to be delayed, and the whole framing of the curriculum promoted the teaching method of students learning maths *through* solving problems. Although I understand that the content delays have been reversed and that the problem-solving emphasis has been toned down, I also understand that the revised draft is still stuffed with elaborations that describe inquiry-style activities.

If I were drafting the curriculum, I would take out all of these elaborations and focus on the content we want students to know. This would include statements making explicit the need for students to master times tables, to be able to perform basic operations without a calculator, and to become fluent with standard algorithms for addition, subtraction, multiplication and division, as in the Singaporean curriculum, a curriculum that ACARA, the body in charge, researched when preparing its draft. They somehow missed these bits.

In the story in *The Guardian*, the journalist, Donna Lu, also talks to Peter Sullivan, emeritus professor of STEM* education at Monash University. In his comments, Sullivan makes a point that I strongly endorse. He suggests:

*“The mainstream view among mathematics educators is that student-centred structured inquiry helps to develop in students their agency, in terms of thinking for themselves rather than following recipes.”*

I completely agree with Sullivan that this is the ‘mainstream view’, at least among education academics. This is why graduates of Initial Teacher Education in Australia leave their courses with the impression that explicit teaching is bad and inquiry learning is good.

Sullivan goes on to explain that students should choose their own ways of adding 132 and 99 including, presumably, all of the inefficient and confusing ones. This is a recipe for disaster. We know that this kind of discovery learning is less effective than explicit teaching and it is also likely to lead to a divergence in outcomes between the privileged and the vulnerable. While the most advanced students second-guess and recapitulate versions of well-known standard methods, less advanced students will be left to struggle and fail.
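To make the 132 + 99 example concrete, here is my own sketch of two of the methods at stake (Sullivan does not list these himself): a ‘compensation’ strategy and the standard written algorithm.

```latex
% Compensation strategy: round 99 up to 100, then adjust
132 + 99 = 132 + 100 - 1 = 232 - 1 = 231
% Standard column algorithm: units 2 + 9 = 11 (write 1, carry 1),
% tens 3 + 9 + 1 = 13 (write 3, carry 1), hundreds 1 + 1 = 2,
% giving 231
```

Both methods arrive at 231. The question at issue is not whether such strategies exist, but whether students should be taught efficient methods explicitly or left to devise their own.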

I find it interesting that Sullivan is now calling discovery learning ‘student-centred structured inquiry’. The term ‘student-centred’ was adopted years ago by advocates of ineffective discovery learning approaches, presumably because it sounds caring and positive. I haven’t seen ‘structured’ crop up before and I wonder whether this is a belated response to criticisms that discovery learning does not provide sufficient guidance. It’s a funny kind of ‘structured’ learning that asks students to choose their own methods for solving basic problems. But here we are. Always the language games.

Sullivan points to the 2012 PISA results as evidence for his position. He suggests that what PISA terms ‘cognitive activation’ strategies are associated with improved PISA performance. These come from a survey asking students how often their maths teacher does the following:

- the teacher asks questions that make students reflect on the problem
- the teacher gives problems that require students to think for an extended time
- the teacher asks students to decide, on their own, procedures for solving complex problems
- the teacher presents problems in different contexts so that students know whether they have understood the concepts
- the teacher helps students to learn from mistakes they have made
- the teacher asks students to explain how they solved a problem
- the teacher presents problems that require students to apply what they have learned in new contexts
- the teacher gives problems that can be solved in different ways

This is an odd bunch of questions that mushes together a number of quite different ideas. Some I recognise as part of my everyday maths teaching. Others are things I would introduce only once students have mastered basic concepts, such as presenting different contexts or demonstrating different methods for solving the same problem. The third question seems the most relevant to Sullivan’s argument, although I wouldn’t describe 132 + 99 as a ‘complex’ problem; note that the students answering these questions are 15 years old and are unlikely to still be working on number operations of this kind. From the answers to these questions, PISA constructs an index that it then attempts to correlate with PISA maths scores.

This data has been further analysed by Caro and Lenkeit in a 2015 paper. I don’t think I should publish their diagrams without permission, but if you follow the link and turn to page 13 of the paper, you will see exactly what the extent of the positive association between cognitive activation and PISA maths performance looks like.

This is how they summarise it in the abstract:

*“The results provide consistent evidence of a positive curvilinear relationship between cognitive activation strategies and mathematics performance. The association tends to be stronger in schools with a positive disciplinary climate and for students from advantaged socio-economic backgrounds.”*

PISA also created two other indexes based on survey questions — ‘teacher directed’ and ‘student oriented’. Again, from the abstract:

*“Teacher directed instruction is positively related to mathematics performance, particularly for students from disadvantaged backgrounds, but the association tends to become negative for high levels of teacher directed instruction. Associations of student oriented instruction practices with mathematics performance are inconsistent.”*

So, teacher directed instruction is good, particularly for disadvantaged students, but becomes bad at high levels. You can find the graphs in the appendices on page 85 and they look similar to the cognitive activation ones.
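A ‘positive association that turns negative at high levels’ is an inverted-U shape, which shows up as a negative coefficient on the squared term of a quadratic fit. A minimal sketch of that idea, using entirely synthetic data of my own invention rather than the PISA dataset:

```python
import numpy as np

# Illustrative only: synthetic data with an inverted-U shape, the kind of
# pattern Caro and Lenkeit describe for teacher-directed instruction
# (positive association that curves downward at high index values).
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 500)                     # hypothetical instruction index
y = 20 * x - 8 * x**2 + rng.normal(0, 5, 500)   # hypothetical maths score

# Fit a quadratic; np.polyfit returns coefficients highest degree first.
c2, c1, c0 = np.polyfit(x, y, 2)
print(c2 < 0, c1 > 0)  # negative curvature, positive overall slope
```

The point of the sketch is only that ‘curvilinear’ here is a claim about the sign of the quadratic term, not about any single data point.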

I would challenge Caro and Lenkeit’s statement about student oriented instruction. The graphs for this are on page 86 and show a negative relationship between student oriented instruction and maths performance in *all* of the 62 countries surveyed — the greater the amount of student-oriented instruction, the worse the maths scores. That is nothing if not consistent.

What are the questions that make up the index of student oriented instruction? Well, again they are a bit of a mess but if anything, they seem to align with forms of discovery learning:

- the teacher gives students different work to classmates who have difficulties learning and/or to those who can advance faster
- the teacher assigns projects that require at least one week to complete
- the teacher has students work in small groups to come up with a joint solution to a problem or task
- the teacher asks students to help plan classroom activities or topics

The PISA evidence is a correlation and so we cannot be certain about cause and effect. It’s possible — plausible — that teachers tend to choose to use ‘student oriented’ activities with students who struggle with maths and they tend to choose to use ‘cognitive activation’ with students who are advanced mathematically. If so, the level of maths performance would be the cause and the teaching methods, the effect.

Nevertheless, it is interesting that an advocate of discovery learning would point to the piece of data on ‘cognitive activation’ and ignore parallel data collected and analysed alongside it. If there were better evidence to support inquiry learning, I suspect Sullivan would have gone with that.

The fact is that education professors have been advocating for discovery learning since the invention of education professors. And even before that, philosophers were doing the same. This is not an evidence-based position; it is an ideological one based upon romantic, and historically often racist, notions of natural learning. That’s why basic cognitive science, such as the limited capacity of working memory that makes discovery learning so ineffective for novices, is ignored.

And it is why maths teachers will need to continue to make the case for explicit teaching. If we don’t, we will find inquiry methods imposed on us, as was attempted with the draft curriculum.

\*I am not a fan of the term ‘STEM’, but that is for a different post.

The only thing worse than STEM is STEAM (https://www.sciencedirect.com/science/article/pii/S1877050913011162)
