Maths education experts on a mission
Strange ideas about evidence and scholarship
The Association of Mathematics Education Teachers (AMET) is a UK body representing people who train teachers in how to teach maths. Yes, I know that does not sound particularly promising.
Anyway, AMET have submitted a complaint to Ofsted, the agency that regulates government schools in England. Earlier this year, Ofsted released an excellent report on the factors that influence the quality of mathematics education and AMET are not happy about it.
The complaint is oddly worded, with AMET at pains to stress its academic credentials. They want us to know that, “As many of our members are involved in research, as well as setting and marking academic assignments, we take scholarship very seriously.” Well, indeed. Don’t we all?
AMET claim to have unearthed a whole heap of referencing errors where a point made in the Ofsted report is not supported by the reference given.
“We emphasise that this complaint is not about the content of the report but on the poor match between the content and the references which are used to support it.”
Now, consider the curious case of Footnote 166:
“We welcome the fact that Ofsted have included a wide range of types of research, rather than looking exclusively at Randomised Control Trials. However, this means that the generalisability of the source must be considered. Footnote 166 consisted of a single source that involved research with four children in the United States. This is not a secure research base for making a general point for UK education.”
Something strange has happened. Here, rather than claiming that the reference does not support the content, AMET are taking issue with the quality of the research described in the reference. The reference could be a perfect match for the point made in the report and AMET would still object; they have quietly shifted their argument.
What is the point made in the report?
“Pupils are more likely to engage in disruptive behaviours if they are expected to complete tasks that they have not mastered the component parts of yet. They are more likely to stay on task and be motivated if tasks are achievable.”
Whether it is supported by a small study or a large one, I don't find this statement particularly controversial.
So far, I have dealt with the strongest elements of the complaint. What, for instance, are we to make of the suggestion that we should dispassionately evaluate evidence using logical fallacies?
“A key element of a research review is criticality, considering when, where, how, why and by whom the source was created.” [my emphasis]
It doesn’t matter who did the research. It matters whether they are right or wrong.
And what about the eccentric suggestion that research evidence should only be drawn from recent sources:
“The age of the sources is sometimes a concern. The report includes research ranging from an article published this year to an article from 1980, with a book from 1939 used to illustrate a historical point. Approximately 50% of the references come from sources published within the last 10 years.”
Well I never! Fancy using a book from 1939 to illustrate a historical point! What’s wrong with these crazy Ofsted people?
Apply this principle to the hard sciences and we would have to throw out Newton, Einstein, evolution and the periodic table. Apply it to the substance of maths and most of what we teach in school would be gone, including algebra. Commenting on the age of the research is a curious argument that for some reason seems to be a trope of people who work in teacher training. Is there some feature of education schools that causes people to reason this way? Is it because they are beholden to fashion rather than evidence?
However, the oddest claim is that we should discard research evidence from the US because the US does not do very well compared with other countries on international tests such as PISA and TIMSS:
“More than 40 of the citations related to East Asian countries. This seems reasonable since these countries have performed consistently well in mathematics in international studies such as TIMSS and PISA. However, it was surprising that more than three times this number of references came from the USA. The United States has not been a consistently high performing country in these international comparisons; in fact we outperformed them in the recent PISA (USA 478, UK 502) and TIMSS at Grade 4 (USA 535, England 556), although we had the same score at TIMSS Grade 8 (USA 515, England 515).”
So, AMET found it surprising that in a report written in English, a large number of references came from the US, an English-speaking developed country with a large population and around 4000 universities. Singapore does well on PISA but it only has one school of education so its output of education research will necessarily be limited.
And what’s going on with the logic here? If I want to test two approaches to teaching a mathematical concept, for instance, in order to see which is more effective, why would it matter whether I did that in a country higher or lower in the PISA rankings? At least the US has some cultural similarities with the UK. I suspect that if all the research Ofsted used had been drawn from Singapore and Hong Kong, AMET would have complained about how culturally different those states are to the UK and how we should not assume the findings would transfer to the UK context.
AMET suggest renaming the Ofsted report a 'position paper', implying that it was written to buttress a previously adopted position rather than as a dispassionate review of the evidence. However, it is AMET's complaint that reads as if it were the product of a brainstorming session to find anything and everything that could be thrown at the Ofsted report, no matter how absurd. It is AMET, not Ofsted, who appear to have an agenda to prosecute.