The UK government has decided that the white heat of technology is the road to improvement and is backing Artificial Intelligence as the solution to many problems. This includes an education action plan.
Perhaps contrary to expectations, I am cautiously optimistic about what AI has to offer teachers and students. However, this comes with a few caveats. The first of those is perhaps encapsulated by what we might call ‘the Suno problem’. Michael Pershan on Twitter/X prompted me to revisit this app, which I first looked at when it was a social media phenomenon in April last year. Given a text prompt, it generates a piece of pop music.
Here’s my attempt, Dudley’s Silent Tracks, to get it to generate a piece of English folk music with a female singer about the fact that my home town of Dudley has not had a railway station since the 1960s.
If you listen to it, you will notice it does not use a female voice and it is not English folk music—it has a decidedly American flavour. This illustrates that AI is not yet even close to perfect. However, even if we trained it on the entire catalogue of Kate Rusby so it could do a better job, I don’t think it is ever going to reach the level of art. The lyrics are banal and the music is generic and derivative.
However, if you want something generic and derivative, AI can help. If you want something that, with varying degrees of accuracy, reflects the majority of sources, AI can be useful. Critically, it can save a lot of time.
One use for AI is in planning lessons. No, asking it to produce a lesson plan for an introductory algebra lesson will not generate anything of much use. We need to be more specific and I think this is a key principle. There is a sweet spot of specificity that we need to reach when prompting AI. We need to be specific enough to produce something useful without becoming so bogged down in specificity that we lose any efficiency gains.
One way I have found that AI can help is in generating questions with worked solutions. However, it is best to give it a model to work from—‘Write ten questions similar to this and give worked solutions’—and also give some guidance on what you want to vary between questions. For a very basic example, if I ask ChatGPT to generate questions similar to, ‘3x + 2 = x + 10, solve for x’, it will give a load of questions that all use the pronumeral x and have positive whole-number answers. If that’s not what I want, I need to give it more guidance.
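To make the idea of ‘what to vary’ concrete, here is a purely illustrative Python sketch. This is not something I use in class and it is not how the AI works internally; it simply makes explicit the constraints you would otherwise have to spell out in the prompt: which pronumerals to use and that answers should be whole numbers.

```python
import random

def make_question(rng: random.Random, pronumeral: str) -> tuple[str, str]:
    """Build an equation of the form a·p + b = c·p + d with a whole-number answer."""
    # Pick distinct coefficients with a > c so both sides stay positive.
    a, c = sorted(rng.sample(range(2, 10), 2), reverse=True)
    x = rng.randint(1, 12)        # the intended whole-number solution
    b = rng.randint(1, 20)
    d = (a - c) * x + b           # forces c·x + d to equal a·x + b
    question = f"{a}{pronumeral} + {b} = {c}{pronumeral} + {d}, solve for {pronumeral}"
    solution = (
        f"{a}{pronumeral} + {b} = {c}{pronumeral} + {d}\n"
        f"{a - c}{pronumeral} = {d - b}\n"
        f"{pronumeral} = {x}"
    )
    return question, solution

rng = random.Random(0)
for pronumeral in ["x", "y", "n"]:
    question, solution = make_question(rng, pronumeral)
    print(question)
```

A prompt that names these same constraints—vary the pronumeral, keep whole-number answers, show each rearrangement step—tends to get much closer to what I actually want.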
This means we cannot approach this task clueless. You really need to have a very clear idea of what you want the AI to achieve. As with much technology, those who approach it with the most pre-existing knowledge have the most to gain.
This was evident to me last year when I taught Economics, a subject I have no background in. I did use AI but I had to work hard to monitor its errors and be clear about exactly what I wanted from it. This involved a lot of cross-checking with a textbook. At one point, when the AI and I disagreed, I found myself seeking out an experienced economics teacher to adjudicate.
This is annoying, but not fatal. In the hands of a subject expert, AI can speed up the process of planning.
The UK government announcement also envisages teachers using AI for marking and feedback. Again, this does have potential. I am no longer convinced of the model of teaching writing that sees students write long essays full of errors and misinterpretations that the teacher then ‘marks’ by annotating to varying degrees, writing three points to work on at the end and, depending on how traditional they are, assigning a grade.
It all seems a little pointless. The comments are often similar across students because they reflect some issue with the initial teaching. And unless students are required to respond to the comments by redrafting (not by writing their own comment in response, which is truly the daftest thing I have ever heard of in a sector famed for its daftness), the chances are that hours of teacher work will simply be ignored.
I prefer lots of formative assessment on the journey leading up to the writing of an essay, including sentence and then paragraph work, and whole-class feedback sessions.
However, if you do want to provide individual feedback on each essay then AI is a far more efficient way to do this. We can obtain the possibly small learning gains with a fraction of the effort. Moreover, AI can also offer suggestions about what the whole-class feedback should focus on.
Which students do you think are most likely to read and take on board this individual feedback? I predict it is those who are already achieving highly and are motivated. And this highlights a perennial problem with technology.
The Matthew Effect is named after an enigmatic passage in the Gospel of Matthew (25:29) in a section of the parable of the talents. It states:
“For to everyone who has, more will be given, and he will have an abundance. But from the one who has not, even what he has will be taken away.”
The Matthew Effect, therefore, describes a process that magnifies the advantages of the already advantaged while the disadvantaged fall further behind.
Technology often acts in this way. When we think about the recent pandemic, who do we imagine were most engaged with online learning? The already advantaged. I predict AI tools will act in a similar way because they not only require motivation to engage with, they also demand a sufficient level of prior knowledge to understand the feedback and to notice mistakes or incongruities.
One use you will have seen me make of AI is generating images. Again, these share many of the issues Suno has, but they do have one advantage.
Below is an image I generated for a talk I am giving to teachers next week. The image is from a section where I discuss the idea of the homunculus paradox. A ‘homunculus’ is a little person imagined to sit inside our heads, controlling what we do, and I will argue that when we try to teach generic thinking skills, we are often mistakenly trying to train this nonexistent homunculus.
Note that the slide contains no writing. I will have my own notes to hand to go along with this slide. That’s because I don’t want to put the audience in the position of having to decide whether to listen to me or to look at the slide. By sharing an image instead, I can make use of the fact that working memory appears to have different channels for processing verbal and visual information and so the image is unlikely to interfere with what I am trying to explain.
It would be very difficult to do this without AI. I would have to hope someone had created the kind of image I was looking for or attempt, badly, to create it myself. Even then, the image would likely have copyright restrictions. I am highly aware of this because, prior to AI, I spent many hours searching for images to illustrate my blog posts. Now, allowing for a little trial and error and the notable cases where I simply cannot get AI to generate what I need, I have a tool I can use.
I have focused on some practical and specific applications of AI because that’s where I think it is right now. There are likely to be many more. Fanciful talk of humanlike AI tutors who are capable of replacing teachers is just fanciful talk at this stage. That day may come but for now, the best use of AI will be made by knowledgeable and experienced subject teaching experts.
"The lyrics are banal and the music is generic and derivative."
It's destined to be a No. 1 hit!!
I am a high school English teacher. I use an online form to collect students' homework paragraphs in a Google Sheet. I used an LLM to write a script that sends the responses (not the students' names or identifying information of any kind) to ChatGPT for proofreading using a custom prompt that:
1. Highlights spelling, punctuation, and grammatical errors.
2. Adds a numbered list of each error with the correction and a URL to a reputable website where students can read more about the error and how to fix it.
3. Adds a checklist to aid formative assessment, saving me from writing the same comments repeatedly.
The script then exports each proofread response to a Google Doc for printing. I then provide handwritten feedback on the content of each response. I have not actually used this workflow in class yet. I created it as a proof of concept over the school holidays.
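In rough outline, the proofreading step looks something like the sketch below. This is a simplified illustration, not my actual script: the prompt wording, function names and the use of the OpenAI Python library are all stand-ins for whatever the LLM generated for me.

```python
# A hypothetical sketch of the proofreading step described above.
# The prompt wording, function names and model name are illustrative only.

PROOFREADING_INSTRUCTIONS = """\
You are proofreading an anonymous student paragraph. Return:
1. The paragraph with spelling, punctuation and grammatical errors highlighted.
2. A numbered list of each error, its correction, and a URL to a reputable
   reference page where the student can read about the error and how to fix it.
3. A short checklist of the error categories found, for formative assessment."""

def build_messages(paragraph: str) -> list[dict]:
    """Assemble the chat messages for one student response.

    Only the paragraph text is sent -- no names or identifying information."""
    return [
        {"role": "system", "content": PROOFREADING_INSTRUCTIONS},
        {"role": "user", "content": paragraph},
    ]

def proofread(paragraph: str) -> str:
    """Send one response to the model. Needs an OPENAI_API_KEY; not run here."""
    from openai import OpenAI  # assumed dependency: pip install openai
    client = OpenAI()
    reply = client.chat.completions.create(
        model="gpt-4o-mini",   # placeholder model name
        messages=build_messages(paragraph),
    )
    return reply.choices[0].message.content
```

The real version also reads the responses out of the Google Sheet and writes each proofread result to a Google Doc, which I have left out here.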
I have also used ChatGPT to create custom GPTs and projects to speed up the creation of teaching resources. The outputs are not perfect, but they save me a lot of time.