Our three-year study of AI use by Australian schoolteachers is throwing up some very interesting use cases, such as one school principal using AI to produce ‘personalised’ video greetings for each of his school’s new Year 7 students.
It is now relatively straightforward to use a ‘text-to-video AI generator’ to produce clips of yourself ‘talking’. First, you upload a few hours of footage of yourself actually speaking to camera. The software can then generate fairly realistic videos of you appearing to ‘say’ whatever new lines of text you input.
This has obvious appeal for people whose jobs involve directly addressing an audience – including teachers, lecturers, and this school principal wanting to make a connection with a new intake of students.
Whether you see such AI puppetry as a ‘smart’ or a ‘creepy’ use of technology depends very much on how you see schools and the role of the school principal.
On one hand, your immediate reaction might be one of disdain. This could be seen as an utterly inappropriate use of AI that sends a hollow message to a new and vulnerable cohort of students – conveying inauthenticity and a general lack of care and respect. As the teacher who relayed this example to us reasoned: “I think that’s horrifying. Because it’s the relational part [of teaching]. You don’t fake a relationship” [Tim, Brookdale High School, 18.06.24]. If nothing else, this use of AI does not sit well with ongoing concerns over students’ use of ‘deepfake’ tools to fabricate videos of their teachers.
On the other hand, the above example could equally be seen as a smart use of AI – allowing this school principal to do something that would otherwise have taken him a couple of days (or more) to achieve.
The advantages are not just about saving time. Rather than sending a generic email, this use of AI allows the principal to produce content that is likely to resonate with the new students – giving them a sense of who he is and what he looks like, and projecting a welcoming demeanour from an otherwise fairly remote authority figure in the school.
AI also ensures that each video shows the principal looking fresh (rather than exhausted), speaking clearly, and sounding on top of his game. Thirteen-year-old students are far more likely to watch a 30-second video than open an email or read a letter. And how much less authentic is this type of mass ‘personalisation’ than an email sent with a Dear <INSERT FIRSTNAME> introduction, or a computer-produced handwritten card?
As the same teacher who recounted this use case to us also reflected:
“That said, I produce a whole bunch of fairly crappy videos, that they’re effective enough, but you know, they could be better, I would love to be able to feed my videos into an AI engine and shake them up a bit, make them more polished” [Tim, Brookdale High School, 18.06.24]
In our research we need to be careful not to jump to immediate conclusions about the ‘rights’ and ‘wrongs’ of any particular use of AI. Teachers are professionals who make decisions all the time about whether to do (or not do) things with AI tools. Our main interest lies in unpacking everything that happens when teachers’ work is augmented by (or perhaps outsourced to) AI. As such, this particular use case raises a host of further questions …
- Why does this use of AI ‘work’ for this principal? Why would a principal see this as a viable use of AI?
- What ‘gains’ and ‘efficiencies’ are perceived? Who is gaining and, as a result, who else might be losing out?
- What are the consequences of this AI task being carried out – what are the immediate and longer-term ramifications? For example, how did different Year 7 students actually feel about receiving these videos?
- How could this task have been carried out *without* using AI, and what do the perceived inconveniences of *not* using AI to carry out this task tell us about the nature of principals’ work?
- What would have happened if this use of AI had not occurred, and the task not been carried out at all?
- Are there variations on this task that the principal would see as inappropriate (e.g. making a welcome message for all parents, or new staff)?
- What would a school look like if this use of AI was the norm?
- What does this tell us about the wider systems and contexts that surround the principal and this AI software – e.g. the school as workplace, the wider education system, local community, the Australian EdTech marketplace?