Earlier this week I was invited to take part in a round table discussion hosted by TES and Jisc at the University of Glasgow. This was a very small event (just 10 of us), a sort of pre-conference gathering before the official start of the THE Teaching Excellence Summit.
I’m not quite sure how I got invited but I suspect it was a combination of being a ‘kent’ face to Jisc, being local and being female. It probably comes as no surprise that I was one of only two women in the group, and that everyone in the room was over 40 and white . . .
It was an interesting, free-flowing discussion which is being written up for a future TES article. Nothing earth-shatteringly new was discussed. We made no major breakthroughs. I may have ranted a bit . . . well, I kind of felt I had to, as I was the only person at the table from a university below DPVC level.
We didn’t really talk about AI as such that much, but we did talk about data: how to get it, how to use it, who owns it, and a little bit about what it could do for learning and teaching. This inevitably brought us to retention, predictive analytics and ethics. It was heartening, dear reader, to see the agreement in the room about just how data can or cannot help with this, and about the need for much more research into what data is actually meaningful to collect and how to then make (and resource) effective interventions. I also made sure I got in the point that data is not neutral, and that there is bias inherent in AI. I possibly couldn’t resist using the analogy of the make-up of the group sitting at the table having the discussion . . .
We did have quite a bit of discussion about the role of edtech companies, the seemingly never-ending issues of (lack of) interoperability in university systems, and just what it is we are trying to do with data. Nothing particularly new, really, seemed to be the consensus. But still we are being told that the “business” of education must surely be improvable with data, AI and machine learning. I may have ranted a bit more about the current political climate, the danger of the promise of “personalisation” and the reality of increasing homogenisation.
There was a throwaway remark about “feeding the beast” in relation to all the data/data exhausts that could be “consumed” and “industrialised”. At that point, David Bowie popped into my head. Well, not literally, but the line “scary monsters, super creeps”, except this time I was changing the words in my head to “data monsters, (neoliberal) super creeps”.
I do think there is potential for data and some elements of AI within education and wider society. I also think that right now it’s really, really important that we in education are leading critically informed discussions with our students and the rest of society about how “it” all actually works, who is in control, who is programming the AI, and who owns the data. If we don’t do it, then we will just be consumed by the edtech super creeps. They will inevitably sell our data back to us, in workflows they think are appropriate for an efficient (ergo effective), dashboarded-to-the-max student journey, but which actually might not be that great a learning or teaching experience.
I didn’t take that many notes, so I’m looking forward to reading the TES article if/when it appears, and I can maybe write a more considered reflection then. In the meantime I’ll leave you with a bit of David’s scary monsters.