Not quite finding ada: some thoughts on ethics, gender and the humanization of chatbots

Over the past month I’ve spoken at a number of events where I have been explicitly calling for more critical and ethical discussion around the use of data and the implementation of any “digital” system, not only in universities but throughout the education sector.

So I need to walk that walk and not just spout rhetoric. This week at the ALT Scotland meeting, there was a presentation about a digital assistant (chatbot) system developed by Bolton College. Aftab Hussain and his team have built a pretty impressive system that allows staff and students to access a range of data (information) about their timetables, exams, and where to find out about services. You can read more about their work here.

This is all “exciting stuff”, and seeing and hearing the real-time responses was pretty impressive. The college are very lucky to have people like Aftab and his team who are able to develop this kind of system in house. I don’t think many colleges, or universities for that matter, have a team of developers who can do, or have the time to do, this type of work. This post is not criticising their work; it is just sharing some wider questions and thoughts that it raised for me around the development and implementation of systems like this in education.

Firstly, this system has been called ada. Throughout the presentation the system was referred to as “she”: humanisation coupled with classic gender-bias stereotyping of the helpful, subservient, “user friendly” female. The humanisation of such systems troubles me. The more it was referred to as “she”, the more agitated I got. Ada is not a person; it is a system linking APIs and processing data from multiple sources.

This led to questions around ethics and the who, where, what and when of any data processing, and I was glad that ethics were highlighted in the presentation. But wouldn’t you know it, this is all GDPR compliant. Well, most of it is, apart from some fuzziness around the use of voice-activated systems like Alexa (hello Amazon, you data-loving monster).

I am increasingly seeing GDPR around institutional systems both as an assumption of privacy and data protection for users and as a great get-out excuse for not doing things.

I wonder if all the users of digital assistants really understand the implications of where their data is going or how it is being used. Whilst data may be anonymised, I kind of suspect that in this case Wolfram Alpha will be able to use patterns of queries to develop more (biased) algorithms.

So whilst I can see the benefits of not having to trawl around websites to try and find out where to get information about bursaries, timetables etc., and I know that many students don’t want to, or perhaps don’t know who to, ask for help, I have to say I was impressed by what must be a pretty robust institutional data architecture. I couldn’t help wondering: who is making the decisions about what data is added to the system? The low-hanging fruit (haven’t used that phrase in a while) is all there, but what next?

Whilst I was at the loo after the session, I noticed that there were free sanitary products – what a great idea. Sadly we have period poverty in this country, and having access to free sanitary products in colleges is wonderful. Asking where to find free sanitary products could be quite embarrassing, on several levels, for lots of women. Wouldn’t it be great for that to be included in a digital assistant? I wonder how many typical (and by typical, I mean male) developers would think of adding that to the system, or highlighting it as a key feature? Hello, Caroline Criado-Perez and Invisible Women: Exposing Data Bias in a World Designed for Men.

Broadening understanding of digital assistants, of what they really are, what they can do and what they could or shouldn’t do, is, I feel, increasingly an area where education should be taking the lead. I can’t help thinking that there is an opportunity for educational developers and researchers to work with central teams like the one in Bolton to develop an approach similar to research ethics applications for this kind of work. It’s not enough just to wave GDPR and check that data box.

Surely that would help to broaden understandings of terms such as “risk”. In ethics applications you have to be explicit about any risks to your subjects, and sharing data in this day and age is a huge risk.

Again, during the presentation it was highlighted that staff could ask the system to show them students “at risk” in their courses. Risk in this sense was based, I presume, on assessment activity and VLE data: students at risk of failing. But there are a lot of other nuances of risk (including mental health) that our current student data doesn’t, and quite probably shouldn’t ever, be able to indicate.
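To make concrete how narrow that definition of “risk” is, here is a minimal, entirely hypothetical sketch of the kind of rule such a dashboard might encode. The names and thresholds below are my assumptions for illustration, not Bolton College’s actual logic:

```python
# Hypothetical "at risk" flag of the kind described above.
# Thresholds and field names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class StudentRecord:
    avg_assessment_mark: float      # percentage, 0-100
    vle_logins_last_30_days: int    # VLE engagement proxy

def at_risk(record: StudentRecord,
            mark_threshold: float = 40.0,
            login_threshold: int = 4) -> bool:
    """Naive flag: low marks OR low VLE engagement, nothing else."""
    return (record.avg_assessment_mark < mark_threshold
            or record.vle_logins_last_30_days < login_threshold)

# A student coping academically but struggling in other ways
# (e.g. with their mental health) is invisible to this rule:
print(at_risk(StudentRecord(avg_assessment_mark=72.0,
                            vle_logins_last_30_days=20)))  # False
```

The point of the sketch is that whatever thresholds you pick, a rule built only from assessment and engagement data can only ever surface one kind of risk.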

I also heard the phrase “calm technology” for the first time. Calm, or controlling? We will drip-feed you the data we think you need and lull you into acceptance . . . and when you hear “I’m sorry, I don’t have that information; I’m sorry, I can’t answer your question”, we will send you a video to divert your attention to something we think you might like, based on the “calm” experiences of 6 million other users. We will do as much as we can to divert you from speaking to an actual person . . . Sorry, I might have got carried away there, but there is more than a hint of “unexpected item in the bagging area” about all of this.

So, whilst I can see the appeal of digital assistants, I really think we need to have some wider discussions and debates about just what they are, and who is involved in developing and evaluating them.  

A critical community based approach to changes in HE #arlg19 keynote

Earlier this week I was delighted to give a keynote at the Academic and Research Librarians Group annual conference at the University of Teesside, Darlington.

Information literacy is a central theme in the work I have been doing with my co-researchers and co-authors, Bill Johnston and Keith Smyth. So in the talk I focused on some of the information literacy aspects of our recent book.

A critical understanding of the information structures being built around every aspect of our daily lives is becoming more and more important. This recent DEMOS report, Warring Songs: Information Operations in a Digital Age, is worth a look, particularly for some of the militaristic language it uses. Control of what they term “information operations” is not just the battleground of the future; it’s the battleground of now. Ensuring our education systems (at every stage) develop holistic and discipline-specific approaches to information literacy is more vital than ever, so that we can all, as the report puts it, defend (I prefer critically understand and question) ourselves against those who exploit and control information operations.

At the edTech19 conference last week I was struck by a couple of presentations about students’ use of video. The studies showed that despite staff diligently spending time curating videos within module spaces in VLEs, students were still going to YouTube if they didn’t understand “stuff”. This was causing some concern, as the students had also stated that they weren’t totally confident about the veracity of the videos. When I asked in one session if the study was going to lead to including some more information literacy sessions on evaluating video resources in that discipline, I was told (paraphrasing here): no, not really; there is some study skills material available, but we really just don’t have any room in the curriculum for that. We need to make room for “that”. We need to ensure that our students understand where and how information/content comes from and how to assess it. It can be, and is being, done (thank you, Wikimedia Foundation) – but we need to collectively do more.

My slides from the talk are available here.

Talking about digital transformation at #edtechIE19

Last week I had the honour of presenting a keynote, along with my colleague Keith Smyth, at the Irish Learning Technology Association (ILTA) annual edTech conference in Dundalk.

Keith and I used the framework of our new book to provide the key points for our talk, which we titled “Re-imagining digital transformation through critical pedagogy”. In the talk we outlined our approach, which situates critically informed academic development at the heart of digital transformation.

Coming the day after the long-awaited publication of the Augar review into funding for English HE/FE, where the headlines were all about the money, it was timely to look at developments in terms of building human agency and criticality, and not just in terms of buying and charging for education as a service.

To access our slides, click on the image below.