Finding joy in ethics and criticality: reflections on #altc21

I have to admit that at the start of the conference, I felt pretty jaded. It’s been a long year. I haven’t had a proper break – that’s my own fault – not blaming anyone but me for that. And like everyone else I’ve had, and continue to have, my fair share of challenges this year. Another online conference wasn’t exactly filling me with eager anticipation.

There’s always something of a start-of-the-new-term feel about the ALT annual conferences, which is one of their strengths. That is also, historically, why it’s been a challenge for some people to attend the physical conferences. One positive thing about the move online is that it is a lot more flexible, and accessible for many. Anyway, like I said, I wasn’t really feeling any excitement for anything at the start of the week – online or in the “real” world! I knew I would be dipping in and out of the conference due to work commitments, but as is so often the way with ALT conferences, it – and more importantly the ALT community – slowly drew me in.

The keynotes were, as ever, very strong this year. Sonia Livingstone’s “The datafication of education: in whose interests?” focused on her research in schools around the use and understanding of data (particularly children’s understanding of data and how it is used). The give and take of data in schools (and throughout education) is quite unbalanced. The ‘system’ takes data, often without any real questioning from students or wider society. Schools/colleges/universities are generally trusted entities, with (at least here in the UK) a legal duty of care for their students. However, as more 3rd-party systems are integrated into education, and more data is being given to companies, the balance is changing. They take the data and offer it back in ways that they choose. Sonia highlighted that adults often give children a false sense of trust about managing data, without highlighting that once a company has your data, despite GDPR, there is a lot it can do with it without you realising. Just what are Google/Zoom/Microsoft etc actually doing with all the extra data they have collected over lockdown, for example? The need for data literacy for us all, not just kids, is increasingly important.

Data literacy was central to Mutale Nkonde’s keynote, based on her 2019 paper “Advancing racial literacy in tech”. Mutale expertly took us through the bias of AI and algorithms, highlighting in particular the racial bias in social platforms (TikTok was cited here) with their implementation of data proxies for popularity, which clearly have historical racial bias “baked in”. Mutale encouraged us all to question and have more conversations about data, AI and algorithms, and to participate in projects such as AI for the People, which aims to develop and support the ethical use of data. Mutale also reminded us that algorithms are IP and so have commercial confidentiality on their side. Companies do not need to share the algorithms they use. I for one think that should be challenged more, particularly in education. If we use AI or any 3rd-party company and it is harvesting data, then part of the contract should be full disclosure around how that data is being used, so that there can be informed discussions around what patterns, historical trends, etc the algorithms are being built on.

Starting these conversations can be tricky. That’s where the ALT Framework for Ethical Learning Technology (launched at the conference) might come into play too. During its launch John Traxler asked if we need to decolonise educational technology. This sparked off a bit of a debate on the ALT mailing list, so I think the answer is a clear yes! Adapting the statements in the framework into questions would, imho, be a good starting point for conversations about the ethics of technology, the ethical use of data, and what that actually means in context.

The highlight of the conference for me was the final keynote from Lou Mycroft. Lou is one of the founders of #JoyFE. This really did bring back my #joy. I loved Lou’s explanation of joy as an intentional practice, of the power of being affirmingly critical but not cynical, of quiet resistance, of the joyful militancy of embracing “the power of giving away power”. I loved the way Lou weaved ideas around leadership, around transformation being a start not an end point, around turning values into questions. For example, what would assessment look like as a practice of hope? What would timetabling look like as a practice of care? I would encourage you, dear reader, to watch all the keynotes, as well as the other sessions.

For me the ALT-C conferences have always been places and spaces of joy, for sharing ideas, for getting re-energised, and also for getting confidence from the community to continue (or start) some bits of quiet resistance. Lou proposed leadership as being more about co-ordination, not control. On reflection, I think that is a strength of ALT too: it can, and does, provide co-ordination for the community. The range of special interest/member groups are a living example of that.

The conference also saw the launch of the ALT/ITN co-production “The Future of Learning”. Lots of “shiny” tech stuff there, and worth a watch not to see the future, but to see what is happening now. There wasn’t a lot of critique of technology/AI/data, so I wonder, if there were to be another episode, whether a theme of the ethical use of technology would be apt? That would give a space for the new framework and the work of the ALT community in this area to be highlighted. It could help raise wider awareness of the need to question how, where, why, when and with/by whom data is collected and shared. That might provide a way to show how some joyful resistance and coordinated leadership can allow for a more equitable, ethical, caring and joyful future for learning.

Many thanks to the conference co-chairs, the conference committee, the ALT team, and everyone who participated in the conference.

ALTC delegate open badge image

3 thoughts on “Finding joy in ethics and criticality: reflections on #altc21”

  1. Good stuff Sheila,

    The idea that a person’s data should be under their control and not simply appropriated by powerful corporations/government agencies seems an obvious candidate for inclusion in the legal system in a democracy. So, how come it happens at all? I reckon a mixture of systems design and vested interests are at work, but we are not helped by the lack of transparency, and the lack of knowledge on the part of the majority of us serves to undermine effective challenges.

    1. Thanks for the comment Bill. Yes, I agree, GDPR was the first step in trying to get back some level of control. TBH I think lots of “us” just didn’t (and still don’t) realise how valuable data is to companies/governments.

      1. Indeed so,

        The technosphere has been so convenient and ubiquitous it’s hard to see how things could have gone otherwise, especially as the interfaces feel very individual and outwith any sort of collective. You sit at your screen and, especially during the pandemic, your contacts with others are channelled by technologies like Zoom.
