I wish I'd said that . . . reflections from #digifest17

You know how it is: no matter how much you plan for a debate or live speaking situation, there's always something that pops into your head on the train home that makes you think, "oh, I wish I'd said that." Since last week's Digifest I have had several of those moments.

As I wrote about last week, I took part in the "do analytics interventions always need to be mediated by humans" debate, defending the motion. I tried to explain my thoughts in this post. Richard Palmer from Tribal put up a strong case for the other view. In the end, despite my claiming a Trump-like, spectacular, popular victory (many people said so), the final vote was pretty close. That was due mainly to the word "always", and to Richard's pretty convincing argument that some alerts and "low level" interventions can be automated and so do not "always" need human mediation.

However, of course they do. The final intervention or action from any alert or analytics intervention has to be mediated by a human. In the context of our debate, that means a student actually doing something as a direct result of that intervention. I wish I'd said that. And if students just ignore the automated alerts/interventions, what then? Are we measuring and monitoring that? And what if all the power goes off? What about alerts then? What happens when a student challenges the alert system for allowing him or her to fail? Oh, I wish I had said that . . .

We do already alert students in a number of ways, and we need to ensure we are having a dialogue with students so that we all understand what actually motivates them, and keeps motivating them. That way, any student apps or alert systems we do produce won't just suffer from the Fitbit syndrome, where obsession leads not to motivation but to disengagement.

The other thing – well, it's actually a word – that I wish I had said was "praxis". Part of my argument was to draw some comparisons (very quickly and, I confess, somewhat superficially, as I didn't have a huge amount of time to prepare for the debate) between learning analytics and Freire's seminal Pedagogy of the Oppressed. I did want to get the notion of praxis into the debate, but on the day it didn't quite happen. However, Maha Bali picked this up over the weekend and commented on my blog:

“Great title, Sheila, and bringing in Paulo Freire inside it is an additional bonus! I love where you’re going with this but would love it if you had the opportunity to take it further into more of Freire’s ideas with regards to praxis, consciousness-raising and empowerment of the oppressed. . . . What I think is interesting is the thinking of Paul Prinsloo on how to decolonize learning analytics such that learners possibly hold more power/control over their data and how it’s used. This could be a third path…”

I couldn’t agree more. I think it really is time to discuss praxis in this context, which brings me back to the core of my argument last week: we need more debate and dialogue around learning analytics, and around the theoretical approaches we are using to frame those dialogues.

I know this is a sweeping generalisation (please forgive me, dear reader), but I do worry that emerging design models, partly driven by more fully online delivery, are defaulting to the now seemingly standard pattern: read/watch, quiz, a bit of “lite” discussion on the side of the page, badge/certificate, and repeat. These are easy to measure, to “alert-ify”. But they are not always the best educational experience.

I missed LAK this year and only saw a few tweets, so I’m sure there is a lot of work going on at much higher levels in the learning analytics community. However, there is still a nagging feeling in the back of my brain that discussion of Bayesian regression modelling is still quite dominant. I know that last year at LAK there was a concerted effort to work with the learning sciences community, to bring in more learning theory. But reflecting on last week, it seems to me that behaviourism is going to become (even more) embedded in our systems and our KPIs without us actually realising it, or having the chance for an informed dialogue with our practising teachers and students. A post from Doug Clow back in 2011 springs to mind: is the sinister sausage machine here?

Learning analytics, at least in Digifest terms, seems to be the current “future now”. There were so many sessions with it as their main theme that it was hard to avoid. On the one hand, I think this is great to see: the debate, the dialogues I have been arguing for, are being given a chance to begin. We just need to ensure that they are given enough critical space to continue. And to that end, I guess I should get my “butt into action” and take a bit more time to write something more informed about praxis. In the meantime, here’s a short interview where Richard and I try to summarise our debate.

9 thoughts on “I wish I'd said that . . . reflections from #digifest17”

  1. Your debate with Richard was one of my favourite sessions at Digifest. I’m going to blog some of my thoughts on my team’s TEL blog and also my own personal blog over the next few weeks.

    I agreed with pretty much everything you said and think that universities need to proceed with caution when it comes to Learning Analytics. I fear that many will see it as a silver bullet to solve a multitude of complicated problems and it just won’t have the desired effects people want. I’m very sceptical.

  2. Hi Sheila, I just briefly ran across your blog as I was perusing some recommendations from elearningindustry.com, and this debate between humans vs machines intrigues me to no end. I haven’t watched any video of you as yet, but I wonder if anyone brought up Neal Stephenson’s https://en.wikipedia.org/wiki/The_Diamond_Age. I believe in his vision for the future (even more so now that he works for Magic Leap). Imagine a child having a companion device from morning til naptime, listening & talking with the child, able to see & hear everything the child does in the real world. This companion device can then create the entertaining mixed reality stories the child wants, stories that actually teach how to solve real world problems, all the while encouraging bravery and inquisitiveness.

    I was amazed as I read Stephenson’s book, written in the 90s. I just feel like it would be a dream to be able to have the best teacher with you all day. I can foresee teachers digitizing themselves, their looks, their voice, their choices and proclivities and parents choosing to buy the most well-respected digital teacher that will come into view with augmented/mixed reality like the Hololens or Magic Leap.

    Sure, all of this may be 5 or 10 years away, but it is almost here, and teachers can hasten its arrival. You mentioned Paulo Freire in your responses, and I don’t disagree with anything you’ve said. It’s just that the future is the merger of human & AI, the digital Freire, if you will.

    The problem will probably be the gap between rich and poor, the haves and have-nots. Ideally every African child should have the same quality of education as the children of Silicon Valley billionaires, but this is hard to do. Digitizing the best should make this easier, because quality won’t degrade as you make infinite copies. Exciting idea, isn’t it?

    What are your thoughts?

    1. Hi Leonard – thanks for taking the time to post your comment. No one brought up the book during the debate, but having read it, like you I find it is never far from my mind. I still think we have to be cautious – remember, not everyone in Stephenson’s world had access to this educational experience, and one of the main narrative drivers is the fact that this particular “primer” ended up with the wrong young lady! I still worry that our AI isn’t quite there yet and could develop with our current inherent biases.

      1. AI must develop with our biases in order for us to recognise it as “I”. And this is the problem. Actual machine consciousness is likely to be unrecognisable to us, should it ever evolve. And who knows if it has or has not? We think and feel the way we do because of our embodiment. For all we know machines have been weeping over our foolish cruelty for years, or care so little about us that we are as if terra nullius.
