2018 ALT Annual Conference – will you be a part of it?

[Image: ALT logo]

So this post is a blatant plug for this year's ALT annual conference, being held in Manchester from 11–13 September. The ALT conferences are always a highlight of the UK (and increasingly international) learning technology/educational development year. But this year is going to be even more special as it's ALT's 25th birthday, and not only am I the Chair of the Board, I am also co-chairing the conference with our President and my good friend, Martin Weller.

I can't even begin to express how excited Martin, myself and the whole conference committee are about the keynote speakers – Tressie McMillan Cottom, Amber Thomas and Maren Deepwell. The submissions for the conference have been of an amazingly high standard too, so it looks set to be probably the best birthday bash of the year. Early bird registration has just opened – so if you do want to come to the conference, and let's face it why wouldn't you, then head over to the conference website to get all the details.

Talking assessment data and dashboards

GCU is part of a group of 12 institutions across the UK taking part in a small pilot project with Jisc and Turnitin as part of the wider Effective Analytics Programme. The project is exploring how (and, more importantly, what) data from Turnitin can be used effectively within the Jisc Learning Records Hub. A key part of this work is engaging with stakeholders from across all the institutions involved in the project. To this end, Kerr Gardiner is facilitating a series of workshops with each institution, and earlier today it was our turn.

Although we can get data from Turnitin, as with quite a lot of systems the reports we can access are all pre-designed by Turnitin. We can access high-level data at module level – overall numbers of submissions, marks with various types of feedback (QuickMarks, audio, etc.), marks using rubrics – but it all comes as either huge or not-so-huge CSV files, and it is missing some of what we consider to be vital data.
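To give a concrete (if simplified) sense of what working with those exports involves, here is a minimal sketch in Python/pandas that loads a hypothetical module-level CSV export and pulls out the kind of high-level counts described above. The file name and column names (submission_id, grade, feedback_type, module_code) are illustrative assumptions, not the actual Turnitin export schema.

```python
import pandas as pd

# Hypothetical Turnitin export: one row per submission.
# File name and column names are assumptions for illustration only;
# real Turnitin exports vary by report type and are pre-designed.
df = pd.read_csv("turnitin_module_export.csv")

# High-level, module-level counts: total submissions, how many have a mark,
# and how many have any feedback attached (QuickMarks, audio, rubric, etc.).
summary = (
    df.groupby("module_code")
      .agg(
          submissions=("submission_id", "count"),
          marked=("grade", lambda s: s.notna().sum()),
          with_feedback=("feedback_type", lambda s: s.notna().sum()),
      )
      .reset_index()
)

print(summary)
```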

So it was good to have an opportunity to discuss what our needs and priorities are. One of our key requirements, and frustrations, is that we can't get date stamps for when assessments are uploaded and then when the marks and feedback are submitted. Like most institutions we have an agreed feedback turnaround time, and it would be really useful to see if we are meeting it. That data is not available to us, and it would be really good if it was.
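If those date stamps were available, checking performance against an agreed turnaround would be a very simple calculation. Here is a hedged sketch, assuming hypothetical submitted_at and feedback_returned_at columns and an illustrative 15-working-day target; neither the columns nor the file currently exist in the data we can get.

```python
import numpy as np
import pandas as pd

# Assumed columns: Turnitin does not currently expose these timestamps to us,
# so both the file and the column names are purely illustrative.
df = pd.read_csv(
    "assessment_timestamps.csv",
    parse_dates=["submitted_at", "feedback_returned_at"],
)

TARGET_WORKING_DAYS = 15  # illustrative institutional turnaround target

# Working days between student submission and return of marks/feedback.
df["working_days"] = np.busday_count(
    df["submitted_at"].values.astype("datetime64[D]"),
    df["feedback_returned_at"].values.astype("datetime64[D]"),
)
df["within_target"] = df["working_days"] <= TARGET_WORKING_DAYS

print(f"{df['within_target'].mean():.0%} of assessments met the turnaround target")
```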

We also had quite a bit of discussion around some of the UI issues which relate to data too. The new Turnitin Feedback Studio interface is really user friendly, but setting up an assignment is still quite clunky and it's really easy to miss some of the vital parts – like the grading information. A few tweaks there might be really useful. We also discussed having an option to mark whether an assignment is formative or summative as part of the set-up; that would be another really useful data set to have, for a whole host of reasons around assessment weighting.
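As a very rough illustration of the kind of extra setup metadata being asked for, here is a hypothetical assignment record with an explicit formative/summative flag and weighting. None of these fields exist in the current Turnitin setup screens; they are assumptions to show what the data set might look like.

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class AssignmentSetup:
    """Hypothetical assignment metadata; not an actual Turnitin data model."""
    title: str
    module_code: str
    assessment_type: Literal["formative", "summative"]  # the flag discussed above
    weighting_percent: float  # share of the module mark; 0 for formative work

essay = AssignmentSetup(
    title="Essay 1",
    module_code="ENG101",
    assessment_type="summative",
    weighting_percent=40.0,
)
print(essay)
```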

We were also asked to think about dashboards. Is it just my imagination, or is it illegal to have any kind of discussion about learning analytics without mentioning dashboards? Just now our focus for assessment data is really on providing staff with more relevant access to data. I think, in terms of overall learning analytics, there is an opportunity to get far greater buy-in from staff, and a more nuanced discussion about data and learning, when it is in the context of assessment.

Assessment and feedback is always high on everyone's list, but we need to be really mindful of how and when we provide data to students around assessment, due to the complex emotional impact it can have. In her recent post on student dashboards, Anne Marie Scott highlighted the need for more careful thought around their development. She also refers to Liz Bennett's recent research around student dashboards and the notion of thinking of them more as socio-material assemblages. I really hope that part of this Jisc work will focus on understanding data needs and working with staff first, before rushing to add other elements to their developing student-facing dashboard.

Anne Marie also highlighted the need for greater understanding and development of feedback literacy – getting students to recognise and understand what feedback is. Part of our discussion was around having a way not just to record if and when students have accessed feedback, but also a way for students to feed back on their feedback – perhaps an emoji to indicate whether they were happy with it. Again, access to this type of data could be really useful at a number of levels, helping to start some more data-informed discussions and playing a small part in the development of wider feedback literacy.
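Given that this pilot is about getting data into the Jisc Learning Records Hub, one way such a "feedback on the feedback" signal might be captured is as a simple statement-style record alongside the existing assessment data. The sketch below is a hypothetical, xAPI-flavoured Python dictionary; the verb, activity identifiers and the reaction extension are illustrative assumptions, not an agreed Jisc or Turnitin schema.

```python
from datetime import datetime, timezone

# Hypothetical record of a student reacting to their feedback with an emoji.
# All identifiers below are made up for illustration; they are not part of
# any agreed Jisc Learning Records Hub or Turnitin data model.
statement = {
    "actor": {
        "account": {"homePage": "https://vle.example.ac.uk", "name": "student-1234"}
    },
    "verb": {
        "id": "https://example.ac.uk/verbs/reacted-to-feedback",
        "display": {"en": "reacted to feedback"},
    },
    "object": {
        "id": "https://vle.example.ac.uk/assignments/essay-1/feedback",
        "definition": {"name": {"en": "Feedback on Essay 1"}},
    },
    "result": {
        "extensions": {
            # A single emoji indicating whether the student was happy with the feedback.
            "https://example.ac.uk/extensions/feedback-reaction": "🙂"
        }
    },
    "timestamp": datetime.now(timezone.utc).isoformat(),
}

print(statement["result"]["extensions"])
```

Even something this lightweight, aggregated over a module, could give staff a quick, data-informed read on how feedback is landing.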

I'm looking forward to seeing how this work progresses over the coming months, and thanks to Kerr for his informed facilitation of the session – and of course for introducing us to giant post-it notes.

[Image: giant post-it notes]

Collaborative clustering around learning analytics

[Image: Enhancement Themes diagram]

Earlier this week I attended the second meeting of the QAA Scotland Enhancement Theme learning analytics cluster, hosted by Ainsley Hainey, Brian Green and Helen Gough (University of Strathclyde). The Enhancement Themes are a key distinguishing aspect of Scottish HE in relation to quality measures around learning and teaching, and sectoral sharing of practice. Staff and students actively collaborate across the sector during the three-year theme life-cycles, and they are a really important part of university life.

The current theme is Evidence for Enhancement: Improving the Student Experience.   Learning analytics is one of three community clusters, the others being the creative disciplines and employability and distance learning.  From the names of the clusters alone you can start to get an idea of how broadly the theme is investigating the notion of evidence for enhancement. This is not just a numbers exercise!

Unfortunately I missed the first cluster workshop earlier this year; however, the outputs from it formed the basis for the discussions on how to progress the work of the cluster, the most valuable of these relating to which areas to focus on, and to starting to scope out areas of work for student interns.

Quite a lot of the discussions yesterday came back to the need for appropriate definitions and clarity of scope for any learning analytics initiative. This was something we were certainly very aware of when we developed our learning analytics policy, and we used the SoLAR definition:

the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs

Anne Marie Scott, from the University of Edinburgh, also reminded us of the principles that Edinburgh have developed, along with their data governance structures. Very useful stuff indeed. It's also worth looking at Anne Marie's recent blog post on student dashboards.

Whilst there is an undeniable interest in learning analytics across the Scottish sector, uptake and progress are still patchy, with many of us still at the very early stages of the work. The Enhancement Themes approach does give us a very distinctive, and in many ways far richer, approach to quality measures in relation to learning and teaching, and it looks like this will continue through the outputs of the learning analytics cluster. I'm really looking forward to being involved in this work as it moves forward over the next few months.

The Magna Charta Universitatum – looking back at where universities c/should be

The Magna Charta Universitatum may sound a bit Harry Potterish, but it is an actual thing. I was directed to it by an article from Stefan Collini in The Guardian last week on university integrity.

The charter was developed in 1988 as a blueprint for the future of European universities.

It contains principles of academic freedom and institutional autonomy as a guideline for good governance and self-understanding of universities in the future.

Academic freedom is the foundation for the independent search for truth and a barrier against undue intervention by both government and interest groups.

Institutional autonomy is a prerequisite for the effective and efficient operations of modern universities.

It also underlies the unique constellation of study, teaching and research, as represented by the European university for the last millennium, and must be further developed without abandoning these universal principles.

The universities now refer to this text as the standard of their belonging to an international community sharing the same academic values and purposes.

As Collini highlighted, it was signed by the education ministers of 29 countries. 33 UK universities, including my own, also signed the charter. I wonder how many current university Vice Chancellors still try to adhere to these principles or, indeed, are even aware of its existence – let alone those working in government departments responsible for Higher Education.

At the recent OER18 conference, Keith Smyth (UHI) and I presented a summary of our ongoing work and forthcoming book around the concept of the digital university. One of the key questions we are exploring in the book, based again on work from Collini, is to try and help define an understanding of what (and who) a university is for in our digital age. It seems to us that currently, in the UK at least, universities' and governments' understandings and expectations of universities are moving further and further away from those detailed in the charter. As Brexit looms ever closer, perhaps it's time we all took a closer look at it.