Getting ready for learning analytics at GCU (not quite #lak14)

This week I’m going to try to keep up with the Twitter back channel from #lak14 in Indianapolis; already it looks like some really interesting and innovative work is being presented. However, back in my world our learning analytics journey is really just beginning.

Over the past couple of weeks I’ve been doing some basic investigation, introductions and exploration of learning analytics, initially with colleagues from IT and the Library. We are very much at the who, where, why, when and how stage. So it’s been really useful to look back at the Cetis Analytics Series and also at the presentations from the UK SoLAR Flare events. As ever, the generosity of the community in sharing experiences is invaluable. This presentation from Mark Stubbs at MMU helped to clarify a few things for our IT department in terms of the data sources we need alongside data from the VLE. This slide was particularly useful.

[Slide from Mark Stubbs’ MMU presentation on data sources]

BTW, we need another one of those SoLAR Flare events soon . . .

However, we do have access to some data, particularly from our VLE, GCU Learn. Every year we produce a Blended Learning report which gives a snapshot overview of activity in GCU Learn across the University. Getting and cleansing the data is always a bit of a chore, and we are aware that we can only provide a superficial view of activity. I won’t go into the ins and outs of our data access and data gate-keeping issues, but I suspect that you, dear reader, will understand some of our “challenges”.

In broad visual terms we have broken our blended learning activity into four main areas (click on the image to see it in more detail; btw the tools/activities are just samples, not a definitive list for each area).

Blended Learning areas of activity at GCU

We can get data at school level (we have three large academic schools) but not at department or module level. Given the dates of our semesters, annual stats are not much use either, as they include weeks when there is no teaching, which skews the data. This year we decided to take one month, November 2013, and base the report on that. So although what we have is a very high-level overview, there are some clear trends coming through. To quote the Cetis definition of analytics, these trends are indeed giving us some ‘actionable insights’, not only in terms of blended learning activity but also in terms of our wider IT and support provision.
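To make that concrete, here’s a minimal sketch of the kind of filtering and counting involved, assuming access events can be exported from GCU Learn as a CSV; the file name and column names are purely illustrative, not our actual export:

```python
# Hypothetical sketch: restrict VLE access events to the snapshot month
# (November 2013) and count them per school. The file name and columns
# (timestamp, school) are illustrative, not GCU Learn's real export.
import csv
from collections import Counter
from datetime import datetime

counts = Counter()
with open("gculearn_access_log.csv", newline="") as f:
    for row in csv.DictReader(f):
        ts = datetime.strptime(row["timestamp"], "%Y-%m-%d %H:%M:%S")
        if ts.year == 2013 and ts.month == 11:  # teaching-month snapshot only
            counts[row["school"]] += 1

for school, n in counts.most_common():
    print(f"{school}: {n} accesses in Nov 2013")
```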

So get ready, here are our headline figures (a rough sketch of how the year-on-year comparisons are worked out follows the list):

• 18% decrease in average student accesses to GCU Learn via the web
• 420% increase in average student accesses to GCU Learn via the mobile app
• 25% increase in the number of GCU Learn Communities
• 82% increase in use of CampusPack blogs
• 134% increase in use of wikis
• 232% increase in use of journals
• 222% increase in online feedback via GradeMark in Nov 13 compared to Nov 12
• 167% increase in online graded papers in Nov 13 compared to Nov 12
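For anyone wondering how those comparisons work, here’s a rough sketch of the arithmetic behind figures like the last two bullets; the counts below are invented purely to illustrate the calculation, not our actual data:

```python
# Hypothetical sketch: the year-on-year figures are simple percentage changes
# between November 2012 and November 2013 counts. The numbers below are made
# up to show the calculation only.
def percent_change(old, new):
    """Change from old to new, expressed as a percentage of old."""
    return (new - old) / old * 100

nov_2012 = {"GradeMark feedback": 90, "Graded papers": 60}
nov_2013 = {"GradeMark feedback": 290, "Graded papers": 160}

for tool in nov_2012:
    print(f"{tool}: {percent_change(nov_2012[tool], nov_2013[tool]):+.0f}%")
```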

We don’t have a mobile or BYOD strategy, and it looks like we might not need one. It’s happening: our users are talking with their mobile devices, and 80% of those devices are iOS. What we need to ensure is that our content is web enabled and that students can interact fully with activities via mobile devices. A “switch on” policy and, probably more importantly, culture for learning and teaching is something we need to work with staff and students to develop. Ubiquitous and stable wifi across the institution is key to this. Improvements to Bb’s mobile app would help too, and we can’t wait for the roll-out of their new web-enabled design to be in place.

Staff and students are using the more interactive and student-centred functionality of the VLE, such as wikis and journals, and use of the assessment and feedback functionality is increasing dramatically. We estimate that 41% of our modules are making active use of GCU Learn, as opposed to just having a course shell and some PowerPoint slides. Now we need to drill down into that school-level data to get more module-level detail on the types of assignments/activities being used, and in tandem develop staff confidence in using, developing and sharing assessment rubrics and their overarching learning designs.
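As a rough illustration, here’s one way an “active use” estimate like that could be made, assuming per-module tool counts can be pulled from the VLE; the field names, sample data and rule are assumptions on my part rather than GCU Learn’s actual schema:

```python
# Hypothetical sketch: flag a module as making "active use" of the VLE rather
# than just holding a content shell. Field names, sample data and the rule
# itself are illustrative assumptions.
modules = [
    {"code": "MOD101", "blogs": 1, "wikis": 0, "journals": 2, "assignments": 3, "files": 40},
    {"code": "MOD102", "blogs": 0, "wikis": 0, "journals": 0, "assignments": 0, "files": 15},
]

def is_active(m):
    # "Active" here means at least one interactive or assessment tool in use,
    # not just uploaded files and slides.
    return (m["blogs"] + m["wikis"] + m["journals"] + m["assignments"]) > 0

share = 100 * sum(is_active(m) for m in modules) / len(modules)
print(f"{share:.0f}% of modules making active use of the VLE")
```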

We are only starting to scratch the surface of learning analytics in our context, but the data we are getting is leading us to ask more detailed questions and demand more nuanced data collection and sense-making. We are starting to bring people together to have data-driven conversations, and to share just exactly where our data is, who has access to it, when they have access to it, what format it is in, and how they access it. We have had initial discussions with Bb about their analytics package; however, we need to have more internal conversations about what we can and want to do before making any decisions about that. I’m hoping that I’ll be able to share the next part of our journey very soon.
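By way of illustration only, a lightweight register along these lines might be enough to capture the where/who/when/format/how of each source; every value below is a placeholder rather than a description of our actual data landscape:

```python
# Hypothetical sketch of a simple data-source register to frame those
# conversations. Every value is an illustrative placeholder.
data_sources = [
    {
        "source": "GCU Learn (Blackboard) activity data",
        "where": "vendor-hosted reporting database",
        "who": ["IT", "Blended Learning team"],
        "when": "monthly export",
        "format": "CSV",
        "how": "scheduled report",
    },
    {
        "source": "Library e-resource usage",
        "where": "library systems",
        "who": ["Library"],
        "when": "on request",
        "format": "Excel",
        "how": "manual request",
    },
]

for d in data_sources:
    print(f"{d['source']}: {d['format']}, {d['when']}, via {d['how']}")
```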

4 thoughts on “Getting ready for learning analytics at GCU (not quite #lak14)”

  1. One of the implications from the access figures for GCULearn (Bb) is that mobile devices are being used and we need to accept their use and build on them. I know mobile phones with text bleeps and ringtones are a distraction. “Smartphones”, however, enable everyone to access data, either to get information or to give feedback. It looks like the days when the big switch-off was a feature of lectures are numbered. The need for granularity in analytics will identify how students are engaging with their modules, beyond the stimulus-action-response model which notifications and alerts determine.
    If/when we get this kind of data then we can spot where gains can be made for the student during their studies. In this way, we can plan and deliver an approach to learning design that enhances student learning.
