Early this week I attended the second meeting of the QAA Scotland Enhancement Theme learning analytics cluster, hosted by Ainsley Hainey, Brian Green and Helen Gough (University of Strathclyde). The enhancement themes are a key distinguishing aspect of Scottish HE in relation to quality measures around learning, teaching and sectoral sharing of practice. Staff and students actively collaborate across the sector during the three-year theme life-cycles, and the themes are a really important part of university life.
The current theme is Evidence for Enhancement: Improving the Student Experience. Learning analytics is one of three community clusters, the others being the creative disciplines and employability, and distance learning. From the names of the clusters alone you can start to get an idea of how broadly the theme is investigating the notion of evidence for enhancement. This is not just a numbers exercise!
Unfortunately I missed the first cluster workshop earlier this year; however, the outputs from it formed the basis for the discussions on how to progress the work of the cluster, the most valuable outputs relating to which areas to focus on and how to start scoping out areas of work for student interns.
Quite a lot of the discussions yesterday came back to the need for appropriate definitions and clarity of scope for any learning analytics initiative. This was something we were certainly very aware of when we developed our learning analytics policy, and we used the SoLAR definition:
the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs
Anne Marie Scott, from the University of Edinburgh, also reminded us of the principles that Edinburgh has developed, along with their data governance structures. Very useful stuff indeed. It’s also worth looking at Anne Marie’s recent blog post on student dashboards.
Whilst there is an undeniable interest in learning analytics across the Scottish sector, uptake and progress are still patchy, with many of us still at the very early stages of work. The Enhancement Themes approach does give us a very distinctive, and in many ways a far richer, approach to quality measures in relation to learning and teaching. It looks like this will continue in terms of the outputs of the learning analytics community cluster. I’m really looking forward to being involved in this work as it moves forward over the next few months.
#cetis14 was a slightly different experience for me this year, as for the first time I attended as a delegate, not one of the organisers, and as someone who wanted to find out what I should be looking towards in terms of innovation and key trends. As the conference theme was “Building the digital institution”, I was particularly looking forward to insights on that area and how they fitted with my own thinking.
Last week we hosted one of the Jisc Digital Student consultation events, and in her summary of findings so far, Helen Beetham highlighted the importance of “space and place” to students at university. During his opening talk, Paul Hollins showed a video his 12 year old son had made about his vision of a digital institution, and I was struck by how important “place” (even if it was virtual) was to him too. His simulation centred on distinct buildings, not that dissimilar to many current university campuses (minus the holo-decks). However it was a different kind of space that has left me feeling a bit bewildered about innovation and future developments.
Giving a Cetis keynote can be a bit of a challenge. It can be a “tough crowd”, so Phil Richards (CIO, Jisc), who gave the first keynote, had his work cut out for him. Jisc has been “evolving” and restructuring for a couple of years now. Part of that restructuring has seen Cetis evolve too, from a fully funded Jisc Innovation Support Centre to a self-funding centre. I was looking forward to hearing what Jisc will be doing and how it will be working with the sector. However, I am still a bit confused.
I know that the “old” funding mechanisms at Jisc weren’t perfect, but I’m not sure that, almost 3 years later, Jisc still need to be referencing the Wilson report so heavily to justify changes. The “million seeds left to flower” analogy was used, and again I agree that in the past some Jisc funded projects were much more successful than others, and some, despite lots of funding, did wither and die. Now it seems Jisc have been thinning the trees, and are concentrating on maintaining a more manageable forest. There will be a nursery, but just now there will be four large seeds (research, analytics, student information systems, and digital leadership). These have been decided through a process of co-design with key stakeholders. Through this process over the next three years Jisc will be developing its Product Catalogue, and then we (the sector) can sign up to the new subscription model. Well, I think that’s what he said… I’m still a bit confused about how the co-design process is extended to the sector, or how, for example, I could tell my PVC of Learning and Teaching how we can become involved in the process. I really like the idea of co-design; the most successful Jisc projects have always had that element in them. I’m just still unclear how it will actually work in the “new” Jisc context.
In terms of innovation, it was useful to revisit the innovation, service, commodity cycle, but again I was left feeling that if all services eventually become commodities, then what is the value proposition of developing shared services just now if someone else will be able to provide them more cheaply than we as a sector can… Again, I agree that light-touch specifications like LTI are really good, but they need to be nurtured too – and someone usually pays for that.
This is my visual note of the session (apparently there were unicorns somewhere, but I missed them, and they are quite hard to draw).
I also went to the Developing Learning Analytics Strategy for HEIs session. Again, more head scratching. I suppose I was just hoping someone would tell me what to do (I know, nothing is that easy!). Actually trying to map out the steps of developing a strategy was useful, even if it did just confirm what a huge job that is. Being pragmatic, I need a quick win to get Senior Management buy-in, and then we can start thinking about strategies.
So once again I am thinking about time. Jisc seem to be spending lots of time wandering around their forest, but where are the entry paths/signposts for the sector? When will they open the gates? What will be in their product catalogue? How much will it cost? Where do I look for the fledgling seeds of innovation? Will I have time to wait for the new seeds in the nursery to flower? I was sorry to miss day 2 of the conference and Audrey Watters’ keynote, but I’ll catch up on that via Twitter now.
Thanks to everyone at Cetis for organising the conference and bringing such an interesting and inspiring group of people together. My final thought – are Cetis conferences now our equivalent of pop-up innovation centres?
Data, data everywhere, but what do we actually do with it? Do we need “big” data in education? What is it we are trying to find out? What is our ROI at both institutional and national levels? Just some of the questions that were raised at the Analytics and Institutional Capabilities session at #cetis13 last week.
Is data our new oil? asked Martin Hawksey in his introduction to the session. And if, as many seem to think, it is, do we really have the capabilities to “refine” it properly? How can we ensure that we aren’t putting the equivalent of petrol into a diesel engine? How can we ensure that institutions (and individuals) don’t end up getting trapped in a dangerous slick of data? Are we ensuring that everyone (staff and students) is developing the data literacy skills they need to use, and ultimately understand, the visualisations we can produce from data?
Ranjit Sidhu (Statistics into Decisions) gave an equally inspiring and terrifying presentation around the hype of big data. He pointed out that in education “local data”, not “big data”, is really where we should be focusing our attention, particularly in relation to our core business of attracting students. In relation to national level data, he also questioned the ROI on some “quite big” national data collection activities such as the KIS. From the embarrassingly low figures he showed us for traffic to the UniStats site, it would appear the return isn’t there. We may have caused a mini spike in the hits for one day in March 🙂
However, there are people who are starting to ask the right questions and use their data in ways that are meaningful. A series of lightning talks highlighted a cross-section of approaches to using institutional data. This was followed by three inspiring talks from Jean Mutton (University of Derby), Mark Stubbs (MMU) and Simon Buckingham Shum (OU). Jean outlined the work she and her team have been doing at Derby on enhancing the student experience (more information on this is available through our new case study); Mark then gave a review of the work they have been doing around deeper exploration of NSS returns data and their VLE data. Both Jean and Mark commented that their work started without them actually realising they were “doing analytics”. Mark’s analytics cycle diagram was really useful in illustrating their approach.
Simon, on the other hand, of course very much knew that he was “doing analytics”, and gave an overview of some of the learning analytics work currently being undertaken at the OU, including a quick look at some areas where FutureLearn could potentially be heading.
Throughout all the presentations the key motivator has been, and continues to be, framing and then developing the “right” questions to get the most out of data collection activity and analysis.
More information, including links to the slides from the presentations, is available on the CETIS website.
Our first set of papers around analytics in education has been published, and with nearly 17,000 downloads, it would seem that there is an appetite for resources around this topic. We are now moving onto the next phase of our exploration of analytics, and accompanying this will be a range of outputs, including some more briefing papers and case studies. Volume 1 took a high-level view of the domain; volume 2 will take a much more user-centred view, including a number of short case studies sharing the experiences of a range of early adopters who are exploring the potential of taking a more analytics-based approach.
The first case study features Jean Mutton, Student Experience Project Manager at the University of Derby. Jean shares with us how her journey into the world of analytics started, and how and where she and the colleagues she has been working with across the university see the potential for analytics to have an impact on improving the student experience.
We have a number of other case studies identified which we’ll be publishing over the coming months; however, we are always looking for more examples. So if you are working with analytics and have some time to chat with us, we’d love to hear from you and share your experiences in this way too. Just leave a comment or email me (email@example.com).
Following on from yesterday’s post, another “thought bomb” that has been running around my brain is something far closer to the core of Audrey’s “who owns your educational data?” presentation. Audrey was advocating the need for student-owned personal data lockers (see screenshot below). This idea also chimes with the work of the Tin Can API project, and closer to home in the UK the MiData project. The latter is more concerned with generic data around utility and mobile phone usage than with educational data, but the data locker concept is key there too.
As you will know dear reader, I have turned into something of a MOOC-aholic of late. I am becoming increasingly interested in how I can make sense of my data, network connections in and across the courses I’m participating in and, of course, how I can access and use the data I’m creating in and across these “open” courses.
I’m currently not a very active member of the LAK13 learning analytics MOOC, but the first activity for the course is, I hope, going to help me frame some of the issues I’ve been thinking about in relation to my educational data and, in turn, my personal learning analytics.
Using the framework for the first assignment/task for LAK13, this is what I am going to try and do.
1. What do you want to do/understand better/solve?
I want to compare what data about my learning activity I can access across 3 different MOOC courses and the online spaces I have interacted in on each, and see if I can identify any potentially meaningful patterns or networks which would help me reflect on, and better understand, my learning experiences. I also want to explore how/if learning analytics approaches could help me in terms of contributing to my personal learning environment (PLE) in relation to MOOCs, and whether it is possible to illustrate the different “success” measures from each course provider in a coherent way.
2. Defining the context: what is it that you want to solve or do? Who are the people that are involved? What are social implications? Cultural?
I want to see how/if I can aggregate my data from several MOOCs in a coherent open space and see what learning analytics approaches can be of help to a learner in terms of contextualising their educational experiences across a range of platforms.
This is mainly an experiment using myself and my data. I’m hoping that it might start to raise issues from the learner’s perspective which could have implications for course design, access to data, and thoughts around student created and owned eportfolios/and or data lockers.
3. Brainstorm ideas/challenges around your problem/opportunity. How could you solve it? What are the most important variables?
I’ve already done some initial brainstorming around using SNA techniques to visualise networks and connections in the Cloudworks site which the OLDS MOOC uses. Tony Hirst has (as ever) pointed the way to some further exploration, and I’ll be following up on Martin Hawksey’s recent post about discussion group data collection.
I’m not entirely sure about the most important variables just now, but one challenge I see is actually finding myself/my data in a potentially huge data set and finding useful ways to contextualise me using those data sets.
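One way of tackling that “finding myself in a huge data set” challenge is to pull out an ego network: keep only the people within one step of yourself in the interaction graph and the links among them. This is a minimal sketch in plain Python, assuming the interactions have already been exported as (source, target) pairs – the usernames and the edge list here are entirely made up for illustration:

```python
from collections import defaultdict

def ego_network(edges, me):
    """Return my direct neighbours, plus the edges whose
    endpoints both fall within my immediate neighbourhood."""
    adjacency = defaultdict(set)
    for a, b in edges:
        adjacency[a].add(b)
        adjacency[b].add(a)
    neighbours = adjacency[me]
    keep = neighbours | {me}
    # discard any edge touching someone outside my neighbourhood
    sub_edges = [(a, b) for a, b in edges if a in keep and b in keep]
    return neighbours, sub_edges

# toy interaction data standing in for a MOOC forum export
edges = [("me", "alice"), ("me", "bob"), ("alice", "bob"),
         ("carol", "dave"), ("dave", "erin")]
neighbours, sub_edges = ego_network(edges, "me")
print(sorted(neighbours))  # → ['alice', 'bob']
```

Even on a very large edge list this reduces the data to a personally meaningful slice before any visualisation is attempted.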
4. Explore potential data sources. Will you have problems accessing the data? What is the shape of the data (reasonably clean? or a mess of log files that span different systems and will require time and effort to clean/integrate?) Will the data be sufficient in scope to address the problem/opportunity that you are investigating?
The main issue I see just now is going to be collecting data, but I believe there is some data that I can access about each MOOC. The MOOCs I have in mind are primarily #edc (Coursera) and #oldsmooc (OU). One seems to be far more open in terms of potential data access points than the other.
There will be some cleaning of data required but I’m hoping I can “stand on the shoulders of giants” and re-use some google spreadsheet goodness from Martin.
I’m fairly confident that there will be enough data for me to at least understand the challenges around letting learners try to make more sense of their own data.
5. Consider the aspects of the problem/opportunity that are beyond the scope of analytics. How will your analytics model respond to these analytics blind spots?
This project is far wider than just analytics, as it will hopefully help me make some more sense of the potential for analytics to help me, as a learner, make sense of and share my learning experiences in one place that I choose. Already I see Coursera, for example, trying to model my interactions on their courses into a space they have designed – and I don’t really like that.
I’m thinking much more about personal aggregation points/sources than the creation of an actual data locker. However it may be that some existing eportfolio systems could provide the basis for that.
Week 5 in #oldsmooc has been all about prototyping. Now I’ve not quite got to the stage of having a design to prototype so I’ve gone back to some of my earlier thoughts around the potential for Cloudworks to be more useful to learners and show alternative views of community, content and activities. I really think that Cloudworks has potential as a kind of portfolio/personal working space particularly for MOOCs.
As I’ve already said, Cloudworks doesn’t have a hierarchical structure; it’s been designed to be more social and flexible, so its navigation is somewhat tricky, particularly if you are using it over a longer time frame than, say, a one or two day workshop. It relies on you as a user to tag and favourite clouds and cloudscapes, but even then, when you’re involved in something like a MOOC, that doesn’t really help you navigate your way around the site. However Cloudworks does have an open API and, as I’ve demonstrated, you can relatively easily produce a mind map view of your clouds which makes it a bit easier to see your “stuff”. And Tony Hirst has shown how, using the API, you can start to use visualisation techniques to show network views of various kinds.
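That mind map view is, at heart, just a walk over the nested cloudscape→cloud structure with indentation. A rough sketch, assuming the API response has already been fetched and parsed into a dict – the field names ("title", "clouds") and the sample data are illustrative placeholders, not the real Cloudworks API schema:

```python
def mindmap(node, depth=0):
    """Render a nested cloudscape/cloud structure as an indented outline."""
    lines = ["  " * depth + "- " + node["title"]]
    for child in node.get("clouds", []):
        lines.extend(mindmap(child, depth + 1))
    return lines

# illustrative data in the rough shape of a parsed API response
profile = {"title": "My cloudscapes", "clouds": [
    {"title": "#oldsmooc week 5", "clouds": [
        {"title": "Prototyping with Balsamiq"}]},
    {"title": "Learning analytics notes"}]}

print("\n".join(mindmap(profile)))
```

The same traversal could just as easily feed a graph layout library instead of a text outline; the hard part is the API plumbing, not the rendering.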
In a previous post I created a very rough sketch of how some of Tony’s ideas could be incorporated in to a user’s profile page.
As part of the prototyping activity I decided to think a bit more about this, and used Balsamiq (one of the tools recommended to us this week) to rough out some ideas in a bit more detail.
The main ideas I had were around redesigning the profile page so it was a bit more useful. Notifications would be really useful so you could clearly see if anything had been added to any of your clouds or clouds you follow – a bit like Facebook. Also one thing that does annoy me is the order of the list of my clouds and cloudscapes – it’s alphabetical. But what I really want at the top of the list is either my most recently created or most active cloud.
In the screenshot below you can see I have an extra click and scroll to get to my most recent cloud via the clouds list. What I tend to do is a bit of circumnavigation via my oldsmooc cloudscape, and hope I have added my clouds to it.
I think the profile page could be redesigned to make better use of the space (perhaps lose the cloud stream, because I’m not sure it is really useful as it stands), and have some more useful/usable views of my activity. The three main areas I thought we could start grouping are clouds, cloudscapes (which are already included) and a community dimension, so you can start to see who you are connecting with.
My first attempt:
But on reflection – tabs are not a great idea, and to be honest they were in the tutorial, so that’s probably why I used them 🙂
But then I had another go and came up with something slightly different. Here is a video where I explain my thinking a bit more.
and you can see more comments in my cloud for the week as well as take 1 of the video.
This all needs a bit more thought – particularly around what is actually feasible in terms of performance and creating “live” visualisations, and indeed about what would actually be most useful. I’ve already been in conversation with Juliette Culver, the original developer of Cloudworks, about some of the more straightforward potential changes, like the re-ordering of cloud lists. I do think that with a bit more development along these lines Cloudworks could become a very important part of a personal learning environment/portfolio.
Following on from last week’s post on the #edcmooc, the course itself has turned to explore the notion of MOOCs in the context of utopian/dystopian views of technology and education. The questions I raised in that post are still running through my mind; however, they were at a much more holistic, rather than personal, level.
This week, I’ve been really trying to think about things from my student (or learner) point of view. Are MOOCs really changing the way I engage with formal education systems? On the one hand yes, as they are allowing me (and thousands of others) to get a taste of courses from well established institutions. At a very surface level, who doesn’t want to say they’ve studied at MIT/Stanford/Edinburgh? As I said last week, there’s no fee, so less pressure in one sense to explore new areas, and if they don’t suit you there’s no issue in dropping out – well, not for the student at this stage anyway. Perhaps in the future, through various analytical methods, serial drop-outs will be recognised by “the system” and not allowed to join courses, or will have to start paying to be allowed in.
But on the other hand, is what I’m actually doing really different from what I did at school, when I was an undergraduate, or when I was a student on “traditional” online distance courses? Well no, not really. I’m reading selected papers and articles, watching videos, contributing to discussion forums – nothing I’ve not done before, or presented to me in a way that I’ve not seen before. The “go to class” button on the Coursera site does make me giggle tho’, as it’s just soo American, and every time I see it I hear a disembodied American voice. But I digress.
The element of peer review for the final assignment for #edcmooc is something I’ve not done as a student, but it’s not a new concept to me. Despite more information on the site and from the team this week, I’m still not sure how this will actually work, and whether I’ll get my certificate of completion just for posting something online, or if there is a minimum number of reviews I need to get. Like many other fellow students, the final assessment is something we have been concerned about from day 1, which seemed to come as a surprise to some of the course team. During the end-of-week-1 Google hangout, the team did try to reassure people, but surely they must have expected that we were going to look at week 5 and the “final assessment” almost before anything else? Students are very pragmatic: if there’s an assessment, we want to know the where, when, what, why, who and how as soon as possible. That’s how we’ve been trained (and I use that word very deliberately). Like thousands of others, my whole education career from primary school onwards centred around final grades and exams – so I want to know as much as I can, so I know what to do to pass and get that certificate.
That overriding response to any kind of assessment can very easily eclipse all of the other softer (but just as worthy) reasons for participation, and the potential of social media to connect and share on an unprecedented level.
As I’ve been reading and watching more dystopian than utopian material, and observing the general MOOC debate taking another turn with the pulling of the Georgia Tech course, I’ve been thinking a lot about the whole experimental nature of MOOCs. We are all just part of a huge experiment just now, students and course teams alike. But we’re not putting very many new elements into the mix, and our pre-determined behaviours are driving our activity. We are, in a sense, all just ghosts in the machine. When we do try to do something different, participation can drop dramatically. I know that I, and lots of my fellow students on #oldsmooc, have struggled to actually complete project-based activities.
The community element of MOOCs can be fascinating, and the use of social network analysis can help to give some insights into activity, patterns of behaviour and connections. But with so many people on a course, is it really possible to make and sustain meaningful connections? From a selfish point of view, having my blog picked up by the #edcmooc news feed has greatly increased my readership and, more importantly, I’m getting comments, which is more meaningful to me than hits. I’ve tried to read other posts too, but in the first week it was really difficult to keep up, so I’ve fallen back on a very pragmatic, reciprocal approach. With so much going on you need strategies to cope, and there is quite a bit of activity around developing a MOOC survival kit, which has come from fellow students.
As the course develops the initial euphoria and social web activity may well be slowing down. Looking at the twitter activity it does look like it is on a downwards trend.
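Spotting that downward trend needs nothing fancier than bucketing tweet timestamps by day. A minimal stdlib sketch, assuming the hashtag archive has already been exported as a list of ISO-format timestamps (the dates below are invented for illustration):

```python
from collections import Counter
from datetime import datetime

def tweets_per_day(timestamps):
    """Count tweets per calendar day from ISO-format timestamps."""
    days = Counter(datetime.fromisoformat(ts).date().isoformat()
                   for ts in timestamps)
    return dict(sorted(days.items()))

# illustrative slice of a hashtag archive
archive = ["2013-02-04T09:15:00", "2013-02-04T11:02:00",
           "2013-02-05T10:30:00", "2013-02-06T08:00:00"]
print(tweets_per_day(archive))
# → {'2013-02-04': 2, '2013-02-05': 1, '2013-02-06': 1}
```

Plotting those daily counts over the weeks of the course is what makes the tail-off visible at a glance.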
Monitoring this level of activity is still a challenge for the course team and students alike. This morning my colleague Martin Hawksey and I were talking about this, and speculating that maybe there are valuable lessons we in the education sector can learn from the commercial sector about managing “massive” online campaigns. Martin has also done a huge amount of work aggregating data, and I’d recommend looking at his blogs. This post is a good starting point.
Listening to the Google hangout session run by the #edcmooc team, they again seemed to have underestimated the time-sink reality of having 41,000 students on a course. Despite being upfront about not being everywhere, the temptation to look must be overwhelming. This was also echoed in the first couple of weeks of #oldsmooc. Interestingly, this week there are teaching assistants and students from the MSc course actively involved in the #edcmooc.
I’ve also been having a play with the data from the Facebook group. I’ve had a bit of interaction there, but not a lot. So despite it being a huge group, I don’t get the impression that, apart from people posting links to blogs for the newsfeed, there is a lot of activity or connection. This seems to be reflected in the graphs created from the data.
This is a view based on friends connections. NB it was very difficult for a data novice like me to get any meaningful view of this group, but I hope that this gives the impression of the massive number of people and relative lack of connections.
There are a few more connections which can be drawn from the interactions data, and my colleague David Sherlock managed to create a view where some clusters are emerging – but with such a huge group it is difficult to read much into the visualisation, apart from the fact that there are lots of nodes (people).
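The “lots of nodes, few connections” impression can actually be checked rather than eyeballed: counting connected components shows how far a sparsely linked group splinters into isolated islands. A plain-Python sketch over a made-up friends edge list (names and edges are invented, and real group exports would of course be far larger):

```python
from collections import defaultdict

def connected_components(nodes, edges):
    """Group nodes into connected components via graph traversal."""
    adjacency = defaultdict(set)
    for a, b in edges:
        adjacency[a].add(b)
        adjacency[b].add(a)
    seen, components = set(), []
    for start in nodes:
        if start in seen:
            continue
        queue, comp = [start], set()
        while queue:
            node = queue.pop()
            if node in comp:
                continue
            comp.add(node)
            queue.extend(adjacency[node] - comp)
        seen |= comp
        components.append(comp)
    return components

# illustrative group: five members, only two of whom are connected
nodes = ["ann", "ben", "cat", "dan", "eve"]
edges = [("ann", "ben")]
parts = connected_components(nodes, edges)
print(len(parts))  # → 4: one pair plus three isolated members
```

A high component count relative to group size is exactly the “massive membership, minimal connection” pattern the Facebook graph suggested.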
I don’t think any of this is unique to #edcmooc. We’re all just learning how to design/run and participate at this level. Technology is allowing us to connect and share at a scale unimaginable even 10 years ago, if we have access to it. NB there was a very interesting comment on my blog about us all being digital slaves.
Despite the potential affordances of access at scale, it seems to me we are increasingly just perpetuating an existing system if we don’t take more time to understand the context and consequences of our online connections and communities. I don’t need to connect with 40,000 people, but I do want to understand more about how and why I could/do. That would be a really new element to add to any course, not just MOOCs (and not something that’s just left to a course specifically about analytics). Unless that happens, my primary driver will be that “completion certificate”. In this instance, and many others, to get that I don’t really need to make use of the course community. So I’m just perpetuating an existing system where I know how to play the game, even if its appearance is somewhat disguised.
Well I have survived week 1 of #oldsmooc and collected my first online badge for doing so -#awesome. My last post ended with a few musings about networks and visualisation.
I’m also now wondering if a network diagram of a cloudscape (showing the interconnectedness between clouds, cloudscapes and people) would be helpful? Both in terms of visualising and conceptualising networks, but also in starting to make more explicit the links between people, activities and networks. Maybe the mindmap view is too linear? Think I need to speak to @psychemedia and @mhawksey . . .
I’ve been really pleased that Tony Hirst has taken up my musings and has been creating some wonderful visualisations of clouds, cloudscapes and followers. So I should preface this post by saying that this has somewhat distracted me from the main course activities over the past few days. However I want to use this post to share some of my thoughts on these experiments in the context of my learning journey, and the potential for Cloudworks to help me (and others) contextualise our learning, activities and networks, and become a powerful personal learning space/environment.
Cloudworks seems to be a bit like Marmite – you either love it or hate it. I have to admit I have a bit of a soft spot for it, mainly because I have had a professional interest in its development. (I also prefer Vegemite, but am partial to Marmite now and again.) I’ve also used it before this course and have seen how it can be useful. In some ways it’s kind of like Twitter: you have to use it to see the point of using it. I’ve also fully encouraged the development of its API and its open source version, Cloud Engine.
A short bit of context might be useful here too. Cloudworks was originally envisaged as a kind of “flickr for learning designs” – a social repository, if you like. However as it developed and was used, it actually evolved more into an aggregation space for ideas, meetings and conferences. The social element has always been central. Of course, making something social, with tagging, favouriting etc., does mean that navigation isn’t traditional and is more “exploratory” for the user. This is the first time (that I know of, anyway) it has actually been used as part of a “formal” course.
As part of #oldsmooc, we (the learners) are being encouraged to use Cloudworks for sharing our learning and activities. As I do more on the course, I’m creating clouds, adding them to my own #oldsmooc and other cloudscapes, and increasingly favouriting and following others’ clouds/cloudscapes. I’m starting to find that the concept of having one place where my activity is logged, and where I am able to link to other spaces where I create content (such as this blog), is becoming increasingly attractive. I can see how it could really help me get a sense of my learning journey as I progress through the course, and of the things that are useful/of interest to me. In other words, it’s showing potential to be my personal aggregation point, and a very useful (if not key) part of my personal learning environment. But the UI as it stands is still a bit clunky. Which is where the whole visualisation thing started.
Now that Tony has illustrated how it is possible to visualise the connections between people, content and activities, what I think would be really useful is the incorporation of these visualisations into a newly designed profile page. Nick Frear has already done an alpha test to show they can be embedded into Cloudworks.
Excuse the very crude graphic cut and paste but I hope you get the idea. There’s lots of space there to move things around and make it much more user friendly and useful.
Ideally, when I (or any other user) logged into our profile page, our favourite spaces and people could easily be seen, and we could have various options to see and explore other network views of people and our content and activities. Could these network views start to give learners a sense of Dave Cormier’s rhizomatic learning, and potentially a greater level of control and confidence in exploring the chaotic space which any MOOC creates?
The social “stuff” and connections are all there in Cloudworks; it just needs a bit of re-jigging. If the UI could be redesigned to incorporate these ideas, then I for one would be very tempted to use Cloudworks for any other (c)MOOC I signed up for. I also need to think a lot more about how to articulate this more clearly and succinctly, but I’d be really interested in other views.
Late last month the Larnaca Declaration on Learning Design was published. Being “that time of year” I didn’t get round to blogging about it at the time. However as it’s the new year and as the OLDS mooc is starting this week, I thought it would be timely to have a quick review of the declaration.
The wordle gives a flavour of the emphasis of the text.
First off, it's actually more of a descriptive paper on the development of research into learning design than a set of statements declaring intent or a call for action. As such, it is quite a substantial document. Setting the context and sharing the outcomes of over ten years' worth of research is very useful, and for anyone interested in this area I would say it is definitely worth taking the time to read. Even for an “old hand” like me it was useful to recap some of the background and core concepts. It states:
“This paper describes how ongoing work to develop a descriptive language for teaching and learning activities (often including the use of technology) is changing the way educators think about planning and facilitating educational activities. The ultimate goal of Learning Design is to convey great teaching ideas among educators in order to improve student learning.”
One of my main areas of involvement with learning design has been around interoperability and the sharing of designs. Although the IMS Learning Design specification offered great promise of technical interoperability, there were a number of barriers to implementing the full potential of the specification, and indeed expectations of what the spec actually did were somewhat over-inflated – something I reflected on way back in 2009. However, the sharing of design practice, and of designs themselves, has developed, and this is something we at CETIS have tried to promote and move forward through our work in the JISC Design for Learning Programme (in particular with our mapping of designs report), the JISC Curriculum Design and Delivery Programmes, and our Design Bashes: 2009, 2010, 2011. I was very pleased to see the Design Bashes included in the timeline of developments in the paper.
James Dalziel and the LAMS team have continually shown how designs can be easily built, run, shared and adapted. However, having one language or notation system is still a goal in the field. During the past few years, though, much of the work has concentrated on understanding the design process and on helping teachers find effective tools (online and offline) to develop newer approaches to teaching practice and share those with the wider community. Viewpoints, LDSE and the OULDI projects are all good examples of this work.
The declaration uses the analogy of the development of musical notation to explain the need for, and aspirations of, a design language which can be used to share and reproduce ideas, or in this case lessons. Whilst still a conceptual idea, this is maybe one of the closest analogies with universal understanding. Developing such a notation system is still a challenge, as the paper highlights.
The declaration also introduces a Learning Design Conceptual Map which tries to “capture the broader education landscape and how it relates to the core concepts of Learning Design”.
These concepts include pedagogic neutrality, pedagogic approaches/theories and methodologies, the teaching lifecycle, granularity of designs, guidance and sharing. The paper puts forward these core concepts as providing the foundations of a framework for learning design which, combined with the conceptual map and actual practice, provides a “new synthesis for the field of learning design” and future developments.
So what next? The link between learning analytics and learning design was highlighted at the recent UK SoLAR Flare meeting. Will having more data about interactions and networks help develop design processes and ultimately improve the learning experience for students? And what about the link with OERs? Content always needs context, and using OERs effectively intrinsically means having effective learning designs, so maybe now is a good time for the OER community to engage more with the learning design community.
The Declaration is a very useful summary of where the Learning Design community is to date, but what is always needed is more time for practising teachers to engage with these ideas, so that they can start working with the research community and the tools and methodologies it has been developing. The Declaration alone cannot do this, but it might act as a stimulus for existing and future developments. I'd also be up for running another Design Bash if there is enough interest – let me know in the comments if you are interested.
The OLDS MOOC is another great opportunity for future development too, and I'm looking forward to engaging with it over the next few weeks.
This is my final post on my experiences of the #moocmooc course that ran last week, and I want to share a few reflections on the role of analytics (in this case, learning analytics), primarily from my experience as a learner on the course. I should point out that I have no idea about the role of analytics from the course team's point of view, but I am presuming that they have the baseline basics of enrolment numbers and login stats from the Canvas LMS. In this instance, though, there were no obvious learner analytics available from the system. So, as a learner in such an open course, where you interact in a number of online spaces, how do you get a sense of your own engagement and participation?
There are some obvious measures, like monitoring your own contributions to discussion forums. But to be honest, do we really have the time to do that? I for one am quite good at ignoring the little nagging voice in my head saying “you haven't posted to the discussion forum today” 🙂 A little automation would probably go a long way there. However, a lot of the actual course activity didn't take place within the “formal” learning environment; instead it happened in other spaces such as Twitter, Storify, Google Docs, YouTube, blogs etc. Apart from being constantly online, my phone bleeping every now and again to notify me of retweets, how did I know what was happening, and how did that help with engagement and motivation?
I am fortunate, thanks mainly to my colleague Martin Hawksey, to have a few analytics tricks that I was able to use, which gave me a bit of insight into my own activity and that of the whole class.
One of the most useful items in Martin's bag of tricks is his hashtag Twitter archive. By using his template, you can create an archive in Google Docs which stores tweets and, through a bit of social network analysis magic, also gives an overview of activity – top tweeters, time analysis etc. It's hard to get the whole sheet into a screen grab, but hopefully the one below gives you an idea. Follow the link and click on the “dashboard” tab to see more details.
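Under the hood, the kind of summary the archive dashboard gives boils down to simple counting over the stored tweets. Here's a minimal sketch in Python of the idea – this is not Martin's actual template (which lives in a Google spreadsheet); the sample tweets and column layout are invented for illustration:

```python
from collections import Counter
from datetime import datetime

# A hypothetical slice of a hashtag archive: (author, timestamp, text).
# A real archive would hold thousands of rows harvested from the Twitter API.
tweets = [
    ("sheilmcn", "2012-08-13 09:15", "Day one of #moocmooc - here we go"),
    ("mhawksey", "2012-08-13 09:40", "@sheilmcn archive is up #moocmooc"),
    ("sheilmcn", "2012-08-13 18:05", "Reflections on today #moocmooc"),
    ("someone",  "2012-08-14 10:30", "RT @mhawksey archive is up #moocmooc"),
]

# Top tweeters: a simple frequency count of authors.
top_tweeters = Counter(author for author, _, _ in tweets).most_common()
print(top_tweeters)

# Time analysis: tweets bucketed by hour of day, to see when the
# conversation peaks.
by_hour = Counter(
    datetime.strptime(ts, "%Y-%m-%d %H:%M").hour for _, ts, _ in tweets
)
print(sorted(by_hour.items()))
```

The spreadsheet dashboard does essentially this at scale, with charts on top.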
From this archive you can also use another of Martin's templates to create a visualisation of the interactions of this #hashtag network.
Which always looks impressive, and does give you a sense of the “massive” part of a MOOC, but it is quite hard to actually make any real sense of ;-)
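For the curious, the network behind that kind of visualisation can be sketched with nothing more than a regular expression: every @-mention becomes a directed edge from the tweet's author to the account mentioned, and the accounts mentioned most often are the hubs you see in the picture. Again, this is a toy illustration with made-up tweets, not the actual template:

```python
import re
from collections import defaultdict

# Hypothetical sample of archived #moocmooc tweets: (author, text).
tweets = [
    ("sheilmcn", "Great post from @mhawksey on the archive #moocmooc"),
    ("mhawksey", "@sheilmcn thanks! searchable version coming #moocmooc"),
    ("someone",  "Trying out the template from @mhawksey #moocmooc"),
]

# Build a directed @-mention graph: author -> set of accounts mentioned.
mention_graph = defaultdict(set)
for author, text in tweets:
    for mention in re.findall(r"@(\w+)", text):
        mention_graph[author].add(mention)

# In-degree (how often an account is mentioned) is a crude measure of
# who sits at the centre of the conversation - the hubs in the diagram.
in_degree = defaultdict(int)
for author, mentions in mention_graph.items():
    for m in mentions:
        in_degree[m] += 1

print(dict(mention_graph))
print(max(in_degree, key=in_degree.get))  # the most-mentioned account
```

Tools like NodeXL or Gephi take exactly this kind of edge list and lay it out as the impressive-looking hairball above.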
However, Martin is not one to rest on his SNA/data-science laurels, and I feel his latest addition, a searchable Twitter archive, is much more useful from a learner's (and indeed an instructor's) perspective.
Again it has time/volume-of-tweets information, this time clearly presented at the top of the sheet. You can search by keyword and/or Twitter handle – a really useful way to find those tweets you forgot to favourite! Again, here is a screenshot just as a taster, but try it out to get the full sense of it.
Also, from an instructor/course-design point of view, both of these templates let you see time patterns emerging, which could be very useful for a number of reasons – not least managing your own time and knowing when to interact to connect with the largest number of learners.
Another related point about timing concerns the use of free services such as Storify. Despite us all being “self-directed and motivated”, it's highly likely that if an assignment is due in at 6pm, then at 5.50 the service is going to be pretty overloaded. This might not be a problem, but it could be, and so it's worth bearing in mind when designing courses and suggesting submission times and guidance for students.
I also made a concerted effort to blog each day about my experiences, and I was able to use another of Martin's templates – social sharing – to track the sharing of my posts on various sites. I don't have a huge blog readership, but I was pleased to see that a few more people were reading my posts. What was really encouraging (as any blogger knows) was the fact that I was getting comments. I know I don't need any software to tell me that, and in terms of engagement and participation, getting comments is really motivating. What is nice about this template is that it stores the comments and the number of other shares (and where they are), allowing you to get more of an idea of where and how your community is sharing resources. I could see my new #moocmooc community engaging with my engagement – warm, cosy feelings all round!
So, through some easy-to-set-up-and-share templates, I've been able to get a bit more insight into my activity, engagement and participation. MOOCs can be overwhelming, chaotic and disconcerting, and can give learners many anxieties about being unconnected in the vast swirl of connectedness. A few analytics can help ease some of those anxieties, or at least provide another set of tools to help make sense of, catch up on, and reflect on what is happening.
For more thoughts on my experiences of the week you can read my other posts.