Institutional Readiness for Analytics – practice and policy

So far in our Analytics Series we have been setting out the background, history and context of analytics in education at a fairly broad, high level. Developing policy and getting strategic buy-in are critical for any successful project (analytics-based or not), so we have tried to highlight issues which will be of use to senior management in terms of the broader context and value of analytics approaches.

Simon Buckingham Shum at the OU (a key figure in the world of learning analytics) has also just produced a Learning Analytics Policy Brief for the UNESCO Institute for Information Technologies in Education. Focussing specifically on learning analytics, Simon’s paper highlights a number of key issues around “the limits of computational modelling, the ethics of analytics, and the educational paradigms that learning analytics promote”. It is another welcome addition to the growing literature on learning analytics, and a useful complement to the CETIS series; I would recommend it to anyone interested in this area.

Moving from policy to practicalities is the focus of our next paper, Institutional Readiness for Analytics. Written by Stephen Powell (with a little input from me), this paper drills down from policy-level decisions to the more pragmatic issues faced by staff in institutions who want to start making sense of their data through analytics-based techniques. It presents two short case studies (from the University of Bolton and the Open University) outlining the different approaches each institution has taken to make more sense of the data it has access to, and how that can begin to have an impact on key decisions around teaching, learning and administrative processes.

The OU is probably slightly “ahead of the game” in terms of data collection and provisioning and so their case study focuses more on staff development issues through their Data Wrangler Project, whereas the University of Bolton case study looks more at how they are approaching data provisioning issues. As the paper states, although the two approaches are very different “they should be considered as interrelated with each informing the work of the other in a process of experimentation leading to the development of practices and techniques that meet the needs of the organisation.”

As ever, if you have thoughts or any experiences of using analytics approaches in your institution, we’d love to hear from you in the comments.

The paper is available for download here, and the other papers in the series are available here.

Exploring Digital Futures

One of the most enjoyable parts of the programme support side of my job is that I get to find out about a lot of really innovative work taking place across a diverse range of UK universities. On the flip side, I do sometimes yearn to be part of the development of projects, instead of always being on the outside looking in once plans have been made and funding secured. I also often wonder if anything I write about in my blog actually makes any difference or is useful to the wider community.

So I was delighted yesterday to spend the afternoon at Edinburgh Napier University at an internal seminar exploring their digital future and technological ambitions. I was even more delighted when Keith Smyth contacted me a couple of weeks ago about attending the event, saying that the series of blog posts I wrote with my Strathclyde colleague Bill Johnston on the Digital University had been really useful and timely for Napier as they began to think about how to develop their approach to a digital strategy.

Yesterday’s seminar was an opportunity for staff from across the institution to come together and share their experiences and views on their real needs and aspirations for the future (digital) shape of the university. Napier are already involved in a number of innovative internal projects, and are committed to open practice, particularly in their work in learning technology. For example, their 3E Framework for the effective use of technology in teaching and learning is available under a CC licence and is being used and adapted by over 20 institutions worldwide, all of whom have agreed to share their adaptations. This is a great example of how open practice can not only improve internal working practices but also help community knowledge grow in an open, shareable way. The framework is also linked to a resource bank, with examples of the framework in action, which again is openly available.

As at many institutions, podcasting is a growing trend, and their College2Uni podcasts, originally designed to help students transition from college to university, are now being used for wider community-driven information-sharing initiatives. Plans for an open access journal are also well underway.

But what/where next? What should the long, medium and short term goals for the institution be? Participants were asked to consider “what will today’s ten year olds expect when they come to University in 2020?” Delegates were divided into six groups to set short-term (i.e. achievable within a year) as well as longer-term aspirational goals. The six themes were:

*Developing digital literacies
*Digital equivalence
*Digitally enhanced education
*Digital communication and outreach
*Digital scholarship
*Digital infrastructure and integration

Another wee ego boost was seeing how the matrix Bill and I have developed provided a framework for the discussions and planning of the workshop.

MacNeill/Johnston conceptual matrix (revised, October 2012)

It was also a good opportunity for me to highlight work from a number of JISC programmes including Developing Digital Literacies, Assessment and Feedback, and Curriculum Design and Delivery and the growing number of resources from all these programmes which are available from the Design Studio.

There was genuine enthusiasm from all the delegates, and a number of suggestions for easily achievable short-term goals, including single sign-on for all university accounts, more co-ordinated and easily accessible communication channels (for staff and students), experimenting with the layout of lecture spaces, and developing a more coherent strategy for mobile devices.

Longer-term goals generally centred on ubiquitous access to information; continuous development of staff and student skills, including support for open practices; ways to differentiate Napier; and how to take advantage of the affordances of the all-pervasive MOOCs and, indeed, the changing landscape of HE. Content may be more plentiful in 2020, but not everyone has the skills to take an MIT/Stanford/Everyotherbignameuniversity open course without support, and there are many skills which we know employers are looking for that aren’t supported through these large-scale distance models of education. The need for new spaces (both digital and physical) for experimentation and play, for both staff and students, was highlighted as a key way to support innovation. You can get a flavour of the discussion by searching the #digiednap archive.

The next steps for Napier are the formation of a working group to take forward the most popular ideas from the session, and work on more strategic developments over the coming year. There was a bit of the old “dotmocracy”, with delegates voting for their preferred short-term ideas.

I am really looking forward to working with colleagues at Napier as a critical friend to these developments, and to being part of a project from the outset, seeing first hand how it develops.

UK SoLAR meeting feedback

Last month, in collaboration with colleagues at the OU, we co-hosted the inaugural UK SoLAR Flare. A number of blog posts, pictures and videos of the day are available on the SoLAR website.

This was the first meeting in the UK focusing on learning analytics, and as such we had quite a broad cross-section of attendees. We issued a short survey to get feedback from delegates, and many thanks to all the attendees who completed it. We had 20 responses in total, and you can access the collated results of the survey here.

Overall, 100% of respondents found the day either very useful or useful, which is always a good sign, and bodes well for the beginnings of a new community of practice and future meetings.

The need for staff development and a range of new skills is increasingly being identified as critical for successful analytics projects, and is an underlying theme of our current Analytics Series. The Data Scientist is increasingly being recognised as a key role both in the “real” world and in academia. So what roles did our attendees have? Well, we did have one data scientist, but perhaps not that surprisingly the most common role was that of Learning Technologist, with 5 respondents. The full results were as follows:

*learning technologist: 5
*manager: 3
*lecturer: 3
*developer: 3
*researcher: 3
*data scientist: 1
*other: 2 (the “other” answers were “director/agile manager” and “sort of learning technologist but also training”)

So a fair spread of roles, which again bodes well for the development of teams with the skills needed to deliver successful analytics projects.

We also asked attendees to share the main idea that they took away from the day. Below is a selection of responses.

“That people are in the early stages of discussion.”

“Learning analytics needs to reach out end-users”

“The overall idea was how many people are in the same position and that the field is in a very experimental stage. This improves the motivation to be experimental.”

“more a better understanding of the current status than a particular idea. But if I had to chose one idea it is the importance of engaging students in the process.”

“Early thoughts on how learning analytics could be used in the development of teaching staff.”

“That HE is on the cusp of something very exciting and possibly very enlightening regarding understanding the way students learn. BUT the institution as a whole needs to be commited to the process, and that meaningful analysis of the mass of potential data that is ‘out there’, is going to be critical. There is also the very important issues of ethics and who is going to do what with the data………I could go on, and on, and on…….”

Suggestions for further meetings included:

“It would be great to involve more academic teaching staff and students in future meetings.”

“I think bringing together the different stakeholders (technologists, teachers, students, data scientists, statisticians) is a great feature for this group. It is easy to break into silos and forget the real end-user. Having more students involved would be great.”

“An international project exchange. Have, say, 10 – 15 lightning talks. Then organise a poster session with posters corresponding to the lightning talks. People whose interest was drawn by one project or another will have the chance to follow up on that project for further information. Also maybe an expert panel (with people that have experience with putting learning analytics into educational practice) that can answer questions sent in beforehand by people wanting to set up a learning analytics project/activity. This can also be done Virtually”

“Would really welcome the opportunity to have a ‘hands on’ session possibly focussing upon the various dashboards that are out there.”

You can access the full results at the SoLAR website.

Analytics for Understanding Research

After a bit of exploration of the history, meanings and definitions of analytics from Adam Cooper, today our Analytics Series continues with the Analytics for Understanding Research paper (by Mark Van Harmelen).

Research and research management are key concerns for Higher Education, and indeed the wider economy. The sector needs to ensure it is developing, managing, and sharing research capacity, capabilities, reputation and impact as effectively and efficiently as possible.

The use of analytics platforms has the potential to affect all aspects of research practice: from individual researchers sharing and measuring their performance, to institutional management and planning of research projects, to funders making decisions about which areas to fund.

The “Analytics for Understanding Research” paper focuses on analytics as applied to “the process of research, to research results and to the measurement of research.” The paper highlights exemplar systems, metrics and analytic techniques backed by evidence in academic research, the challenges in using them and future directions for research. It points to the need for the support and development of high quality, timely data for researchers to experiment with in terms of measuring and sharing their reputation and impact, and the wider adoption of platforms which utilise publicly available (and funded) data to inform and justify research investment.

Some key risks involved in the use of analytics to understand research highlighted in the paper are:

*Use of bibliometric indicators as the sole measure of research impact, or over-reliance on metrics without any understanding of the context and nature of the research (see the short sketch after this list for how much a single indicator can hide).
*Lack of understanding of analytics and advantages and disadvantages of different indicators on the part of users of those indicators. Managers and decision makers may lack the background needed to interpret existing analytics sensitively.
*The suitability of target-based assessment based on analytics is unproven; a wider assessment approach is tentatively recommended in the paper (in most detail on page 29).
*There is a danger of one or a few vendors supplying systems that impose a particular view of analytics on research management data.
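
To make the first of these risks concrete, here is a minimal sketch (in Python; purely illustrative, and not taken from the paper) of the h-index, one of the most widely used bibliometric indicators. It shows how an entire citation record is compressed into a single number, and how two very different records can collapse to the same score.

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers each have at least `rank` citations
        else:
            break
    return h

# Two quite different citation records, one with several highly cited
# papers and one without, reduce to exactly the same indicator value:
print(h_index([50, 40, 30, 3, 3]))  # 3
print(h_index([3, 3, 3, 2, 1]))     # 3
```

Which of those two records represents the greater impact is exactly the kind of contextual judgement a single metric cannot make, hence the paper’s caution against relying on any one indicator on its own.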

However, it also points to some key opportunities, including:

*Access to high-quality, timely analytics may enable professionals to gauge their short-term performance, and to use experimentation to discover novel ways to boost their impact.
*Adoption of CERIF-based CRIS across UK HE institutions and research institutes, with automatic retrieval of public data by UK Research Councils, may help motivate increases in public funding of scientific and other scholarly activity, which is vitally important to the UK economy and national economic growth.
*Training as to the advantages, limitations and applicability of analytics may assist in the effective use of analytics by its lay users, including researchers, research managers, and those responsible for policy and direction in institutions and beyond.

As ever, if you have any thoughts or experiences you’d like to share, please do so in the comments.

The paper is available to download here.

The papers published to date in the series are all available here.
