Legal, Risk and Ethical Aspects of Analytics in Education

After some initial feedback on the CETIS Analytics Series, we’ve had a wee re-think of our publication schedule and today we launch “Legal, Risk and Ethical Aspects of Analytics in Education”, written by David Kay (Sero Consulting), Naomi Korn and Professor Charles Oppenheim.

As all researchers are only too well aware, any practice involving data collection and reuse has inherent ethical and legal implications of which institutions must be cognisant. Most institutions have guidelines and policies in place for the collection and use of research data. However, the gathering of usage data, primarily from internal systems, is an area where it is less commonplace for institutions to have legal and ethical guidelines in place. As with many developments in technology, the law has not kept pace.

The “Legal, Risk and Ethical Aspects of Analytics in Higher Education” paper provides a concise overview of legal and ethical concerns in relation to analytics in education. It outlines a number of legal factors which impinge on analytics for education, in particular:

* Data Protection
* Confidentiality & Consent
* Freedom of Information
* Intellectual Property Rights
* Licensing for Reuse.

The paper also recommends a set of common principles which have universal application.

*Clarity; open definition of purpose, scope and boundaries, even if that is broad and in some respects open-ended,

*Comfort & care; consideration for both the interests and the feelings of the data subject and vigilance regarding exceptional cases,

*Choice & consent; informed individual opportunity to opt-out or opt-in,

*Consequence & complaint; recognition that there may be unforeseen consequences and therefore provision of mechanisms for redress.

Being aware of the legal and ethical implications of any activity requiring data collection is fundamental before undertaking any form of data analysis, and we hope this paper will be of use in helping inform and develop practice. As ever, if you have any comments or examples, please use the comments section to share them with us.

The paper is available to download here.

The papers published so far in the series are:

*Analytics, What is Changing and Why does it Matter?
*Analytics for the Whole Institution; Balancing Strategy and Tactics
*Analytics for Learning and Teaching

Analytics for Teaching and Learning

It’s all been about learning analytics for me this week. Following the SoLAR UK meeting on Monday, I’m delighted to announce that the next paper in the CETIS Analytics Series, “Analytics for Teaching and Learning”, launches today.

Building on “Analytics for the Whole Institution; Balancing Strategy and Tactics”, this paper (written by Mark van Harmelen and David Workman) takes a more in-depth look at issues specifically related to applying analytics in teaching and learning.

The Analytics for Teaching and Learning paper examines:

“the use of analytics in education with a bias towards providing information that may help decision makers in thinking about analytics in their institutions. Our focus is pragmatic in providing a guide for this purpose: we concentrate on illustrating uses of analytics in education and on the process of adoption, including a short guide to risks associated with analytics.”

Learning analytics is an emerging field of research and holds many promises of improving engagement and learning. I’ve been following developments with interest and, I hope, a healthy level of scepticism and optimism. A number of VLEs (or LMSs if you’re in North America) now ship with built-in analytics features, aka dashboards. However, as I pointed out in the “Analytics, what is changing and why does it matter?” paper, there really isn’t a “magic analytics” button which will instantly create engaged students and better results. Making effective use and sense of any data requires careful thought. You need to think about the question(s) you want the data to help you answer, and then ensure that results are shared with staff and students in ways that allow them to gain “actionable insights”. Inevitably, the more data you gather, the more questions you will ask. As Adam summarised in his “how to do analytics right” post, a simple start can be best. This view was echoed in discussions at the SoLAR meeting on Monday.
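To illustrate what a “simple start” might look like in practice, here is a minimal, hypothetical sketch: one narrowly framed question asked of an invented VLE activity export. The column names and data are assumptions for illustration, not any particular VLE’s schema.

```python
# Minimal sketch of one narrow question: "which students have had no
# recorded VLE activity in the last fortnight?", answered from an
# invented activity export (one row per logged event).
import pandas as pd

events = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s3", "s3", "s3"],
    "event": ["login", "forum_post", "login", "login", "quiz_attempt", "login"],
    "timestamp": pd.to_datetime([
        "2012-11-01", "2012-11-02", "2012-10-05",
        "2012-11-20", "2012-11-21", "2012-11-22",
    ]),
})

as_of = pd.Timestamp("2012-11-23")
last_seen = events.groupby("student_id")["timestamp"].max()
quiet = last_seen[(as_of - last_seen) > pd.Timedelta(days=14)]

print(quiet)  # students a tutor might want to check in with
```

The point is not the code itself but the shape of the exercise: a specific question, a small amount of data, and an output that someone can act on.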

Starting at small scale, developing teams, sharing data in meaningful ways, and developing staff and student skills and literacies are all crucial to successful analytics projects. The need for people who combine data handling and interpretation skills with, in education, pedagogic understanding is becoming more apparent. As the paper points out,

“There are a variety of success factors for analytics adoption. Many of them are more human and organisational in nature than technical. Leadership and organisational culture and skills matter a lot.”

Again if you have any thoughts/experiences to share, please feel free to leave a comment here.

The paper can be downloaded from here.

JISC Curriculum Design Programme Synthesis report now available

For the past four years I’ve been part of the support team for the JISC Curriculum Design Programme. It has been a fascinating journey for everyone involved, and it has provided the basis for many a blog post here. The final synthesis report for the programme is now available from the Design Studio.

Making sense of the varied findings of 12 projects over nearly 4 years is no mean feat, but Helen Beetham (with support from the rest of the team particularly Gill Ferrell, Marianne Sheppard and a little bit from me) has done a fantastic job.  The report reviews the four main areas of investigation: improving curriculum processes, reforming course information, enhancing design practice and transforming organisations. 

The main conclusions are:

*More transparent processes with shared, accessible representations of the curriculum can support better stakeholder engagement in curriculum design.
*More efficient processes can save considerable administrative staff time, and may free up curriculum teams to focus on educational rather than administrative concerns.
*A focus on the design process rather than its outcomes allows both for lighter-weight approval events and a shorter review cycle with more opportunity for continuous enhancement.
*A single, trusted source of course information can be achieved through a centralised academic database, but similar benefits can be gained through enhancing the functions, interfaces and interoperability of existing systems.
*Trusted, relevant, timely information can support educational decision making by curriculum teams.
*Better managed course information also has benefits for students in terms of course/module selection, access to up-to-date information, and parity of experience.
*Better managed information allows institutions to analyse the performance of their course portfolio as well as meeting external reporting requirements.
*Curriculum design practices can be enhanced through face-to-face workshops with access to resources and guidance.
*Particularly effective resources include concise statements of educational principle with brief examples; and tools/resources for visualising the learning process, e.g. as a storyboard or timeline, or as a balance of learning/assessment activities.
*With better quality guidance and information available, curriculum teams can build credible benefit/business cases and respond more effectively to organisational priorities.
 
I would thoroughly recommend reading the full report to anyone who is involved in any kind of curriculum design activity.

The report marks the end of the programme, but plans are in place to ensure that the lessons learnt continue to be shared with the wider community. A number of openly available resources from the programme will be released over the coming months, including an info-kit style resource looking at business processes and curriculum information, and a resource pack containing a number of tools and techniques developed by the projects for course development.

The Design Studio itself continues to grow with inputs from the Assessment and Feedback and Developing Digital Literacies Programmes. 

Quick links from SoLAR Flare meeting

So we lit the UK SoLAR Flare in Milton Keynes yesterday, and I think it is going to burn brightly for some time. This post is just a quick round-up of some links to discussions/blogs/tweets and pics produced over the day.

Overviews of the presentations and discussions were captured in live blogs from Myles Danson (JISC Programme Manager for our Analytics Series) and Doug (master of the live blog) Clow of the OU.

Great overview of the day – thanks guys!

And of course we have some Twitter analytics, thanks to our very own Martin Hawksey’s TAGS archive for #FlareUK and the obligatory network diagram of the twitter stream (click the image to see a larger, interactive version).

#FlareUK hashtag user community network
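For anyone curious how that kind of diagram is put together, here is a rough, hypothetical sketch: take an archive of tweets (a TAGS spreadsheet essentially gives you author and tweet text per row) and add an edge whenever one account mentions another. The sample rows below are invented, and this is only a sketch of the idea, not Martin’s actual workflow.

```python
# Rough sketch: build a hashtag "user community network" from an archive
# of tweets by linking each author to the accounts they mention.
# The rows below are invented examples standing in for a TAGS export.
import re
import networkx as nx

tweets = [
    ("alice", "Great discussion at #FlareUK with @bob and @carol"),
    ("bob",   "@alice agreed - notes to follow #FlareUK"),
    ("carol", "Slides from this morning #FlareUK"),
]

G = nx.DiGraph()
for author, text in tweets:
    G.add_node(author)
    for mentioned in re.findall(r"@(\w+)", text):
        G.add_edge(author, mentioned.lower())

# Accounts at the centre of the conversation, by number of connections.
print(sorted(G.degree, key=lambda pair: pair[1], reverse=True))
```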

Slides from the morning presentations and subsequent group discussions are available from the SoLAR website, and videos of the morning presentations will be available there soon too.

As a taster of the day – here’s a little video of what went on.

Analytics for the Whole Institution; Balancing Strategy and Tactics

Following on from last week’s introductory briefing paper, “Analytics, what is changing and why does it matter?”, this week we start to publish the rest of the series, beginning with “Analytics for the Whole Institution; Balancing Strategy and Tactics” (by David Kay and Mark van Harmelen).

Institutional data collection and analysis is not new, and most Higher Education Institutions and Further Education Colleges routinely collect data for a range of purposes; many are also using Business Intelligence (BI) as part of their IT infrastructure.

This paper takes an in-depth look at some of the issues which “pose questions about how business intelligence and the science of analytics should be put to use in customer facing enterprises”.

The focus is not on specific technologies, but rather on how best to act upon the potential of analytics, and on new ways of thinking about collecting, sharing and reusing data to enable high-value gains in terms of business objectives across an organisation.

There are a number of additional considerations when trying to align BI solutions with some of the newer approaches now available for applying analytics across an organisation. For example, it is not uncommon for there to be a disconnect between gathering data from centrally managed systems and from specific teaching and learning systems such as VLEs. So, at a strategic level, decisions need to be taken about overall data management, sharing and re-use, e.g. which systems hold the most useful and valuable data? What formats is it available in? Who has access to the data, and how can it be used to develop actionable insights? To paraphrase from a presentation I gave with my colleague Adam Cooper last week, “how data ready and capable is your organisation?”, both in terms of people and systems.
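As a purely illustrative sketch of why that disconnect matters, the example below joins an invented central-records export with an invented VLE activity export to answer a question that neither system can answer on its own. All column names and values are assumptions, not a real institutional schema.

```python
# Hypothetical illustration: student records live in a central system,
# activity data sits in the VLE, and even a basic question ("how active
# are part-time students?") needs the two joined together.
import pandas as pd

student_records = pd.DataFrame({   # e.g. an export from the central student record system
    "student_id": ["s1", "s2", "s3"],
    "mode_of_study": ["full-time", "part-time", "full-time"],
})

vle_activity = pd.DataFrame({      # e.g. an export from the VLE
    "student_id": ["s1", "s1", "s2", "s3"],
    "minutes_online": [30, 45, 10, 60],
})

# Total VLE time per student, joined onto the central record.
totals = vle_activity.groupby("student_id")["minutes_online"].sum().reset_index()
combined = student_records.merge(totals, on="student_id", how="left")

print(combined.groupby("mode_of_study")["minutes_online"].mean())
```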

As well as data considerations, policies (both internal and external) need to be developed for the ethical use of data, and for developing staff skills and the wider organisational culture towards data-informed practices. Of course, part of the answer to these issues lies in the sharing and development of practice through organisations such as JISC, and the paper highlights a number of examples of JISC-funded projects.

Although the paper concentrates mainly on HEIs, many of the same considerations are relevant to Further Education colleges. Again, we see this paper as a step in widening participation and identifying areas for further work.

At an overview level the paper aims to:

*Characterise the educational data ecosystem, taking account of both institutional and individual needs
*Recognise the range of stakeholders and actors – institutions, services (including shared above-campus and contracted out), agencies, vendors
*Balance strategic policy approaches with tactical advances
*Highlight data that may or may not be collected
*Identify opportunities, issues and concerns arising

As ever we’d welcome feedback on any of the issues raised in the paper, and sharing of any experiences and thoughts in the comments.

The paper is available to download from here.

Analytics, what is changing and why does it matter?

A couple of tricky questions in that title, but hopefully some answers are provided in a series of papers we are launching today.

The CETIS Analytics Series consists of 11 papers, written by a range of our staff (plus some commissioned pieces), looking at topics relevant to Analytics in education. The series is intended to provide a broad landscape of the history, context, issues and technologies of Analytics in post-16 education, and in particular the UK context.

As the diagram below illustrates, the series covers four main areas: “big issues”, which consists of in-depth reports on issues relating to the whole institution, including ethical and legal concerns, learning and teaching, and research management; “history and context”, which looks at the history and development of analytics more generally; “practice”, which looks at some of the issues around implementing analytics, particularly in HE institutions; and “technology”, which reviews a number of technologies and tools available just now.

The Cetis Analytics Series Graphic
(click graphic to see larger image)

The series provides a background, critique and pointers to current and future developments to help managers and early adopters develop their thinking and practice around the use of analytics. As Adam Cooper highlights

“Analytics is the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data.”

We hope that the papers will help people not only to identify actionable insights, but also to develop processes and, more importantly, the staff and student skills and literacies needed to produce measurable impacts across the range of activities undertaken in educational organisations such as universities and colleges. As Nate Silver demonstrated in the recent US election, it’s not just about having the data; it’s being able to make sense of it and communicate findings effectively that makes the difference.
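To make Cooper’s definition a little more concrete, here is a minimal, entirely hypothetical sketch of the loop it describes: define a problem, fit a simple statistical model to existing data, and turn the output into something a person can act on. The data here is synthetic and the features are invented; the point is the shape of the process, not the model.

```python
# Hypothetical sketch of "problem definition + statistical model +
# existing data -> actionable insight": estimate which current students
# look at risk of non-completion, using a simple logistic regression
# trained on synthetic historical data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "existing data" for past students: two invented features
# (e.g. standardised VLE activity and prior attainment) and an outcome.
X_past = rng.normal(size=(200, 2))
y_past = (X_past[:, 0] + 0.5 * X_past[:, 1]
          + rng.normal(scale=0.5, size=200) > 0).astype(int)  # 1 = completed

model = LogisticRegression().fit(X_past, y_past)

# Current students: the predicted probabilities become the actionable
# insight, i.e. a ranked list of who a tutor might contact first.
X_current = rng.normal(size=(5, 2))
risk = 1 - model.predict_proba(X_current)[:, 1]
for student, p in sorted(enumerate(risk), key=lambda pair: -pair[1]):
    print(f"student {student}: estimated risk of non-completion = {p:.2f}")
```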

Given that this is a rapidly developing field, it is impossible to cover everything, but we hope that the papers will provide a solid basis for discussion and pointers for further work. Of course, as well as the papers, we continue to report on our work and thoughts around data and analytics. For example, over the past month or so, Sharon Perry has been summarising a number of significant outputs and findings from the JISC Relationship Management Programme on her blog. Next week we co-host the inaugural UK SoLAR Flare with colleagues from the OU (UK), which will provide another opportunity to help identify key areas for further research and collaboration.

We’ll be publishing the papers between now and early January, and each will have an accompanying blog post providing a bit more context and the opportunity for feedback and discussion. Below is a list of the titles with their publication dates.

* Analytics for the Whole Institution; Balancing Strategy and Tactics (19 November)
* Analytics for Learning and Teaching (22 November)
* Analytics for Understanding Research (22 November)
* What is Analytics? Definition and Essential Characteristics (4 December)
* Legal, Risk and Ethical Aspects of Analytics in Higher Education (4 December)
* A Framework of Characteristics for Analytics (18 December)
* Institutional Readiness for Analytics (19 December)
* A Brief History of Analytics (8 January)
* The Implications of Analytics for Teaching Practice in Higher Education (8 January)
* Infrastructure and Tools for Analytics (15 January)

Today we start with a briefing paper which provides an overview and sets the context for the series. You can download the paper from the link below.

*Analytics, what is changing and why does it matter? briefing paper.

Open Architectures – solving more interesting problems

The JISC Innovating e-Learning online conference is just a couple of weeks away, and this year I’m particularly glad that this is an online conference, so I can catch up with sessions via the recordings and join the discussion via the online forums. Typically, the Open Architectures – solving more interesting problems session I was involved in developing with Rob Englebright and Lou McGill clashes with the SoLAR Flare meeting we’re co-hosting with the OU.

The background to the session goes something like this . . .

JISC has a long-standing tradition of supporting open approaches, from software to educational resources. Part of that support is rooted in notions of empowering users to adapt technology and systems to suit their teaching and learning needs. Many teachers, learners and VLE administrators have been frustrated by the lack of flexibility and the limited opportunities for customisation and personalisation in VLEs. However, over the last few years there have been a number of developments which allow far more flexible and open approaches to be taken.

Back in 2010, Wilbert Kraan and I produced the Distributed Learning Environments briefing paper. This outlined five potential models for opening up VLEs and integrating them with a number of other administrative systems and the wider social web, and for allowing increasingly flexible access to VLEs from mobile devices. The briefing provided the background for the JISC Distributed Virtual Learning Environments (DVLE) Programme, in which 8 projects explored a variety of approaches to extending their learning environments. The Extending the Learning Environment briefing paper provides an overview of the highlights of the programme, including the development of WAGs (widgets, gadgets and apps) and the use of the IMS LTI specification. It also illustrates how flexible approaches can support innovative teachers to experiment with a range of tools quickly and at low cost. A set of more detailed case studies from the projects is also now available.

Earlier this year at Dev8ed, a number of the sessions were based on the work of the DVLE programme, particularly the use and adoption of IMS LTI as a way to integrate new tools into VLEs. Many of the conversations centred on a shift in the role of VLEs, with discussions and examples of how VLEs are becoming more of a development platform, allowing developers and teachers to create customised solutions for their specific needs. The conference session builds on these initial conversations.
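For readers who haven’t met LTI, the sketch below shows roughly what a basic LTI 1.0/1.1 launch looks like from the consumer side: the VLE POSTs a form of launch parameters to the external tool, signed with OAuth 1.0a using a shared key and secret. The URL, key, secret and identifiers are all invented, it assumes the oauthlib and requests libraries are available, and a real VLE would normally deliver the launch as an auto-submitting form in the learner’s browser rather than a server-side request.

```python
# Rough sketch of an LTI 1.0/1.1 "basic launch" from the tool consumer
# (VLE) side. All values below are invented for illustration.
from urllib.parse import urlencode

import requests
from oauthlib.oauth1 import Client, SIGNATURE_HMAC, SIGNATURE_TYPE_BODY

LAUNCH_URL = "https://tool.example.org/lti/launch"       # hypothetical tool endpoint
CONSUMER_KEY, SHARED_SECRET = "demo-key", "demo-secret"  # agreed with the tool provider

params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "week3-quiz",   # identifies this placement in the course
    "user_id": "student-42",
    "roles": "Learner",
    "context_id": "course-ABC101",
    "context_title": "Introduction to Analytics",
    "launch_presentation_return_url": "https://vle.example.ac.uk/return",
}

# Sign the form body so the tool can verify the launch came from a
# trusted consumer; the OAuth parameters are added to the POST body.
client = Client(CONSUMER_KEY, client_secret=SHARED_SECRET,
                signature_method=SIGNATURE_HMAC,
                signature_type=SIGNATURE_TYPE_BODY)
uri, headers, body = client.sign(
    LAUNCH_URL, http_method="POST", body=urlencode(params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)

response = requests.post(uri, data=body, headers=headers)
print(response.status_code)
```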

Mark Johnson, Scott Wilson and Martin Hamilton will look at some of the popular solutions that underpin teaching technology, and how these solutions prescribe, to some extent, the learning journey. Whilst we often say that pedagogy leads all our decisions, the decisions we make about our infrastructure often determine what is possible. So if you are interested in how you can solve some more interesting problems with your VLE, sign up, join the conversation and see what interesting problems we can help solve together.

Martin Hamilton provides a taster for the session in this short preview video.