Time for Analytics of the Oppressed? – my starter for 10 for #digifest debate

(image: Analytics of the Oppressed)

I have been asked to step into the breach, so to speak, for the “learning analytics interventions should always be mediated by a human” debate later this week at Digifest.

The structure for the debate is as follows:

The machine will argue that it can use learning analytics to provide timely and effective interventions to students, improving their chances of achieving better qualifications. Machines don’t forget or get sick; learning analytics is more accurate and not prejudiced; and there is evidence for automated interventions.

The human will argue that although machines can make predictions, they will never be 100% accurate; only a person can factor in personal circumstances; automated interventions could be demotivating; and automated interventions are not ethical.

Fortunately for me, I have been given the human side of the debate. Unfortunately for the organisers, Leanne Etheridge is no longer able to attend. Leanne, I will do my best.

Preparation for the debate has started already with this blog post from Richard Palmer, aka “the opposition”. In order to get my thoughts into some kind of order for Wednesday morning’s debate, I’m going to try to outline my reactions to the provocations in the post by my learned colleague.

Richard has outlined three key areas where he believes there is increased potential for data-driven system interventions.

1. First of all, humans have a long history of believing that when certain things have always been done in one way, they should stay that way, far beyond the point where they need to be… If you look at Luddite rebellions, we thought that it should always be a human being who stretched wool over looms and now everyone agrees that’s an outdated concept. So, deciding that something needs to be done by a human because it always has been done by a human seems, at best, misguided.

2. Secondly, people object that the technology isn’t good enough. That may, possibly, be the case right now but it is unlikely to be the case in the future… Technologies will improve. Learning analytics will become more advanced. The data that we hold about our students will become more predictive, the predictions we make will be better and at some point institutions will decide where their cost benefit line is and whether everything does have to be human-mediated.

3. Thirdly, how good do we actually think people are? Certainly, human beings can empathise and pick up on non-verbal or even non-data-related signals from other people, but when was the last time a computer turned up to work hungover? Or stressed or worried about something – or just didn’t turn up at all?… Will a computer ever be better than the perfect person? Maybe, maybe not. But, let’s face it, people aren’t perfect… We worry about computers sending insensitively worded emails and inappropriate interventions but we all know human beings who are poor communicators, who are just as capable, if not more, of being insensitive.

Where to start? Well, despite us pesky humans almost falling at the first hurdle by not being able to be there in person – so unreliable! – we can pick up a challenge and a thread from where our colleagues left off, without the need for any additional programming. I don’t know what Leanne was going to say, but I really like the two quotes on the two slides she has selected. (I detect an air of confidence from only two slides!)

“It is the supreme art of the teacher to awaken joy in creative expression and knowledge” Albert Einstein

“Every student can learn, just not on the same day, or in the same way” George Evans.

Going back to Richard’s post, I believe there is a truly pressing need to challenge this apparently sensible, logical narrative. The narrative being spun around data and analytics is becoming an ever more complex web for us to break out of. But break out of it we must! To paraphrase Paulo Freire, it is time for some critical analytics. It is time to seriously consider the analytics of the oppressed.

Point 1 – On humans: “deciding that something needs to be done by a human because it always has been done by a human seems, at best, misguided.” I always worry when the Luddite card gets pulled into play. The negative connotations it carries negate the many, many skilled craftspeople who were actually fighting for their livelihoods, their craft. Audrey Watters explained this perfectly in her 2014 ALT-C keynote, Ed-Tech’s Monsters.

“The Luddites sought to protect their livelihoods, and they demanded higher wages in the midst of economic upheaval,”

Sound familiar? It strikes me as uncannily similar to our current union campaigns for fair pay and to stamp out the casualisation of academic staff contracts. But it’s OK, because the overriding managerial narrative is that data can help us rationalise, to streamline our processes. It’s been a while since Freire wrote this, but again it rings true today:

Our advanced technological society is rapidly making objects of us and subtly programming us into conformity to the logic of its system. To the degree that this happens, we are also becoming submerged in a new “Culture of Silence”

Point 2 – On technology not being good enough: “Technologies will improve. Learning analytics will become more advanced. The data that we hold about our students will become more predictive, the predictions we make will be better and at some point institutions will decide where their cost benefit line is and whether everything does have to be human-mediated.”

Data about our students will be more predictive? Our predictions will be “better” – better at doing what? Better at showing us the things we want to see? Getting our student “customers” through their “student success journeys” without any difficult interrogations, without the right to fail? Or actually stopping someone from starting or continuing their educational journey because their data isn’t the “right fit”?

The promise of increasing personalisation fits into an overwhelming narrative from ed tech companies that is permeating through governments, funding bodies and university leaders: personalisation is the future of education; personalised alerts are the natural progression to student success. But are they just another form of manipulation, assuaging the seemingly endless collective need to measure, monitor, fitbit-itize the educational experience? The words of Freire again ring true:

One of the methods of manipulation is to inoculate individuals with the bourgeois appetite for personal success. This manipulation is sometimes carried out directly by the elites and sometimes indirectly, through populist leaders.

Point 3 – Just how good are people anyway? We don’t turn up, we get ill and we are biased. Well, all of those apply to most systems I’ve ever interacted with. Our own biases are intrinsically linked to the systems we develop, to the interpretations of data we choose to accept. As Freire said:

One cannot conceive of objectivity without subjectivity

I cannot agree that the downside of machine interventions is “no worse than humans doing it badly”. Surely we need to be engaging critically to ensure that no human or machine is doing anything “badly”.
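To make that point concrete, here is a minimal, entirely hypothetical sketch – fabricated data and a toy model, not any institution’s real system – of how human bias flows straight into an “objective” predictive model. If past flagging practice was skewed against one group of students, a model trained on that history simply learns and replays the skew:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 5000
engagement = rng.uniform(0, 1, n)        # fabricated VLE engagement, 0-1
group = rng.integers(0, 2, n)            # 0 = group A, 1 = group B

# Fabricated "history": staff flagged group B students as at-risk at
# higher engagement levels than group A students.
threshold = np.where(group == 0, 0.5, 0.7)
flagged = (engagement < threshold).astype(int)

# Train a model on that biased history.
X = np.column_stack([engagement, group])
model = LogisticRegression().fit(X, flagged)

# Two students with identical engagement but different groups:
probe = np.array([[0.6, 0], [0.6, 1]])
print(model.predict_proba(probe)[:, 1])  # group B scores as higher "risk"
```

Nothing in that model is prejudiced in any human sense; it is simply being faithfully objective about the subjectivity it was fed.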

The “system” should not just be replicating current bad practice. Data should provide us with new ways to encourage a richer dialogue about education and knowledge. Learning analytics can’t just be a way to develop alerting and intervention systems that provide an illusion of understanding, and that acquiesce to not particularly well-thought-out, government-driven monitoring processes such as the TEF.

In these days of alternative facts and distrust of expert knowledge, human intervention is more crucial than ever. Human intervention is not just an ethical issue, it’s a moral imperative. We need to care, our students need to care, our society needs to care. I’ll end now with the words of the Cassandra of EdTech, Audrey Watters:

In order to automate education, must we see knowledge in a certain way, as certain: atomistic, programmable, deliverable, hierarchical, fixed, measurable, non-negotiable? In order to automate that knowledge, what happens to care?

If the product works, but what about the people?

This is probably going to be an even more incoherent ramble than normal, but I have been trying to write posts around a number of things for the last couple of weeks, so I’m going to try to merge them.

A couple of weeks ago, I read this post by David Wiley. At the time I tweeted:

I confess to more than a bit of this sentiment, and not just in relation to OER: “Much of the OER movement has a bad attitude about platforms.” I am always wary when the focus is on developing platforms and not developing the people who will use those platforms.

I was once in a meeting where I put forward the “people and process not platforms and products” case. I was told that what was being discussed was platform “in the Californian sense of platform”… I’m sure a classic WTF look must have passed over my face, but it was explained that this meant people as well as technology. Geography aside, three years later this sense of platform doesn’t seem to be that widespread or acknowledged. Maybe I need to go to California. But I digress.

Not long before the Wiley post I was reading the Pearson white paper on learning design. It caused me a bit of unease too. Part of me was delighted to see learning design being recognised by a significant player (whatever might happen to them) in the education technology provider field. Using learning design to help product design is a bit of a no-brainer. Technology should be driven by educational need, or as Pearson put it:

“Products and systems that effectively leverage learning design can deliver superior learning outcomes.”

One example in the paper referred to work they had done in social science classes:

“we quickly recognized that students were easily distracted by conventional textbooks. This told us we needed to eliminate distractions: any extraneous cognitive load that doesn’t promote learning. Fortunately, our learning design work reveals many proven techniques for accomplishing this. REVEL segments all content into manageable pieces and presents it via a consistent structure. It provides strong signaling cues to highlight key material and places all relevant content on screen simultaneously to offer a continuous, uninterrupted experience”

Which kind of relates to this point from the Wiley post:

“Our fixation on discovery and assembly also distracts us from other serious platform needs – like platforms for the collaborative development of OER and open assessments (assessments are the lifeblood of this new generation of platforms), where faculty and students can work together to create and update the core materials that support learning in our institutions. Our work in OER will never be truly sustainable until faculty and students jointly own this process, and that can’t happen until a new category of tools emerges that enables and supports this critical work. (Grant money for OER creation won’t last forever.)

And don’t even start trying to explain how the LMS is the answer. Just don’t. “

Well of course Pearson do try to explain that:

“As testing progresses, we can overcome problems that compromise outcomes and build a strong case that our design will support learning. The very same work also helps us tightly define assessments to find out if the product works in real classrooms”

Of course they don’t really touch on the OER aspect (although all their learning design stuff has been made available with CC goodness), but I’ll come back to that.

That phrase “if the product works” – I keep coming back to it. So on the one hand I have to be pleased that Pearson are recognising learning design. I have no argument with their core principles. I agree with them all. But I am still left with the niggle around the assumption that the platform will “do” all the learning design for both staff and students. That underlying assumption that if only we had the right platform all would be well: everything could be personalised through data and analytics, and we’d have no retention issues. That niggles me.

I was part of a plenary panel at the HESPA conference last week called “the future of learner analytics”, where a number of these issues came up again. The questions asked by this group of educational planners really stimulated a lot of debate. On reflection I was maybe a bit of a broken record. I kept coming back not to platforms but to people and, more importantly, time. We really need to give our staff and students (but particularly our staff) time to engage with learning analytics. Alongside the technical infrastructure for learning analytics we need to be asking: where’s the CPD planning for analytics? They need to go hand in hand. Cathy Gunn, Jenny McDonald and John Milne’s excellent paper “the missing link for learning from analytics” sums this up perfectly:

“there is a pressing need to add professional development and strategies to engage teachers to the growing range of learning analytics initiatives. If these areas are not addressed, adoption of the quality systems and tools that are currently available or under development may remain in the domain of the researchers and data analysis experts”

There seems to be an assumption that personalisation of learning is a “good thing”, but is it? Going back to learning design, designing engaging learning activities is probably more worthwhile and ultimately more useful to students and society than trying to create homogenised, personalised, chunked-up content and assessments. Designing to create more effective engagement with assessment and feedback is, imho, always going to be more effective than trying to design the perfect assessment platform.

In terms of assessment, early last week I was also at a Scotbug (our regional Blackboard user group) meeting, where my group was tasked with designing an assessment system. This is what we came up with – the flipped assessment, aka student-generated assessments.

(image: our flipped assessment sketch)

Not new, but based on pedagogy and technology that is already in use (NB there’s been a really great discussion around some of this on the ALT list this weekend). I don’t think we need any new platforms for this type of approach to assessment and feedback – but we do need to think about learning design (which encapsulates assessment design) more, and give more time for CPD so that staff can engage with the design process and the technologies they either have to use or want to use. This of course all relates to digital capability and capacity building.

So whilst we’re thinking about next gen platforms and learning environments, please let’s not forget people. Let’s keep pressing for time for staff CPD to allow the culture shifts to happen around understanding the value of OER, of sharing, of taking time to engage with learning design – and not just having to tweak modules when there’s a bit of downtime.

People are the most important part of any learning environment – next gen, this gen, past gen. But people need time to evolve too; we can’t forget them or try to design out the need for them for successful learning and teaching to take place. Ultimately it’s people that will make the product work.

Clawing my way up through the trough of disillusionment with learning analytics

(image: Gartner hype cycle, by Jeremykemp at English Wikipedia [GFDL (http://www.gnu.org/copyleft/fdl.html) or CC BY-SA 3.0 (http://creativecommons.org/licenses/by-sa/3.0)], via Wikimedia Commons)

Warning – this is a bit of a moan post.

Last week I attended the Jisc Learning Analytics Network meeting. It was a really good day – lots of people there, lots of good sharing, moaning and asking “where next?”. One of the reasons I find these events useful is that they help focus my mind and give me a sense of relief that some of the challenges I face are similar, if not exactly the same, to those faced by many others in the sector.

In terms of learning analytics, my experiences to date have been metaphor-tastic: (ever decreasing) circles, slopes, dead ends, stop-starts… I feel that it’s appropriate to reflect on my journey via the well-trodden Gartner hype cycle.

I’m the first to admit I enjoyed being swept up to the peak of inflated expectations. Exploring the potential of data and learning analytics was probably the last piece of innovation work I was involved in when I worked with Cetis. I really enjoyed trying to figure out the practical applications and meanings for mainstream learning and teaching of the swirly twirly graphs at early LAK conferences. It was great to support the emerging UK community via early SoLAR meetings. I learnt a huge amount being involved in the Cetis Analytics Series. I always think I brought a healthy degree of scepticism to some of the hype of learning analytics, but I could (and still can) see the benefits of extracting, exploring and understanding data around learning and teaching.

From the giddy heights of the peak of inflated expectations, I knew when I moved to a “proper job” within a university I would have a bit of a slide down the slope to the trough of disillusionment. It’s getting out of the trough that I’m finding real difficulty with. Changes in senior management have meant going through a bit of a treadmill in terms of gaining institutional support and understanding. That’s before even accessing any data.

The Jisc Effective Analytics Programme has been a bit of a ray of light and hope for me. Towards the end of last year we took part in the Discovery phase of the programme. This involved a consultancy exercise, onsite for three days with a cross-section of institutional stakeholders, to assess our “readiness” for analytics. At the end of the exercise we got a report with our readiness matrix and some recommendations. You can view our report here.

At the meeting last week a number of institutions who have gone through the Discovery phase took part in a panel discussion about the experience. One common thread was the reassurance the exercise gave everyone in terms of being “on the right track”. I was pleasantly surprised that we got such a good score in terms of our cultural readiness. The validation of having an external report from a nationally recognised agency such as Jisc is also incredibly useful for those of us on the ground as something to remind/cajole people with (hit people over the head – oh wait, that’s only in my dreams) in terms of what we should be doing next.

I think one of the main problems with analytics is finding a starting point. Going through the Discovery phase does give a number of starting points. My frustration just now is that my institution is going through a major rethink of our overall data architecture. So on the one hand I think “hurrah”, because that does need to be done. On the other, I feel that I am almost back to square one, as in terms of “business needs” anything to do with learning and teaching seems to fall off the list of things that need to be done pretty quickly. It’s difficult to juggle priorities: what is more important, getting our admissions process working more efficiently, or developing ways to understand what happens when students are engaging (or not) with modules and the rest of the “stuff” that happens at university? Or updating our student record system, or our finance systems?

Amidst all this it was good to get a day out to find out what others are up to in the sector. Thanks, Jisc, for providing these networking events – they really are so useful for the sector and long may they continue. UEL, who hosted the event, have been doing some great work over the past four years around learning analytics, which has emerged from their original BI work with Jisc. The work they have been doing around module attendance (via their swipe card system and VLE data) and performance is something I hope we can do here at GCU sometime soon.
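Conceptually, that kind of analysis is straightforward. Here is a hypothetical sketch – the file and column names are illustrative, not UEL’s or GCU’s actual schema – of joining swipe-card attendance to module marks and checking the correlation per module:

```python
import pandas as pd

# Illustrative inputs: a swipe-card log and a table of module marks.
swipes = pd.read_csv("swipe_card_log.csv")   # student_id, module, date
marks = pd.read_csv("module_marks.csv")      # student_id, module, mark

# Attendance = number of distinct days each student swiped in per module.
attendance = (swipes.drop_duplicates(["student_id", "module", "date"])
                    .groupby(["student_id", "module"])
                    .size()
                    .rename("sessions_attended")
                    .reset_index())

df = attendance.merge(marks, on=["student_id", "module"])
for module, g in df.groupby("module"):
    r = g["sessions_attended"].corr(g["mark"])
    print(f"{module}: attendance/mark correlation r = {r:.2f}")
```

Of course, a correlation is only the start of a conversation, not an intervention.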

In the morning we got updates from three mini projects Jisc have just funded, starting with the University of Greenwich and their investigations into module survey results and learning outcomes. The team explain more in this blog post. I was also very interested in the student workload model mini project being developed at the OU. You can read more about it here.

The other mini project, from the University of Edinburgh, was interesting too, but in a different way. It is more what I would term a pure LA research project, with lots of text data mining and regression modelling of (MOOC) discussion forums. Part of me is fascinated by all of this “clever stuff”, but equally part of me just thinks that I will never be able to use any of it in my day job. We don’t have huge discussion forums; in fact we are seeing (and in many ways encouraging) less use of them – even with our limited data views I know that – and more use of wikis and blogs for reflection and discussion. Maybe these techniques will work in those areas too – I hope so – but sometimes thinking about that really does make my head hurt.
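To give a flavour of that “clever stuff” – and this is purely an illustrative sketch with fabricated posts, not the Edinburgh team’s actual pipeline – the basic pattern is to turn forum text into features and regress a course outcome on them:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

# Fabricated toy corpus: one blob of forum posts per student, plus a
# final course score. Real studies use thousands of MOOC participants.
posts = [
    "I found week two really confusing, can anyone help?",
    "great discussion everyone, the readings made sense this time",
    "struggling to keep up with the forum and the videos",
    "really enjoying the course so far and the peer feedback",
] * 25
scores = [45.0, 72.0, 51.0, 80.0] * 25

# Bag-of-words features weighted by TF-IDF, then a regularised regression.
X = TfidfVectorizer(stop_words="english").fit_transform(posts)
X_train, X_test, y_train, y_test = train_test_split(X, scores, random_state=0)
model = Ridge().fit(X_train, y_train)
print("R^2 on held-out students:", round(model.score(X_test, y_test), 2))
```

In principle the same pattern could be pointed at wiki or blog text rather than forums, which is some consolation.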

I hope that we can start moving on our pilot work around learning analytics soon. ’Til then, I will hang on in there and continue my slow climb up the slope, and maybe one day arrive at the plateau.

Looking in the mirror to discover our institutional capability for learning analytics

(image: mirror, CC Share Alike, https://commons.wikimedia.org/wiki/File:Mirror_fretwork_english_looking-glass.png)

It’s been a busy week here at GCU Blended Learning Towers. We’ve just finished the onsite part of the Jisc Effective Analytics Programme, so this week has been a flurry of workshops and interviews led by the consulting team of Andy Ramsden and Steve Bailey. Although Andy and Steve work for Blackboard, the discovery phase is “platform agnostic” and is as much about culture and people as technology – the evaluation rubric used had more about culture and people than technology. Having a team who really understand the UK HE sector was very reassuring. Sadly, it’s not often that you can say that about consultants and HE.

I think GCU is the second institution to go through the discovery process, and I know there are quite a few others who will be doing the same over the next six months. The process is pretty straightforward and is outlined in the diagram below.

(image: discovery process diagram)

A core team from the institution has two online meetings with the consulting team, and relevant institutional policy/strategy documentation is reviewed before the onsite visit. At the end of the onsite visit an overall recommendation is shared with early findings, before a final report is given to the institution.

I was pleased (probably slightly relieved too) that we got a “ready with recommendations”.  That’s what we were hoping for.

Although we are still awaiting the final report, the process has already been incredibly useful. It has allowed us to bring together some of our key stakeholders, (re)start conversations about the potential and importance of learning analytics, and highlight the need to develop our infrastructure, people and processes to allow us to use our data more effectively. The final report will also be really helpful in terms of focusing our next steps.

Andy described the process as a bit like “holding a mirror up to ourselves”, which is pretty accurate. The process hasn’t brought up issues we weren’t aware of. We know our underlying IT infrastructure needs “sorting”, and we are starting to do that. What it has done is illustrate some potential areas to help us focus our next steps. In a sense it hasn’t helped us see the forest from the trees, but rather shown us some twinkling lights and pathways through the forest.

"When you hear the term learning analytics what comes to mind?

This was the question used to prompt the first piece of feedback in the opening workshop of a three-day consultation to assess our readiness for analytics, as part of the discovery phase of the Jisc Effective Analytics Programme.

It certainly did get the conversations going. As is my wont, I also tweeted the question and even got a few responses.

I have to say not all of these came up in the conversations I was part of 🙂 But I am looking forward to seeing the results of this assessment exercise after a series of workshops and 1-2-1 interviews with staff.

Watch this space for more details.

Where Sheila's been for the last few weeks: restarting learning analytics at GCU

It’s been a busy couple of weeks, what with the start of the new academic session, and I’ve been using up bits of annual leave, so I haven’t really had the chance to blog for a while. I didn’t want to let another week go by – it’s too easy to let the blogging habit slip – so this is just a quick update post.

One project that is going to be taking up quite a bit of my time this month is our involvement in the Jisc Effective Analytics programme. GCU is taking part in the discovery phase of this programme. This means that we are working with consultants, in our case from Blackboard, to assess our institutional readiness for analytics, from culture to infrastructure.

My team have been trying to get a pilot project around learning analytics going for about 18 months; however, due to various internal changes, progress had stalled. Now that we have a new CIO and Director of IT, we are ready to start again. The support from Jisc gives us a great incentive to reappraise our current capabilities, and will give us a trusted, objective view of them. The analytics infrastructure Jisc are developing also gives us a possible route to develop our provision further, as well as to share our experiences within the programme and beyond.

We’ve already had meetings with the consulting team, and so far we are impressed with the approach they are taking. Just now they are reviewing lots of documentation, including our very recently launched 2020 Strategy. Having effective data and analytics capabilities will be crucial for us as we work towards the aims and objectives of the strategy.

I’ll be sharing more as the project progresses, particularly nearer the end of this month after the onsite workshops and interviews have taken place.

Coincidentally, earlier this week a video of the invited talk I gave at the Talis Aspire conference in April was released (yeah, it takes a while to edit me!). Anyway, in it I had a bit of a rant about data, analytics etc. I’m hoping that through this project we will indeed start to get some actionable insights into our learning and teaching and student journeys.

What Sheila's seen this week – rebooting CPD, poundland pedagogy, bricolage and more learning analytics

It’s been another busy week here at Blended Learning Towers, so this post is really a whirlwind reminder of where I’ve been this week and some things that have caught my eye.

On Tuesday I attended the ALT Winter Conference – Rebooting CPD – at the University of Edinburgh. There were a number of really great presentations, as well as the opportunity to have a play with Google Glass, Oculus Rift and Minecraft. I was particularly taken with the presentation from James Kieft from Reading College about their staff-led staff development programme. I’ve still to explore properly the open version of their Pass it on Friday site, but I want to take some time to explore the reflective practices encouraged by activities such as “poundland pedagogy” and “open classrooms”. There seemed to be a genuine (and growing) spirit of collaboration and sharing of practice. I’m sure there is lots we in HE can learn from our FE colleagues.

I also enjoyed Nic Whitton’s “proceed with caution: the application of gamification to learning” presentation. A “proper” gamer and educator, Nic gave a very entertaining overview of the where, why, when and how (and how not) to use gaming in learning and teaching. She also stretched my visual note-taking ability with references to Mary Poppins. All in all it was a grand day out and a great opportunity to catch up with many colleagues. Thanks to all the speakers and all at ALT for organising what will hopefully be a regular calendar feature. You can see all my visual notes from the day in my (CC licensed) Flickr folder.

(image: visual notes from Nic Whitton’s presentation)

On Wednesday I attended the Universities Scotland Learning and Teaching Committee to give a short overview of and introduction to learning analytics (you can access my slides here). I’m not quite sure what to make of the meeting – there were lots of nodding heads and questions about cost, but it just reinforced how early a stage we are at across the UK sector. Here at GCU we are certainly still at the finding and sharing our data internally stage in terms of more general analytics. I am ever hopeful that we will be able to start moving again quickly in the new year when we have a new CIO. I noticed today too that Jisc has released a report on the ethical and legal challenges of learning analytics – something else to add to the reading list.

Yesterday we had another meeting of our Blended Learning Coffee Club, where we had a really good discussion about the merits of open badges. So as part of #BYOD4L in January we’re thinking of running an open badge workshop for staff. We also discussed the OU Innovating Pedagogy report, and tried to do a very quick mapping of the 10 innovations listed in the report to actual practice here at GCU. There were some immediate examples, including GCU Games On for event-based learning, SMILE/SMIRK for learning to learn, and lots of examples of bring your own device (including response systems such as Nearpod, Padlet etc.). We had a bit of a smile about bricolage – it’s one of those words, isn’t it? It makes you smile when you say it. Anyway, we decided that bricolage was pretty much what most people did by default – particularly when they were trying to use technology in their teaching.

Hopefully a more thoughtful blog post soon – there are a lot of things brewing in my mind just now. And just a wee reminder – why not take a few minutes to fill in the ALT annual survey? You can even get a badge.


Participate in the ALT Annual Survey 2014

What Sheila's seen this week: innovating pedagogy, mansplaining, more post digital, analytics awards

If you only read one thing on the interweb this week, then make it this: Men Explain Technology to Me: On Gender, Ed-Tech, and the Refusal to Be Silent by Audrey Watters. Thank you, Audrey, for a great piece and for introducing me to the term “mansplaining”. I can’t say any more, as my prose is, as they say around here, “mince” compared to Audrey’s, so just read it.

My blog has been buzzing with comments (including my first audio comment) on my post about Helen Beetham’s Becoming Post Digital keynote last week. Getting comments (not spam or hate comments – see Audrey’s post about that) and seeing an extended conversation unfold is so satisfying for me. It really sustains my motivation for blogging. So thank you to everyone who commented, and please feel free to add your thoughts.

I didn’t get round to posting about the OU Innovating Pedagogy report last week, and in fact it’s only been this week that I’ve been able to have a look at it. Brian Kelly was quick off the mark with his summary, which provides a useful overview. I was particularly pleased to see event-based learning listed in the report. Our online event GCU Games On fits nicely into this category and provides an alternative for institutions who don’t have links with organisations such as the BBC.

I also spotted yesterday that Jisc have just released a new publication, “Learning Analytics, the current state of play in UK higher and further education”. I haven’t had time to read it properly yet, but it’s always good to get an overview of what’s happening here in the UK. It looks like it’s still very early days, as the report says:

“Most interviewees are reluctant to claim any significant outcomes from their learning analytics activities to date – again perhaps demonstrating that it is still early days for the technologies and processes”

That gives me hope that we are not too far off the mark here at GCU. Our analytics adventures have been on the back burner for a bit, but I’m hoping to get them back on track again soon.

Finally, it’s awards time again. David Hopkins has been nominated for the Edublogs Awards – he’s getting my vote, as his blog is just really useful for anyone involved in implementing ed tech. David has also nominated a fab list of UK folks for the awards, and I feel very privileged to have been included. So if you have a minute or two, dear reader, please vote and get some UK ed tech people winning.

And because every blog should have a picture, here are some clouds from my flight to London yesterday for my first ALT Trustees meeting.

(image: blue skies)

What Sheila's seen this week: learning analytics policy for students and lots of open-ness

This week I’ve been doing lots of writing as part of my application for HEA fellowship. I’m doing this via the portfolio route of GCU’s AcceleRATE CPD programme. Over the past six years, I’ve become increasingly reliant on my blog as my professional memory. In many ways it is my portfolio and one of my main contributions to open practice. As I develop my case studies for my HEA application, it has proved to be an invaluable reference point, as well as a reminder that I actually do know a wee bit about a lot of stuff.

Martin Weller wrote a nice post this week on the benefits of an open by default approach. One of the comments highlighted another benefit of being open – that it’s easier to find your own stuff. I have certainly found that this week. In fact, that’s one of the main reasons I keep blogging.

I also spotted that there has been an update to the Open Education Handbook from the LinkedUp project – a lovely example of open practice creating a resource on open education.

It was also great to see this article on the OU’s policy on the ethical use of student data for learning analytics. I know Sharon Slade has been working on this for a number of years now. The policy and the FAQ (both available on the OU website) are really useful – not just for students, but for anyone who is thinking about or implementing learning analytics. Hopefully it will be available via CC soon too. Another win for open-ness.

Reusing Open Resources with a dash of learning analytics

Following the special edition of JIME, the whole book, Reusing Open Resources, is now in print and available here. It includes a chapter on Analytics for Education written by Lorna Campbell, Martin Hawksey and myself. It’s almost a year since we wrote the chapter, so it’s not completely up to date, but I think it is still a very useful overview.

The book editors, Chris Pegler and Allison Littlejohn, have done a great job putting the book together. It offers a fresh perspective on the reuse of open resources for learning by placing learning and learners (rather than resources) as the central focus, and by taking into consideration all forms of open learning – formal, non-formal and informal – not only open education. Like them, I hope the (sometimes opposing) views expressed in the book feed into debates across the related fields of education, professional learning and lifelong learning.

(image: screenshot of the book homepage)
