If the product works, but what about the people?

This is probably going to be an even more incoherent ramble than normal, but I have been trying to write posts around a number of things for the last couple of weeks, so I’m going to try and merge them.

A couple of weeks ago, I read this post by David Wiley. At the time I tweeted:

I confess to more than a bit of this sentiment, and not just in relation to OER: “Much of the OER movement has a bad attitude about platforms.” I am always wary when the focus is on developing platforms and not developing the people who will use these platforms.

I was once in a meeting where I put forward the “people and process not platforms and products” case. I was told that what was being discussed was platform “in the Californian sense of platform” . . . I’m sure a classic WTF look must have passed over my face, but it was explained that this meant people as well as technology. Geography aside, three years later this sense of platform doesn’t seem to be that widespread or acknowledged. Maybe I need to go to California. But I digress.

Not long before the Wiley post I was reading the Pearson White Paper on learning design. It caused me a bit of unease too. Part of me was delighted to see learning design being recognised by a significant player in the education technology field (whatever might happen to them). Using learning design to help product design is a bit of a no-brainer. Technology should be driven by educational need, or as Pearson put it:

“Products and systems that effectively leverage learning design can deliver superior learning outcomes.”

One example in the paper referred to work they had done in social science classes:

“we quickly recognized that students were easily distracted by conventional textbooks. This told us we needed to eliminate distractions: any extraneous cognitive load that doesn’t promote learning. Fortunately, our learning design work reveals many proven techniques for accomplishing this. REVEL segments all content into manageable pieces and presents it via a consistent structure. It provides strong signaling cues to highlight key material and places all relevant content on screen simultaneously to offer a continuous, uninterrupted experience”

Which kind of relates to this point from the Wiley post:

“Our fixation on discovery and assembly also distracts us from other serious platform needs – like platforms for the collaborative development of OER and open assessments (assessments are the lifeblood of this new generation of platforms), where faculty and students can work together to create and update the core materials that support learning in our institutions. Our work in OER will never be truly sustainable until faculty and students jointly own this process, and that can’t happen until a new category of tools emerges that enables and supports this critical work. (Grant money for OER creation won’t last forever.)

And don’t even start trying to explain how the LMS is the answer. Just don’t. “

Well of course Pearson do try to explain that:

“As testing progresses, we can overcome problems that compromise outcomes and build a strong case that our design will support learning. The very same work also helps us tightly define assessments to find out if the product works in real classrooms”

Of course they don’t really touch on the OER aspect (all their learning design stuff has been made available with CC goodness) but I’ll come back to that.

That phrase “if the product works” – I keep coming back to that. So on the one hand I have to be pleased that Pearson are recognising learning design. I have no argument with their core principles; I agree with them all. But I am still left with a niggle around the assumption that the platform will “do” all the learning design for both staff and students. That underlying assumption that if only we had the right platform all would be well: everything could be personalised through data and analytics, and we’d have no retention issues. That niggles me.

I was part of a plenary panel at the HESPA conference last week called “the future of learner analytics”, where a number of these issues came up again. The questions asked by this group of educational planners really stimulated a lot of debate. On reflection I was maybe a bit of a broken record. I kept coming back not to platforms but to people and, more importantly, time. We really need to give our staff and students (but particularly our staff) time to engage with learning analytics. Alongside the technical infrastructure for learning analytics, we need to be asking: where’s the CPD planning for analytics? They need to go hand in hand. Cathy Gunn, Jenny McDonald and John Milne’s excellent paper “the missing link for learning from analytics” sums this up perfectly:

“there is a pressing need to add professional development and strategies to engage teachers to the growing range of learning analytics initiatives. If these areas are not addressed, adoption of the quality systems and tools that are currently available or under development may remain in the domain of the researchers and data analysis experts”

There seems to be an assumption that personalisation of learning is a “good thing”, but is it? Going back to learning design, designing engaging learning activities is probably more worthwhile, and ultimately more useful to students and society, than trying to create homogenised, personalised, chunked-up content and assessments. Designing to create more effective engagement with assessment and feedback is, imho, always going to be more effective than trying to design the perfect assessment platform.

In terms of assessment, early last week I was also at a Scotbug (our regional Blackboard user group) meeting, where my group had to design an assessment system. This is what we came up with – the flipped assessment, aka student-generated assessments.

[Image: our sketch of the flipped assessment – student-generated assessments]

Not new, but based on pedagogy and technology that is already in use (NB there’s been a really great discussion around some of this on the ALT list this weekend). I don’t think we need any new platforms for this type of approach to assessment and feedback – but we do need to think about learning design (which encapsulates assessment design) more, and give more time for CPD so that staff can engage more with the design process and the technologies they either have to use or want to use. This of course all relates to digital capability and capacity building.

So whilst we’re thinking about next gen platforms and learning environments, please let’s not forget people. Let’s keep pressing for time for staff CPD to allow the culture shifts to happen around understanding the value of OER, of sharing, of taking time to engage with learning design, and not just having to tweak modules when there’s a bit of down time.

People are the most important part of any learning environment – next gen, this gen, past gen. But people need time to evolve too; we can’t forget them or try to design out the need for them for successful learning and teaching to take place. Ultimately it’s people that will make the product work.

Where Sheila's been this week – co-designing next gen learning environments with Jisc

You may or may not be aware of Jisc’s current co-design consultation exercise with the HE/FE sector.  The co-design approach is a way to try and ensure that Jisc developments are supportive and representative of the needs of the sector.   Building on feedback from the first iteration of the process, this time around there has been a concerted effort to get wider sectoral involvement in the process through various methods, including social media, blog posts, tweet chats and voting.

Yesterday, along with about 30 others, I attended a face to face meeting to explore, review and discuss the results of the process and feedback on the six “big” challenges identified by Jisc.

  1. What does the imminent arrival of the intelligent campus mean for universities and colleges?
  2. What should the next generation of digital learning environments do?
  3. What should a next-generation research environment look like?
  4. Which skills do people need to prepare for research practice now and in the future?
  5. What would truly digital apprenticeships look like?
  6. How can we use data to improve teaching and learning?

You can see the results of the voting here too.

The voting process does need some refinement, as Andy McGregor was clear to point out, and we really used it and the comments as a guide for the discussions. Personally I found the voting process a bit cumbersome – having to fill out a Google doc for each one. I can see why Jisc wanted to get all that information, but I would have preferred something a bit more instant, with the option of giving more detailed information. That might have encouraged me to cast more than one vote . . .

I joined the next generation learning environments discussion. I had been quite taken with the pop-up VLE notion, but as the discussion evolved it became clearer to me that the idea articulated so well by Simon Thomson (Leeds Beckett Uni) of connecting institutional and user-owned tech was actually a much stronger proposition, and in a way the pop-up VLE would fall out of that.

The concept really builds on the way that IFTTT (if this, then that) works, but with a focus on connecting institutional systems to personal ones. Please, please read Simon’s post as it explains the rationale so clearly. I use IFTTT and love the simplicity of being able to connect my various online spaces and tools, and extending that into institutional systems seems almost a no-brainer.
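To make that a bit more concrete, here’s a rough sketch of the basic “if this, then that” recipe pattern applied to connecting a personal tool to an institutional one. It’s purely illustrative – the function names, data and services are made-up placeholders, not real APIs – but it shows how thin the glue could be.

```python
# A minimal "if this, then that" recipe sketch: a trigger watches a personal
# tool, an action pushes new items into an institutional one. All names and
# data here are hypothetical placeholders, not real services or APIs.

from dataclasses import dataclass
from typing import Callable, Iterable


@dataclass
class Recipe:
    name: str
    trigger: Callable[[], Iterable[dict]]  # "if this": yields any new items
    action: Callable[[dict], None]         # "then that": pushes each item on


def check_personal_notes() -> Iterable[dict]:
    """Hypothetical trigger: notes tagged 'share-to-vle' in a personal notebook."""
    # In practice this would poll, or react to a webhook from, the personal tool.
    return [{"title": "Week 3 reflection", "body": "My notes on the group task..."}]


def post_to_vle_forum(item: dict) -> None:
    """Hypothetical action: post a note into a module discussion forum."""
    print(f"Posting '{item['title']}' to the module discussion forum")


def run(recipes: list) -> None:
    # A real connector would run on a schedule or on webhooks; one pass will do here.
    for recipe in recipes:
        for item in recipe.trigger():
            recipe.action(item)


run([Recipe("personal notes -> VLE forum", check_personal_notes, post_to_vle_forum)])
```

The trigger lives in the user’s space, the action lives in the institution’s space, and the recipe is just the pairing of the two – which is really all a “connected” environment in this sense needs to be.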

We talked about space a lot in our discussion: personal space, institutional space and so on (Dave White has a good post on spaces which relates to this). For both staff and students it can be quite complex to manage, interact in and understand these spaces.

We (teachers, support staff, IT, the institution) are often a bit obsessed with controlling spaces. We do need to ensure safety and duty of care issues are dealt with, but activity around learning doesn’t always need to take place in our spaces, e.g. the VLE. Equally, we (staff) shouldn’t feel that we have to be in all the spaces where students may be talking about learning. If students want to discuss their group activity on Snapchat, WhatsApp or Facebook, then let them. They can manage their interaction in those spaces. What we need to be clear on is the learning activity, the key interactions, the expectations of outputs, and which spaces the learning activities/outputs need to be in. The more connected approach advocated by Simon could allow greater ease of connection between spaces for both staff and students.

Providing this type of architecture (basically building and sharing more open APIs) is not trying to replace a VLE, portfolio system etc, but actually allowing for greater choice around engagement in, and sharing of, learning activity. If I like writing in Evernote (as I do), why can’t I just post something directly into a discussion forum in our VLE? Similarly, if our students have access to OneNote and are using it (as ours do), why can’t they choose to share their work directly into the VLE? Why can’t I have my module reading lists easily saved into a Google doc?
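Just to illustrate what that could feel like, here’s a small hypothetical sketch of the Evernote-to-VLE-forum case, assuming both ends exposed simple open REST APIs. The URLs, tokens and JSON shapes are invented for the example – they are not the actual Evernote or Blackboard/VLE APIs.

```python
# A sketch of "post from my note-taking tool straight into a VLE forum",
# assuming both systems exposed simple open APIs. Endpoints, tokens and
# payload shapes below are invented for illustration only.

import requests

NOTES_API = "https://notes.example.com/api/notes?tag=share-to-vle"          # hypothetical
VLE_FORUM_API = "https://vle.example.ac.uk/api/modules/EDU101/forum/posts"  # hypothetical


def share_tagged_notes(personal_token: str, vle_token: str) -> None:
    # 1. Pull any notes the user has tagged for sharing from their personal space.
    notes = requests.get(
        NOTES_API, headers={"Authorization": f"Bearer {personal_token}"}
    ).json()

    # 2. Push each one into the institutional space the module actually uses.
    for note in notes:
        requests.post(
            VLE_FORUM_API,
            headers={"Authorization": f"Bearer {vle_token}"},
            json={"subject": note["title"], "body": note["body"]},
        )

# share_tagged_notes(my_personal_token, my_vle_token)
```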

This isn’t trying to replace integrations such as LTI, or building blocks that bring systems/products into systems. This is much more about personalisation and user choice around notifications, connections and sharing into systems that you (need to) use. It’s lightweight, not recreating any wheels but just allowing more choice.

So at a university level you could have a set of basic connections (recipes) illustrating how students and staff, and indeed the wider community, could interact with institutionally provided systems, and then staff/students decide which ones (if any) they want to use, or create their own. Ultimately it’s all about user choice: if you don’t want to connect that way then you don’t have to.
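In practice the institutional offer might be little more than a published catalogue of recipes, with the choosing left entirely to the individual. A tiny sketch of what that could look like (the recipe names and systems are just illustrative examples):

```python
# A hypothetical institution-level "recipe catalogue": the university publishes
# ready-made connections, and each member of staff or student simply switches
# on the ones they want - or none at all.

CATALOGUE = [
    {"id": "notes-to-forum",   "if": "note tagged #share",       "then": "post to module forum"},
    {"id": "onenote-to-vle",   "if": "OneNote page marked done", "then": "share to VLE assignment"},
    {"id": "reading-list-doc", "if": "reading list updated",     "then": "append to my Google doc"},
]


def active_recipes(user_choices: set) -> list:
    """Return only the connections this user has chosen to switch on."""
    return [r for r in CATALOGUE if r["id"] in user_choices]


# One person might opt in to a single recipe; another to none at all.
print(active_recipes({"notes-to-forum"}))
print(active_recipes(set()))
```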

As well as helping to focus on actual learning activity, I would hope that this approach would help institutions to think about their core institutional provision, and the “stuff that’s out there and we all are using” – aka BYOD. It would also hopefully allow for greater ease of experimentation without having to get system admin support to try something out in the VLE.

I would hope this would also help extend support for, and understanding of, the need for non-monolithic systems, and get edtech vendors to build more flexible interaction/integration points.

Anyway, hopefully there will be more soon from Jisc, and Simon actually has some funding to try and build a small prototype based on this idea. Jisc will also be sharing the next steps from all the ideas over the coming weeks. Hopefully this idea is simple and agile enough to get into the Jisc R&D pipeline.

[Image: Jisc R&D Pipeline]