This is probably going to be an even more incoherent ramble than normal, but I have been trying to write posts around a number of things for the last couple of weeks, so I'm going to try and merge them here.
A couple of weeks ago, I read this post by David Wiley. At the time I tweeted:
need to consider this post more but I am still uneasy about focus on platforms https://t.co/ldfOeEFFm2
— Sheila MacNeill (@sheilmcn) January 26, 2017
I confess to more than a bit of this sentiment, and not just in relation to OER: "Much of the OER movement has a bad attitude about platforms." I am always wary when the focus is on developing platforms and not on developing the people who will use those platforms.
I was once in a meeting where I put forward the "people and process, not platforms and products" case. I was told that what was being discussed was platform "in the Californian sense of platform". . . I'm sure a classic WTF look must have passed over my face, but it was explained that this meant people as well as technology. Geography aside, three years later this sense of platform doesn't seem to be that widespread or acknowledged. Maybe I need to go to California. But I digress.
Not long before the Wiley post I was reading the Pearson White Paper on learning design. It caused me a bit of unease too. Part of me was delighted to see learning design being recognised by a significant player (whatever might happen to them) in the education technology provider field. Using learning design to inform product design is a bit of a no-brainer. Technology should be driven by educational need, or as Pearson put it:
“Products and systems that effectively leverage learning design can deliver superior learning outcomes.”
One example in the paper referred to work they had done in social science classes:
“we quickly recognized that students were easily distracted by conventional textbooks. This told us we needed to eliminate distractions: any extraneous cognitive load that doesn’t promote learning. Fortunately, our learning design work reveals many proven techniques for accomplishing this. REVEL segments all content into manageable pieces and presents it via a consistent structure. It provides strong signaling cues to highlight key material and places all relevant content on screen simultaneously to offer a continuous, uninterrupted experience”
Which kind of relates to this point from the Wiley post:
“Our fixation on discovery and assembly also distracts us from other serious platform needs – like platforms for the collaborative development of OER and open assessments (assessments are the lifeblood of this new generation of platforms), where faculty and students can work together to create and update the core materials that support learning in our institutions. Our work in OER will never be truly sustainable until faculty and students jointly own this process, and that can’t happen until a new category of tools emerges that enables and supports this critical work. (Grant money for OER creation won’t last forever.)
And don’t even start trying to explain how the LMS is the answer. Just don’t. “
Well, of course, Pearson do try to explain exactly that:
“As testing progresses, we can overcome problems that compromise outcomes and build a strong case that our design will support learning. The very same work also helps us tightly define assessments to find out if the product works in real classrooms”
Of course they don't really touch on the OER aspect (though all their learning design material has been made available with CC goodness), but I'll come back to that.
That phrase "if the product works" – I keep coming back to that. So on the one hand I have to be pleased that Pearson are recognising learning design. I have no argument with their core principles; I agree with them all. But I am still left with a niggle around the assumption that the platform will "do" all the learning design for both staff and students. That underlying assumption that if only we had the right platform all would be well: everything could be personalised through data and analytics, and we'd have no retention issues. That niggles me.
I was part of a plenary panel at the HESPA conference last week called "the future of learner analytics", where a number of these issues came up again. The questions asked by this group of educational planners really stimulated a lot of debate. On reflection, I was maybe a bit of a broken record. I kept coming back not to platforms but to people and, more importantly, time. We really need to give our staff and students (but particularly our staff) time to engage with learning analytics. Alongside the technical infrastructure for learning analytics, we need to be asking: where's the CPD planning for analytics? They need to go hand in hand. Cathy Gunn, Jenny McDonald and John Milne's excellent paper "the missing link for learning from analytics" sums this up perfectly:
"there is a pressing need to add professional development and strategies to engage teachers to the growing range of learning analytics initiatives. If these areas are not addressed, adoption of the quality systems and tools that are currently available or under development may remain in the domain of the researchers and data analysis experts"
There seems to be an assumption that personalisation of learning is a "good thing" – but is it? Going back to learning design: designing engaging learning activities is probably more worthwhile, and ultimately more useful to students and society, than trying to create homogenised, personalised, chunked-up content and assessments. Designing for more effective engagement with assessment and feedback is, imho, always going to beat trying to design the perfect assessment platform.
In terms of assessment: early last week I was also at a Scotbug (our regional Blackboard user group) meeting, where my group was tasked with designing an assessment system. This is what we came up with – the flipped assessment, aka student-generated assessments.
Not new, but based on pedagogy and technology that is already in use (NB there's been a really great discussion around some of this on the ALT list this weekend). I don't think we need any new platforms for this type of approach to assessment and feedback – but we do need to think about learning design (which encapsulates assessment design) more, and give staff more time for CPD so they can engage more with the design process and with the technologies they either have to use or want to use. This, of course, all relates to digital capability and capacity building.
So whilst we're thinking about next gen platforms and learning environments, please let's not forget people. Let's keep pressing for time for staff CPD, to allow the culture shifts to happen around understanding the value of OER, of sharing, of taking time to engage with learning design – and not just having to tweak modules when there's a bit of down time.
People are the most important part of any learning environment – next gen, this gen, past gen. But people need time to evolve too; we can't forget them, or try to design out the need for them, if successful learning and teaching is to take place. Ultimately, it's people that will make the product work.