Reflection on CPD, Recognition of Prior Informal Learning, digital presence and (Bb) portfolios

(Image: reflections on the River Clyde from my flickr collection)

I have just finished writing a reflective portfolio as part of our Accelerate CPD framework for accreditation as Senior Fellow of the Higher Education Academy. The experience has, as they say, been quite a ride.

Having the opportunity to take this portfolio-based approach is, imho, really useful, particularly for someone like me who has been around for quite a bit, doing lots of “stuff”, but who hasn’t had much conventional learning and teaching experience. In fact, until I started at GCU I had never even considered applying for recognition by the HEA. As I wasn’t actually teaching, or a “proper academic”, I just presumed I wouldn’t be able to.

Luckily for me, our framework has a number of routes to recognition, including Recognition of Prior Informal Learning (RPiL). The structure of the portfolio route here involves working with a dedicated mentor to help develop two case studies and a reflective summary mapped to the UKPSF. My mentor was my GCU LEAD colleague Sam Ellis. Sam’s support and guidance throughout has been fantastic: from explaining the stages involved and the dreaded framework mapping, to teasing out areas suitable for case studies, to just keeping me on track, he has been a constant source of calm and reassurance.

Structured self-reflection is, I think, one of those painful things: you know it is good for you, but you’re always too busy doing other things to actually do it. Perhaps it was because I was aware that, in starting a new part of my career, I really needed to validate what I had already done, but for me the RPiL route was actually very welcome, though daunting. I have been involved in so many programmes, projects and standards malarkey that it was difficult to think what I could possibly turn into two coherent, relevant case studies. However, I did manage to. Over the past 8 months or so I have been doing a lot of thinking about them and eventually actually writing, rewriting and mapping them to the UKPSF.

A crucial part of the mapping process is evidence. This is where I really feel that my self-described unconventional career, and in particular my blogging and open, reflective practice, has paid off. It was easy for me to look back and find blog posts, presentations, meeting notes and lots of shared reflection. These not only acted as evidence but also triggered my memory about events/thoughts/experiences. My digital presence really paid off. In fact, in classic displacement mode, before writing one of my case studies I created a timeline for it that linked to loads of posts and presentations. What the evaluation panel makes of it all remains to be seen . . .

Over the past couple of days I have been putting all my “evidence” into a portfolio in our VLE. Perhaps I was slightly blasé about this bit: the hard bits had been done, so surely it was just a bit of a cut and paste job. Well, to an extent it was. However, this is where I have to have a little rant and moan about some really simple things that were just so frustrating.

Our VLE is Blackboard and, to start on a positive note, there have been major improvements to its portfolio functionality over the last couple of years. Setting up a basic portfolio is pretty straightforward, as is creating templates. Sharing a snapshot (a viewable, non-editable version) of the portfolio with anyone via email works a treat. The majority of my evidence was hyperlinked, so there was no need to worry about creating “artefacts” and storing them (that’s for another post).

Hyperlinking in the Bb text editor is a wee bit clunky but fairly straightforward. The most frustrating thing for me was formatting: there seemed to be far too much random, rogue formatting weirdness. Fonts and line spacing would apparently change at random within pages, and using the text editor to reformat didn’t fix it, which is, to put it mildly, somewhat frustrating. It’s more of a what-you-see-you-don’t-get scenario, which kind of defeats the purpose of WYSIWYG editors. Looking at the Bb HTML editor to try and figure out what is going on HTML-wise is pretty scary, even for a tough old bird like me.

Now, I know that a lot of this weirdness has probably been caused by my cutting and pasting (in this case from a Google Doc). However, in terms of actual use, I think there is a very strong use case for drafts of extended pieces of writing, particularly reflective pieces such as case studies, happening outside the VLE in a shareable document of the learner’s choice (e.g. Google Docs, Evernote, OneDrive), with the text then being pasted into a portfolio structure.

Ah, but, I hear you say, you could just upload a beautifully formatted Word/PDF document. Well yes, but the whole point of an online portfolio, particularly a reflective one such as this, is that the reader (evaluator) doesn’t have to open multiple other documents; they can just read sections within the portfolio structure. Any additional documents should be evidence, not the main body of the portfolio.

I am lucky in that I know a bit of HTML and wasn’t afraid to use that little bit of knowledge, along with an HTML cleaner site, to get things appearing as they should. But it took ages . . . and it’s those little things that lead everyone to say “I hate Blackboard” and forget about some of the things that actually do work well. So come on Bb, don’t just wait for the new design, let’s sort out the text editor in the version(s) we are all using just now.
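For what it’s worth, if you want to avoid the HTML cleaner dance altogether, the rogue formatting can usually be stripped before you ever paste. Here’s a minimal sketch (Python with BeautifulSoup, both my own choices rather than anything Blackboard or Google provides) of the kind of clean-up I mean:

```python
# A rough clean-up sketch, not a Blackboard feature: strip the inline styling that
# Google Docs tends to carry along when you copy and paste into the Bb text editor.
from bs4 import BeautifulSoup

def clean_pasted_html(raw_html: str) -> str:
    soup = BeautifulSoup(raw_html, "html.parser")

    # Drop the attributes that carry random fonts, sizes and line spacing
    for tag in soup.find_all(True):
        for attr in ("style", "class", "face", "lang", "dir"):
            tag.attrs.pop(attr, None)

    # <span> and <font> wrappers add nothing once their styling has gone
    for tag in soup.find_all(["span", "font"]):
        tag.unwrap()

    # Remove any paragraphs left completely empty
    for p in soup.find_all("p"):
        if not p.get_text(strip=True):
            p.decompose()

    return str(soup)

if __name__ == "__main__":
    with open("pasted.html") as f:
        print(clean_pasted_html(f.read()))
```

Pasting the cleaned output into the Bb HTML view should, in principle, leave far fewer surprises than pasting straight from the doc, though I make no promises about what the editor does to it afterwards.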


Shared services in HE – what really matters to you?

Last week I attended the Jisc Learning and Teaching Practice Experts Group meeting in Birmingham. As ever this was a really well organised, informed, informing, collaborative experience. It was the 31st meeting and there really was a sense of community at the event. You can get a feeling of this from the tweets from the day.

(Image: Tags explorer view of #jiscexperts14)

Sarah Davies, Head of Change Implementation Support for Education/Student, started the day by giving an update on the changes to Jisc and how it has been refocusing its activities in light of the Wilson review to achieve large-scale impact based on sector-driven priorities. Sarah’s slides give a good overview. Part of this involves developing new areas of impact, co-design methodologies with Jisc’s core community, and regular reviews to ensure programmes/projects are “tightly steered and gated” whilst at the same time allowing Jisc to be agile and try “new things”. The inevitable “21st century challenge” for most organisations.

(Image: Jisc Strategic Framework Impact Areas)

Shared services are not new to Jisc, and they are still very much at the heart of its outputs and deliverables. Just what constitutes a service can be a bit of a fuzzy area. Traditionally, in IT and Jisc terms, shared services have focused on technical infrastructure. Being able to share development costs across the sector is of course a “good thing”, and long may it continue. As we all know, though, technical services can’t work in isolation; people and processes are what make any platform successful. This is where the other side of the shared services that Jisc provides, such as information, guidance and synthesis of practice from programmes, comes into play.

Sarah asked if we could share examples of where/how we had used the guidance/information services provided by Jisc. Since starting at GCU almost six months ago I can honestly say that I refer to Jisc “stuff” almost daily. I realise I could be perceived as being a bit biased, having worked for a Jisc-funded innovation centre for many years. However, as you know, dear reader, I wouldn’t recommend anything if I didn’t believe it was useful.

Just now we are looking at portfolio provision, and Jisc resources have been invaluable as a trusted source of definitions, as well as examples of practice to share with colleagues. I can’t accurately quantify how much time they have saved me, but I know we have been able to pull things together much faster than if I had had to search for trusted information. Similarly, we are developing guidelines for e-submission processes, so the work from the recent Assessment projects and the new briefing papers on EMA are very timely for us. I think they will save us at least a couple of weeks of research and, I can’t emphasise the importance of this enough, are based on current experience within the UK HE sector.

The Learning and Teaching Practice Experts Group is a key example of a shared service and an effective way for Jisc to engage with its core community as it starts to realise its new strategic framework. It’s increasingly one that matters to me and my working practice.

Learning Analytics for Assessment and Feedback Webinar, 15 May

**update 16 May**
Link to session recording

Later this week I’ll be chairing a (free) webinar on Learning Analytics for Assessment and Feedback, featuring work from three projects in the current Jisc Assessment and Feedback Programme. I’m really looking forward to hearing first hand about the different approaches being developed across the programme.

“The concept of learning analytics is gaining traction in education as an approach to using learner data to gain insights into different trends and patterns but also to inform timely and appropriate support interventions. This webinar will explore a number of different approaches to integrating learning analytics into the context of assessment and feedback design; from overall assessment patterns and VLE usage in an institution, to creating student facing workshops, to developing principles for dashboards.”

The presentations will feature current thinking and approaches from teams from the following projects:
*TRAFFIC, Manchester Metropolitan University
*EBEAM, University of Huddersfield,
*iTeam, University of Hertfordshire

The webinar takes place Wednesday 15 May at 1pm (UK time) and is free to attend. A recording will also be available after the session. You can register by following this link.

Acting on Assessment Analytics – new case study

Despite the hype around it, getting started with learning analytics can be a challenge for most everyday lecturers. What can you actually do with data once you get it? As more “everyday” systems (in particular online assessment tools) are able to provide data and/or customised reports, it is getting easier to start applying and using analytics approaches in teaching and learning.  

The next case study in our Analytics series focuses on the work of Dr Cath Ellis and colleagues at the University of Huddersfield. It illustrates how they are acting on the data from their e-submission system, not only to enhance and refine their feedback to students, but also to help improve their approaches to assessment and overall curriculum design.  
 
At the analytics session at #cetis13 Ranjit Sidhu pointed out that local data can be much more interesting and useful than big data. This certainly rings true for teaching and learning. Using very local data, Cath and her colleagues are developing a workshop approach to sharing generic assessment data with students in a controlled and emotionally secure environment. The case study also highlights issues around data handling skills and the need for more evidence of successful interventions through using analytics.
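The case study itself doesn’t include any code, but just to give a flavour of what “very local” data can mean in practice, here is a minimal sketch of my own (the CSV layout and the “mark” column name are assumptions on my part, not taken from the Huddersfield work or any particular e-submission system) that turns an anonymised marks export into the kind of cohort-level summary you could share in a workshop:

```python
# My own illustrative sketch: summarise an anonymised marks export for sharing with a class.
# The CSV layout and the "mark" column name are assumptions, not from the case study.
import csv
from statistics import mean, median, quantiles

def cohort_summary(path: str) -> dict:
    with open(path, newline="") as f:
        marks = [float(row["mark"]) for row in csv.DictReader(f)]
    q1, q2, q3 = quantiles(marks, n=4)  # quartile cut points (Python 3.8+)
    bands = {"<40": (0, 40), "40-59": (40, 60), "60-69": (60, 70), "70+": (70, 101)}
    return {
        "students": len(marks),
        "mean": round(mean(marks), 1),
        "median": median(marks),
        "quartiles": tuple(round(q, 1) for q in (q1, q2, q3)),
        # how many marks fall into each broad band: the sort of thing students can
        # place themselves against without anyone else being identifiable
        "band_counts": {b: sum(lo <= m < hi for m in marks) for b, (lo, hi) in bands.items()},
    }

if __name__ == "__main__":
    print(cohort_summary("marks.csv"))
```

The point, of course, is less about the numbers themselves than about how they are shared: in a structured, emotionally secure workshop rather than as a raw spreadsheet dump.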

You can access the full case study here.

We are always looking for potential case studies to add to our collection, so if you are doing some learning analytics related work and would be willing to share your experiences in this way, then please get in touch.

Bye bye #edcmooc

So #edcmooc is now over, our digital artefacts have been submitted and reviewed and we all now move on.

I thought it would be useful to reflect on the final submission and peer review process as I have questioned how that would actually work in a couple of earlier posts. The final submission for the course was to create a digital artefact which would be peer reviewed.

The main criteria for creating the artefact were:

* it will contain a mixture of two or more of: text, image, sound, video, links.
* it will be easy to access and view online.
* it will be stable enough to be assessed for at least two weeks.

We had to submit a URL via the Coursera LMS and then we were each assigned 3 other artefacts to assess. You had the option to assess more if you wished. The assessment criteria were as follows:

1. The artefact addresses one or more themes for the course
2. The artefact suggests that the author understands at least one key concept from the course
3. The artefact has something to say about digital education
4. The choice of media is appropriate for the message
5. The artefact stimulates a reaction in you, as its audience, e.g. emotion, thinking, action

You will assign a score to each digital artefact:

0 = does not achieve this, or achieves it only minimally
1 = achieves this in part
2 = achieves this fully or almost fully
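The course material doesn’t spell out how the three reviewers’ scores are then combined, so purely as an illustration (a sketch assuming a simple median per criterion, which is my guess rather than Coursera’s documented method), the aggregation might look something like this:

```python
# Illustrative only: the 0-2 scale and the five criteria come from the course,
# but taking the median of the three reviewers is my assumption, not Coursera's method.
from statistics import median

CRITERIA = [
    "addresses course themes",
    "understands a key concept",
    "says something about digital education",
    "appropriate choice of media",
    "stimulates a reaction",
]

def aggregate(reviews):
    """reviews: one list of five 0-2 scores per reviewer."""
    return {
        criterion: int(median(scores))
        for criterion, scores in zip(CRITERIA, zip(*reviews))
    }

# e.g. three reviewers, five criteria each
print(aggregate([[2, 2, 2, 2, 1], [2, 1, 2, 2, 2], [2, 2, 2, 2, 2]]))
```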

This is the first time I’ve done peer review and it was a very interesting process. In terms of the electronic process, the system made things very straightforward, and there was time to review draft submissions before submitting. I’m presuming that artefacts were allocated on a random basis too. On reflection the peer process was maybe on the “lite” side, but given the scope and scale of this course I think that is entirely appropriate.

My three allocated artefacts were really diverse in style, content and substance. Whilst reviewing I did indeed reflect back on what I had done, and wished I had the imagination and time of some of my peers. I could have spent hours going through more, but I had to stop myself. Overall I am still satisfied with my submission, which you can explore below or follow this link.

2/2 all round for me and some very positive comments from my peers, so thank you – although as one of my reviewers did point out I maybe did push the time limits a bit far:

“The choice of the media is also apt but I guess the only little drawback is that the artifact far exceeds the guidelines on how big the artifact should be (actually it’s a gist of the entire course and not a little five-minute artifact!). “

Overall I really enjoyed #edcmooc. It made me think about things from different perspectives, as well as confirming some of my personal stances on technology in education. It was well paced and I liked that it used openly available content where possible. Now that I’m a bit more experienced at MOOC-ing, it didn’t take up too much of my time. The course team made some subtle adjustments to the content and instruction over the duration, which again was entirely appropriate and showed they were listening, if not talking, to everyone. I didn’t feel a lack of tutor contact, but then again I didn’t interact in the discussion spaces as much as I could have, and this is also a topic area where I was relatively comfortable exploring at my own pace.

It’s also been quite a counter balance to the #oldsmooc course I’m also doing (which started before #edcmooc and finishes next week), but I’ll share more about that in another post.

Also feel free to assess my artefact and share your comments here too using the criteria above.

**Update** I’ve just received an email from the course team. Apparently the process didn’t work as smoothly for some as it did for me. They are investigating and encouraging people who couldn’t share their artefacts to use the course forums. Hopefully this will get sorted soon.

eAssessment Scotland – focus on feedback

Professor David Boud got this year’s eAssessment Scotland Conference off to a great start with his “new conceptions of feedback and how they might be put into practice” keynote presentation by asking the fundamental question “what is feedback?”

David’s talk centred on what he referred to as the “three generations of feedback”, and was a persuasive call to arms to educators to move from the “single loop” or “control system” industrial model of feedback to a more open, adaptive system where learners play a central and active role.

In this model, the role of feedback changes from being passive to one which helps students develop their own judgement, standards and criteria: capabilities which are key to success outside formal education too. The next stage from this is to create feedback loops which are pedagogically driven and considered from the start of any course design process. Feedback becomes part of the whole learning experience and not just something vaguely related to assessment.

In terms of technology, David gave a familiar warning that we shouldn’t let digital systems simply allow us to do more “bad feedback more efficiently”. There is a growing body of research around developing the types of feedback loops David was referring to. Indeed, the current JISC Assessment and Feedback Programme is looking at exactly the issues brought up in the keynote, and is based on the outcomes of previously funded projects such as REAP and PEER. And the presentation from the interACT project, which I went to immediately after the keynote, gave an excellent overview of how JISC funding is allowing the Centre for Medical Education in Dundee to re-engineer its assessment and feedback systems to “improve self, peer and tutor dialogic feedback”.

During the presentation the team illustrated the changes to their assessment/curriculum design using an assessment timeline model developed as part of another JISC funded project, ESCAPE, by Mark Russell and colleagues at the University of Hertfordshire.

Lisa Gray, programme manager for the Assessment and Feedback programme, then gave an overview of the programme, including a summary of the baseline synthesis report, which gives a really useful summary of the issues the projects (and the rest of the sector) are facing in terms of changing attitudes, policy and practice in relation to assessment and feedback. These include:
*formal strategy/policy documents lagging behind current development
*educational principles rarely enshrined in strategy/policy
*learners not often actively engaged in developing practice
*assessment and feedback practice not reflecting the reality of working life
*admin staff often left out of the dialogue
*traditional forms of assessment still dominating
*timeliness of feedback still an issue.

More information on the programme and JISC’s work in the assessment domain is available here.

During the lunch break I was press-ganged/invited to take part in the live edutalk radio show being broadcast during the conference. I was fortunate to be part of a conversation with Colin Maxwell (@camaxwell), lecturer at Carnegie College, where we discussed MOOCs (see Colin’s conference presentation) and feedback. As the discussion progressed we talked about the different levels of feedback in MOOCs. Given the “massive” element of MOOCs, how and where does effective feedback and engagement take place? What are the affordances of formal and informal feedback? As I found during my recent experience with the #moocmooc course, social networks (and in particular twitter) can be equally heartening and disheartening.

I’ve also been thinking more about the subsequent twitter analysis Martin has done of the #moocmooc twitter archive. On the one hand, I think these network maps of twitter conversations are fascinating and allow the surfacing of conversations, potential feedback opportunities etc. But, on the other, they only surface the loudest participants – who are probably the most engaged, self-directed etc. What about the quiet participants, the lost souls, the ones most likely to drop out? In a massive course, does anyone really care?
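Just to be clear, this isn’t Martin’s analysis, but as a rough sketch of how you might flip that kind of archive around to look for the quiet participants rather than the loud ones (assuming a TAGS-style CSV export with “from_user” and “text” columns, which is an assumption on my part):

```python
# A rough sketch, not Martin's analysis: find the least-connected people in a hashtag
# archive rather than the loudest ones. Assumes a TAGS-style CSV with "from_user"
# and "text" columns.
import csv
import re
from collections import Counter

import networkx as nx

MENTION = re.compile(r"@(\w+)")

def quiet_participants(path, bottom=10):
    graph = nx.DiGraph()
    tweet_counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            author = row["from_user"].lower()
            tweet_counts[author] += 1
            graph.add_node(author)
            for mentioned in MENTION.findall(row["text"]):
                graph.add_edge(author, mentioned.lower())  # author -> person mentioned
    # "quiet" here means: tweeted at least once, but with the fewest connections
    pairs = [(user, graph.degree(user)) for user in tweet_counts]
    return sorted(pairs, key=lambda p: (p[1], tweet_counts[p[0]]))[:bottom]

if __name__ == "__main__":
    for user, degree in quiet_participants("moocmooc_archive.csv"):
        print(f"{user}: {degree} connections")
```

Of course, even that only finds the people who tweeted at all; the genuinely lost souls leave no trace in the archive, which is rather the point.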

Recent reports of plagiarism, and failed attempts at peer assessment in some MOOCs, have added to the debate about the effectiveness of MOOCs. But going back to David Boud’s keynote, isn’t this because some courses are taking his feedback mark 1, industrial model, and trying to pass it off as feedback mark 2 without actually explaining and engaging with students from the start of the course, and really thinking through the actual implications of thousands of globally distributed students marking each other’s work?

All in all it was a very thought-provoking day, with two other excellent keynotes: Russell Stannard sharing his experiences of using screen capture to provide feedback, and Cristina Costa on her experiences of network feedback and feeding forward. You can catch up on all the presentations and join in the online conference, which is running for the rest of this week, at the conference website.

Making assessment count, e-Reflect SUM released

Gunter Saunders and his team on the Making Assessment Count project (part of the current JISC Curriculum Delivery programme) have just released a SUM (service usage model) describing the process they have introduced to engage students (and staff) in the assessment process.

“The SUM presents a three stage framework for feedback to students on coursework. The SUM can act to guide both students and staff in the feedback process, potentially helping to ensure that both groups of stakeholders view feedback and its use as a structured process centred around reflection and discussion and leading to action and development.”

You can access the e-Reflect SUM here.

Assessment technologies in use in the Curriculum Delivery Programme

Developing practice around assessment is central to a number of the Curriculum Delivery projects. There has been an emphasis on improving feedback methods and processes, with a mixture of dedicated formal assessment tools (such as Turnitin) and more generic tools (such as Excel, Google Forms, and adapted Moodle modules) being used. The latter often proved a simple and effective way to trial new pedagogic methodologies, without the need for investment in dedicated software.

Excel
*Ebiolabs (Excel macros embedded into Moodle for marking)
*ESCAPE (WATS – weekly assessment tutorial sheets, again used for submission, also generates a weekly league table)

EVS
*ESCAPE

Turnitin
*Making the new diploma a success
*Integrative Technologies Project

Moodle
*Cascade (submission extension)

ARS
*Integrative Technologies Project

Google forms
*Making Assessment Count

IMS QTI
None of the projects have actually implemented IMS QTI. The ESCAPE project did highlight it in their project plan, but didn’t actually need to use the specification for the work they undertook.

More information on the projects can be found by following the specific links in the text. More detailed information about the technological approaches is also available from our PROD database. Specific assessment resources (including case studies) are also being made available through the Design Studio.
