Design Studio update: Transforming Assessment and Feedback

Those of you who regularly read this blog will (hopefully) have noticed lots of mentions of, and links to, the Design Studio. Originally built as a place to share outputs from the JISC Curriculum Design and Delivery Programmes, it is now being extended to include outputs from a number of other JISC-funded programmes.

The Transforming Assessment and Feedback area of the Design Studio now has a series of pages forming a hub for existing and emergent work on assessment and feedback. Under a series of themes, you can explore what the community currently knows about enhancing assessment and feedback practice with technology, find links to resources, and keep up to date with outputs from the Assessment and Feedback programme and other current JISC programmes.

Assessment and Feedback themes/issues wordle

This is a dynamic set of resources that will be updated as the programme progresses. Follow this link to explore more.

My memory of eAssessment Scotland

Along with around 270 other people, I attended the eAssessment Scotland Conference on 26 August at the University of Dundee. It was a thought-provoking day, with lots of examples of innovative approaches to assessment within the sector.

Steve Wheeler got the day off to a great start by talking us through some of the “big questions” around assessment: for example, is it knowledge or wisdom that we should be assessing, and what are the best ways to do this? Steve also emphasised the evolving nature of assessment and the need to share best practice, and introduced many of us to the term “ipsative assessment”. The other keynotes complemented this big-picture view, with Becka Coley sharing her experiences of the student perspective on assessment and Pamela Kata taking us through some of the really innovative serious games work she is doing with medical students. The closing keynote from Donald Clark returned to some of the more generic issues around assessment, in particular assessment in schools and the current UK government’s obsession with maths.

There is some really great stuff going on in the sector, with a growing set of tools and, more importantly, evidence of the impact of using e-assessment techniques (as highlighted by Steve Draper, University of Glasgow). However, it still seems quite small-scale. As Peter Hartley said, e-assessment does seem to be a bit of a cottage industry at the moment, and we really need more institution-wide buy-in for things to move up a gear. I particularly enjoyed the wry, slightly self-deprecating presentation from Malcolm MacTavish (University of Abertay Dundee) about his experiments with giving audio feedback to students. Despite now being able to evidence the impact of audio feedback and show that there were some cost efficiencies for staff, his institution has now implemented a written-feedback-only policy.

Perhaps we are on the cusp of a breakthrough, and certainly the new JISC Assessment and Feedback programme will allow another round of innovative projects to get some more institutional traction.

I sometimes joke that Twitter is my memory of events – an “I tweet therefore I am” mentality 🙂 Those of you who read my blog will know I have experimented with the Storify service for collating tweets from events, but for a change, here is my Twitter memory of the day via the Memolane service.

Assessment and Feedback – the story from 2 February

Many thanks to my colleague Rowin Young and the Making Assessment Count project at the University of Westminster for organising a thoroughly engaging and thought-provoking event around assessment and feedback yesterday. I just got my Storify invite through this morning, so to give a flavour of the day, here is a selected tweet story.

A capital day for assessment projects

Last Monday CARET, University of Cambridge hosted a joint workshop for the current JISC Capital Programme Assessment projects. The day provided an opportunity for the projects to demonstrate how the tools they have been developing work together to provide the skeleton of a complete assessment system from authoring to delivery to storage. Participants were also encouraged to critically review progress to date and discuss future requirements for assessment tools.

Introducing the day, Steve Lay reminded delegates of some of the detail of the call under which the projects had been funded. This included a focus on “building and testing software tools, composite applications and/or implementing a data format and standards to a defined specification” – in this case QTI. The three funded projects have built directly on the outcomes of previous toolkit and demonstrator activities of the e-framework.

The morning was given over to a demo from the three teams – from Kingston, Cambridge and Southampton Universities respectively – showing how they interoperated by authoring a question in AQuRAte, storing it in Minibix and finally delivering it through ASDEL.

Although the user interfaces still need a bit of work, the demo clearly showed that using a standards-based approach does lead to interoperable systems, and that the shorter, more iterative development funding cycle introduced by JISC can actually work.
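For anyone unfamiliar with QTI, it is the shared item format that makes this kind of tool chain possible: whatever each tool does internally, the thing that is authored, banked and delivered is a QTI XML item. As a rough illustration only (this is my own sketch, not code from any of the project teams, and the identifiers and question text are made up), here is what a minimal QTI 2.1 multiple-choice item looks like, written out and read back using nothing more than Python's standard library:

```python
import xml.etree.ElementTree as ET

QTI_NS = "http://www.imsglobal.org/xsd/imsqti_v2p1"

# A minimal QTI 2.1 multiple-choice item, of the kind that could be
# authored in one tool, stored in an item bank and rendered by a
# delivery engine. Illustrative only; real items carry much more.
ITEM_XML = f"""<?xml version="1.0" encoding="UTF-8"?>
<assessmentItem xmlns="{QTI_NS}" identifier="demo-item-001"
                title="Capital of Scotland" adaptive="false" timeDependent="false">
  <responseDeclaration identifier="RESPONSE" cardinality="single" baseType="identifier">
    <correctResponse><value>A</value></correctResponse>
  </responseDeclaration>
  <itemBody>
    <choiceInteraction responseIdentifier="RESPONSE" shuffle="true" maxChoices="1">
      <prompt>Which city is the capital of Scotland?</prompt>
      <simpleChoice identifier="A">Edinburgh</simpleChoice>
      <simpleChoice identifier="B">Glasgow</simpleChoice>
      <simpleChoice identifier="C">Dundee</simpleChoice>
    </choiceInteraction>
  </itemBody>
  <responseProcessing
    template="http://www.imsglobal.org/question/qti_v2p1/rptemplates/match_correct"/>
</assessmentItem>"""

# Round-trip the item: write it out as an authoring tool might,
# then parse it back as an item bank or delivery engine might.
with open("item.xml", "w", encoding="utf-8") as f:
    f.write(ITEM_XML)

item = ET.parse("item.xml").getroot()
prompt = item.find(f".//{{{QTI_NS}}}prompt")
choices = item.findall(f".//{{{QTI_NS}}}simpleChoice")
print(item.get("title"), "-", prompt.text)
print([c.get("identifier") for c in choices])
```

Real items are of course much richer than this – adaptive items, templates, maths and so on – but the basic shape is the same, and that shared shape is what the demo was trading on.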

In the afternoon there were two breakout sessions: one dealing with the technical issues around developing and sustaining an open source community, the other looking at innovations in assessment. One message that came through from both sessions was the need for more detailed feedback on which approaches and technologies work in the real world – perhaps some kind of gap analysis between the tool-set we have now and the needs of the user community, combined with more detailed use cases. I think this approach would certainly help to roadmap future funding calls in the domain, as well as helping to inform actual practice.

From the techie side of the discussion there was a general feeling that there is still a lot of uncertainty about the development of an open source community. How, and indeed will or can, the 80:20 rule of useful code be reversed? The JISC open source community is still relatively immature, and the motivation for being part of it is generally that developers are being paid to take part – not because it is the best option. There was a general feeling that more work is needed to help develop, extend and sustain the community, and that it is at quite a critical stage in its life-cycle. One suggestion was that the community needs a figurehead to lead it – so if you fancy being Mr/Mrs QTI, do let us know :-)

More notes from the day are available via the projects’ discussion list.

Assessment, Packaging – where, why and what is going on?

Steve Lay (CARET, University of Cambridge) hosted the joint Assessment and EC SIG meeting at the University of Cambridge last week. The day provided an opportunity to get an update on what is happening in the specification world, particularly in the content packaging and assessment areas, and to compare that with some real-world implementations, including a key interest – IMS Common Cartridge.

Packaging and QTI are intrinsically linked – to share and move questions/items they need to be packaged, preferably in an interoperable format :-) However, despite recent developments in both the IMS QTI and Content Packaging specifications, changes in the structure of IMS working groups mean there have been no public releases of either specification for well over a year. This is mainly because a specification now needs at least two working implementations before public release. In terms of interoperability, general uptake and usability this does seem like a perfectly sensible change. But as ever, life is never quite that simple.
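To make “packaging” a bit more concrete, here is a minimal sketch of what it means in practice: the item file is zipped up together with an imsmanifest.xml that describes it, which is the basic content-package shape an item bank or VLE would expect to import. Again this is purely my own illustration – the identifiers and filenames are made up, and the resource type shown is the one commonly used for QTI 2.1 items – so treat the details as indicative rather than definitive.

```python
import zipfile

# Minimal IMS Content Packaging manifest declaring a single QTI item.
# The resource "type" value is the one commonly used for QTI v2.1 items;
# identifiers and filenames here are invented for illustration.
MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<manifest xmlns="http://www.imsglobal.org/xsd/imscp_v1p1" identifier="MANIFEST-demo-001">
  <metadata/>
  <organizations/>
  <resources>
    <resource identifier="RES-demo-item-001" type="imsqti_item_xmlv2p1" href="item.xml">
      <file href="item.xml"/>
    </resource>
  </resources>
</manifest>"""

def build_package(item_path: str, package_path: str) -> None:
    """Zip one QTI item file together with a manifest that describes it."""
    with zipfile.ZipFile(package_path, "w", zipfile.ZIP_DEFLATED) as pkg:
        pkg.write(item_path, arcname="item.xml")   # the item itself
        pkg.writestr("imsmanifest.xml", MANIFEST)  # the packaging "wrapper"

# For example, reusing the item.xml written in the earlier sketch:
build_package("item.xml", "demo-item-package.zip")
```

The resulting zip is the unit that actually moves between systems – which is why the version politics of the packaging specification matter as much as the item specification itself.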

IMS Common Cartridge has come along and turned into something of a flag-bearer for IMS. This has meant that an awful lot of effort from some of the ‘big’ (or perhaps ‘active’ would be more accurate) members of IMS has been concentrated on developing CC rather than pushing implementation of CP 1.2 or the latest version of QTI. A decision was taken early in the development of CC to use older, more widely implemented versions of specifications rather than the latest versions. (It should be noted that this looks likely to change as more demands are made on CC which the newer versions of the specs can meet.)

So, the day was also an opportunity to reflect on what the current state of play is with IMS and other specification bodies, and to discuss with the community what areas they feel are most important for CETIS to be engaging in. Profiling did surface as something that the JISC elearning development community – particularly in the assessment domain – should be developing further.

In terms of specification updates, our host Steve Lay presented a brief history of QTI and future development plans, Adam Cooper (CETIS) gave a round-up from the IMS Quarterly meeting held the week before, and Wilbert Kraan (CETIS) gave a round-up of packaging developments, including non-IMS initiatives such as OAI-ORE and IEEE RAMLET. On the implementation side of things, Ross MacKenzie and Sarah Wood (OU) took us through their experiences of developing common cartridges for the OpenLearn project, and Niall Barr (NB Software) gave an overview of integrating QTI and Common Cartridge. There was also a very stimulating presentation from Linn van der Zanden (SQA) on a pilot project using wikis and blogs as assessment tools.

Presentations/slidecasts (including as much discussion as was audible) and MP3s are available from the wiki, so if you want to get up to speed on what is happening in the wonderful world of specifications, have a listen. There is also an excellent review of the day over on Rowin’s blog.