Design Bash: moving towards learning design interoperability

Question: How do you get a group of projects with a common overarching goal, but with disparate outputs, to share those outputs? Answer: hold a design bash…

Codebashes and CETIS have become almost synonymous, and they have proved to be an effective way for our community to feed back into specification bodies and to increase our own knowledge of how specs actually need to be implemented to allow interoperability. So we decided that, with a few modifications, the general codebash approach would be a great way for the current JISC Design for Learning Programme projects to share their outputs and start to get to grips with the many levels of interoperability the varied outputs of the programme present.

To prepare for the day the projects were asked to submit resources which fitted into four broad categories (tools, guidelines/resources, inspirational designs and runnable designs). These resources were tagged into the programme’s site and, using the DFL SUM (see Wilbert’s blog for more information on that), we were able to aggregate resources and use RSS feeds to pull them into the programme wiki. Over 60 resources were submitted, offering a great snapshot of the huge level of activity within the programme.
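The aggregation step above can be sketched in a few lines. This is an illustrative example only (the feed content and URLs here are invented, not the actual DFL SUM feeds): it parses an RSS 2.0 feed with Python’s standard library and extracts the tagged resource items, much as the programme wiki pulled in the aggregated feeds.

```python
# Minimal sketch of pulling resource items from an RSS 2.0 feed.
# The feed is inlined here for illustration; a real aggregator would
# fetch it over HTTP with urllib.request before parsing.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Design for Learning resources</title>
    <item>
      <title>Example runnable design</title>
      <link>http://example.org/designs/1</link>
      <category>runnable designs</category>
    </item>
    <item>
      <title>Example planning tool</title>
      <link>http://example.org/tools/1</link>
      <category>tools</category>
    </item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return a (title, link, category) tuple for each feed item."""
    root = ET.fromstring(xml_text)
    return [
        (item.findtext("title"), item.findtext("link"), item.findtext("category"))
        for item in root.iter("item")
    ]

for title, link, category in parse_feed(SAMPLE_FEED):
    print(f"[{category}] {title}: {link}")
```

Because each item carries a category tag, the same parsed list can be filtered into the four broad resource categories before being rendered into a wiki page.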

One of the main differences between the design bash and the more established codebashes was the fact that there wasn’t really much code to bash. So we outlined three broad areas of interoperability to help begin conversations between projects. These were:
* conceptual interoperability: the two designs or design systems won’t work together because they make very different assumptions about the learning process, or are aimed at different parts of the process;
* semantic interoperability: the two designs or design systems won’t work together because one provides or expects functionality that the other doesn’t have, e.g. a learning design that calls for a shared whiteboard presented to a design system that doesn’t have such a service;
* syntactic interoperability: the two designs or design systems won’t work together because required or expected functionality is expressed in a format that is not understood by the other.
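The semantic case, the shared whiteboard example above, can be made concrete with a small sketch. All the names here are hypothetical: a design declares the services it requires, a runtime system declares the services it provides, and a semantic mismatch is simply a required service with no counterpart.

```python
# Illustrative sketch (all service names are hypothetical): a semantic
# interoperability check between a learning design and a runtime system.
def missing_services(design_requires, system_provides):
    """Return, sorted, the required services the system cannot supply."""
    return sorted(set(design_requires) - set(system_provides))

# A design that calls for a shared whiteboard, presented to a system
# that offers a forum, chat and file upload but no whiteboard service.
design = ["forum", "shared_whiteboard"]
system = ["forum", "chat", "file_upload"]

gaps = missing_services(design, system)
if gaps:
    print("Semantic mismatch, missing:", ", ".join(gaps))
```

Conceptual and syntactic mismatches are harder to mechanise: the former is about differing assumptions rather than missing services, and the latter is about the format in which the same service requirement is expressed.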

So did it work? Well, in a word, yes. As the programme was exploring general issues around designing for learning, and not just looking at, for example, the IMS LD specification, there wasn’t as much ‘hard’ interoperability evidence as one would expect from a codebash. However, there were many levels of discussion between projects. It would be nigh on impossible to convey the depth and range of those discussions in this article, but using the three broad categories above, I’ll try to summarize some of the emerging issues.

In terms of conceptual interoperability, one of the main discussion points was the role of context in designing for learning. Was the influence coming from the bottom up or the top down? This has a clear effect on the way projects have been working, the tools they are using and the outcomes produced. In some cases the tools didn’t really fit with the pedagogical concepts of the projects, which led to a discussion around the need to start facilitating student design tools: what would such tools look like, and how would they work?

In terms of semantic interoperability there were wide-ranging discussions around the levels of granularity of designs, from the self-contained learning object level to the issues of extending and embellishing designs created in LAMS by using IMS LD and tools such as Reload and SLeD.

At the syntactic level there were a number of discussions, not just around the more obvious interoperability issues between systems such as LAMS and Reload, but also around the use of wikis and how best to access and share resources. It was good to hear that some of the projects are now thinking of looking at the programme SUM as a possible way to access and share resources. There was also a lot of discussion around the incorporation of course description specifications such as XCRI into the pedagogic planner tools.

Overall a number of key issues were teased out over the day, with firm commitment shown by all the projects to continue to work together and increase all levels of interoperability. There was also an acknowledgement that these discussions cannot take place in a vacuum and that we need to connect with the rest of the learning design community. This is something the CETIS support project will continue to do over the coming months.

More information about the Design Bash and the programme in general can be found on the programme support wiki.

Winning Learning Objects Online

The winners of the 2007 ALT-C Learning Object Competition are now available to view from the Intrallect website.

The winners are:

    *1st Prize – All in a day’s work (Colin Paton, Social Care Institute for Excellence, Michael Preston-Shoot, University of Luton, Suzy Braye, University of Sussex and CIMEX Media Ltd)
    *2nd Prize – Need, Supply and Demand (Stephen Allan and Steven Oliver, IVIMEDS)
    *3rd Prize – Enzyme Inhibition and Mendelian Genetics (Kaska Hempel, Jillian Hill, Chris Milne, Lynne Robertson, Susan Woodger, Stuart Nicol, Jon Jack, Academic Reviewers, CeLLS project, Dundee University, Napier University, Interactive University and Scottish Colleges Biotechnology Consortium)

Shortlisted Entries (in no particular order):

    *Photographic composition (David Bryson, University of Derby)
    *Human Capital Theory (Barry Richards, Dr. Joanna Cullinane, Catherine Naamani, University of Glamorgan)
    *Tupulo Array Manipulation (Tupulo project team at Dublin City University, Institute of Technology Tallaght, Institute of Technology Blanchardstown, Ireland; System Centros de Formacion, Spain; Societatea Romania pentru Educatie Permanenta, Romania)
    *Introduction to Pixel Group Processing (Peter McKenna, Manchester Metropolitan University).

Content is infrastructure – latest in Terra Incognita series

David Wiley is the current contributor to the excellent Terra Incognita series on Open Source Software and Open Educational Resources in Education. In his article, titled ‘Content is Infrastructure’, David puts forward the somewhat controversial view that for any experimentation to take place within education systems: “we must deploy a sufficient amount of content, on a sufficient number of topics, at a sufficient level of quality, available at sufficiently low cost”. Only then will we be able to “realize that content is infrastructure in order to more clearly understand that the eventual creation of a content infrastructure which is free to use will catalyze and support the types of experiments and innovations we hope to see in the educational realm”.

I feel this is a very timely article, refocusing on the role of content and content-related services within the education sector. It does seem to me that the role of content is often overlooked, particularly in our (UK) HE sector, and that there is a somewhat pervasive ‘been there, done that’ attitude. But as David points out, if we are to fully reap the potential rewards of open content initiatives then we really need to start looking at content as an infrastructure on which we can build and experiment.

There have already been a number of comments (and replies) to David’s post, all of which are available from the Terra Incognita blog.