@jisccetis – how others see us

Those of you who follow the @jisccetis twitter account will probably have noticed that it is very much a broadcast channel, used to send updates on our latest news, features and events. Those of you who don’t follow the account, or no longer do, will probably have noticed that too – which is probably why you don’t follow it. Over the coming weeks we’re going to be making some subtle changes, and this post will try to outline our rationale.

As I’ve commented before, use of the @jisccetis account has evolved more by accident than by design or any kind of strategic planning, other than “we should have one of those twitter accounts, shouldn’t we”. Partly this is due to the number and activity of individual Cetis staff on twitter; partly due to resource issues. We have been quite content with the neutral (or, perhaps more accurately, silent) voice of the account, and with not following back. We don’t see it as the “face” of Cetis, or want to spend time developing a corporate personality. However, it’s always good to get some feedback.

Now that Martin Hawksey is part of the Cetis team, we’ve been having some really interesting conversations around the @jisccetis twitter account – backed up, of course, by some of Martin’s google spreadsheet analytics. So whilst we’ve been content with our automatic workflow, from the other side it’s not always that useful. As Martin pointed out, he has un-followed and re-followed the account several times, mainly because there needs to be some reciprocation. Why follow an account that doesn’t follow you back? What do you get out of that? It’s not great for your vanalytics.

So, change number 1: @jisccetis is following people now. Armed with our new-found twitter and google intelligence, we have a much clearer idea of who is retweeting our posts, driving traffic to our blogs and generally sharing our “stuff”. A pause here to say a big thank you to @nopiedra, @sarahknight and @drdjwalker! Change number 2: we’re going to make a concerted effort to say thank you more often to people who share our work.

@jisccetis isn’t about to become a coffee-drinking, weather-sharing kind of account tho’ 😉 It will still be primarily informational, but we are now also going to start listing followers where possible, and hopefully make the @jisccetis twitter page a bit more useful too.

We’re not concerned with measuring our twitter activity solely on increasing numbers – we don’t need half a million followers. What we are concerned with is ensuring that we are engaging with key members of our community, and discovering which other communities we are tapping into, or not tapping into. Martin’s spreadsheets, and this week’s announcement from Google about integrating social network data into google analytics, will be invaluable in helping us evolve our approach to using twitter more effectively, both for us and for our followers. We’ll also be sharing our experiences, and our thoughts on using the data we’re collecting, over the coming months.
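As a concrete (and entirely illustrative) taste of the kind of counting Martin’s spreadsheets do for us, here is a minimal Python sketch that tallies the top “amplifiers” of an account from a CSV archive of tweets. The file name and column names (“from_user”, “text”) are assumptions for illustration, not the actual spreadsheet schema.

```python
# A minimal sketch: count who mentions/retweets an account most often,
# from a CSV archive of tweets. File name and column names are assumptions.
import csv
from collections import Counter

def top_amplifiers(archive_path, account="jisccetis", n=10):
    """Return the n users whose tweets mention @account most often."""
    counts = Counter()
    with open(archive_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            if f"@{account}" in row.get("text", "").lower():
                counts[row.get("from_user", "unknown")] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    for user, tweets in top_amplifiers("jisccetis_archive.csv"):
        print(f"@{user}: {tweets} mentions/RTs")
```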

Curriculum Design Technical Journeys: Part 1

This is the first of a series of posts summarizing the technical aspects of the JISC Curriculum Design Programme, based on a series of discussions between CETIS and the projects. These yearly discussions have been annotated and recorded in our PROD database.

The programme is well into its final year, with projects due to finish at the end of July 2012. Instead of a final report, the projects are being asked to submit a more narrative institutional story of their experiences. As with any long-running programme (in this instance, four years), a lot has changed since the projects started, both within institutions themselves and in the wider political context in which the UK HE sector now finds itself.

At the beginning of the programme, the projects were put into clusters based on three high-level concepts they (and indeed the programme) were trying to address:

• Business processes – Cluster A
• Organisational change – Cluster B
• Educational principles/curriculum design practices – Cluster C

I felt that it would be useful to summarize my final thoughts, or my view of the overall technical journey of the programme – this may be a mini epic! This post will focus on the Cluster C projects: OULDI (OU), PiP (University of Strathclyde) and Viewpoints (University of Ulster). These projects all started with explicit drivers based on educational principles and curriculum design practices.

OULDI (Open University Learning Design Initiative)
*Project Prod Entry
The OULDI project has been working to “. . . develop and implement a methodology for learning design composed of tools, practice and other innovation that both builds upon, and contributes to, existing academic and practitioner research.”

The team have built up an extensive toolkit around the design process for practitioners, including the Course Map template, Pedagogical Features Card Sort, Pedagogy Profiler and Information Literacies Facilitation Cards.

The main technical developments for the project have been the creation of the Cloudworks site and the continued development of the Compendium LD learning design tool.

Cloudworks, along with its open source version CloudEngine, is one of the major technical outputs of the programme. Originally envisioned as a kind of flickr for learning designs, the site has evolved into something slightly different: “a place to share, find and discuss learning and teaching ideas and experiences.” In fact this evolution into a more discursive space has perhaps made it a far more flexible and richer resource. Over the course of the programme we have seen development from the initial desire to preview learning designs to, last year, LAMS sequences being fully embedded in the site, as well as other embedded resources such as video diaries from the team’s partners.

The site was originally built in Drupal; however, the team made a decision to switch to CodeIgniter. This has given them the flexibility and level of control they felt they needed. Juliette Culver has written an excellent blog post about their decision process and experiences.

Making the code open source has also been quite a learning curve for the team, which they have been documenting, and they plan to produce at least one more post aimed at developers around some of the practical lessons they have learned. Use of Cloudworks has been growing; however, take-up of the open source version hasn’t been quite as popular. I speculated with the team that perhaps this is simply because the original site is so user-friendly that people don’t really see the need to host their own version. However, I think that having the code available as open source can only be a “good thing”, particularly for a JISC-funded project. Perhaps some more work on showing examples of what can be done with the API (e.g. building on the experiments CETIS did for our 2010 Design Bash) might be a way to encourage more experimentation and integration of parts of the site in other areas, which in turn might lead to the bigger step of implementing a stand-alone version. That said, sustaining the evolution of Cloudworks is a key issue for the team. In terms of internal institutional sustainability there is now commitment to it, and it has been highlighted in various strategy papers, particularly around enhancing staff capability.
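To give a flavour of what such API experiments might look like, here is a speculative sketch that fetches recent “clouds” as JSON and lists their titles. The endpoint path and response fields below are illustrative assumptions, not the documented Cloudworks API.

```python
# A speculative sketch of a Cloudworks API call: the endpoint path and
# JSON fields below are illustrative assumptions, not the documented API.
import json
import urllib.request

BASE_URL = "http://cloudworks.ac.uk/api"  # hypothetical base URL

def recent_clouds(limit=5):
    """Fetch recent clouds and return (title, url) pairs."""
    url = f"{BASE_URL}/clouds/recent.json?limit={limit}"
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return [(c.get("title"), c.get("url")) for c in data.get("clouds", [])]

if __name__ == "__main__":
    for title, url in recent_clouds():
        print(f"{title} - {url}")
```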

Compendium LD has also developed over the programme life-cycle. PC, Mac and Linux versions are now available to download. There is also additional help built into the tool linking to Cloudworks, and a prototype area for sharing design maps. The source code is also available under a GNU licence. The team have created a set of useful resources, including a video introduction and a set of user guides. It’s probably fair to say that Compendium LD is really for “expert designers”; however, the team have found the icon set used in the tool really useful in f2f activities around developing design literacies, and are using it as part of a separate paper-based output.

Viewpoints
*Project Prod Entry

The project has focused on the development and facilitation of its set of curriculum re-design workshops: “We aim to create a series of user-friendly reflective tools for staff, promoting and enhancing good curriculum design.”

The Viewpoints process is now formally embedded in the institutional course re-validation process. The team are embarking on a round of ‘train the trainer’ workshops to create a network of Viewpoints Champions to cascade throughout the University. A set of workshop resource packs is being developed, which will be available to the champions via a booking system (for monitoring purposes) through the library. The team have also shared a number of outputs openly through a variety of channels including delicious, flickr and slideshare.

The project has focused on f2f interactions, and the team are now creating video case studies from participants, which will be available online over the coming months. The team had originally planned to build an online narration tool to complement (or perhaps even replace) the f2f workshops. However, they now feel that the richness of the workshops could not be replaced with an online version. But as luck would have it, the Co-Educate project is developing a widget based on the 8-LEM model, which underpins much of the original work from which Viewpoints evolved, and so the project is discussing ways to input into and utilize this development, which should be available by June.

Early in the project, the team explored some formal modelling approaches, but found a lighter-weight approach using Balsamiq particularly useful for their needs. It proved effective both in terms of rapid prototyping (reducing development time) and in getting useful engagement from end users. Balsamiq, and the rapid prototyping approach developed through Viewpoints, is now being used widely by developers in other projects across the institution.

Due to the focus on developing the workshop methodology, there hasn’t been as much technical integration as originally envisaged. However, the team have been cognisant of institutional processes and workflows, and throughout the project have been keen to enable and build on structured, data-driven approaches allowing data to be easily re-purposed.

The team are now involved in the restructuring of a default course template area for all courses in their VLE. The template will pull in a variety of information – from the library, the NSS and assignment dates – as well as a number of the frameworks and principles (e.g. assessment) developed through the project. So there is a logical progression from the f2f workshop, to course validation documentation, to what the student is presented with. Although the project hasn’t formally used XCRI, they note growing institutional interest in it, and in data collection in general.
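To illustrate the shape of that aggregation (and only the shape – every source, field name and the example course code below is hypothetical), a course template being assembled from pre-fetched institutional data might look something like this:

```python
# A rough sketch of the aggregation a default course template implies.
# All sources, field names and the example course code are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CourseTemplate:
    code: str
    reading_list_url: str = ""
    nss_summary: str = ""
    assignment_dates: list = field(default_factory=list)
    assessment_principles: list = field(default_factory=list)

def build_template(code, library, nss, assessments, principles):
    """Assemble one course's template from pre-fetched institutional data."""
    return CourseTemplate(
        code=code,
        reading_list_url=library.get(code, ""),
        nss_summary=nss.get(code, "no survey data"),
        assignment_dates=assessments.get(code, []),
        assessment_principles=principles,
    )

template = build_template(
    "ENG101",
    library={"ENG101": "http://library.example.ac.uk/lists/ENG101"},
    nss={"ENG101": "87% overall satisfaction"},
    assessments={"ENG101": ["2012-05-14", "2012-06-01"]},
    principles=["feedback within 3 weeks", "criteria published in advance"],
)
print(template)
```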

The team would like to continue with a data-driven approach, and to see their timetabling provision developed to make it more personalised for students.

PiP (Principles in Patterns)
*Project Prod Entry
The aims of the PiP project are:
(i) to develop and test a prototype on-line expert system and linked set of educational resources that, if adopted, would:

• improve the efficiency of course and class approval processes at the University of Strathclyde
• help stimulate reflection about the educational design of classes and courses and about the student experiences they would promote
• support the alignment of course and class provision with institutional policies and strategies

(ii) to use the findings from (i) to share lessons learned and to produce a set of recommendations to the University of Strathclyde and to the HE sector about ways of improving class and course approval processes

Unlike OULDI and Viewpoints, this project was less about f2f engagement supporting staff development in course design, and more about designing and building a system based on educationally proven methodology (e.g. the REAP project). In terms of technical outputs, the outputs and experiences of the team in some ways mirrored those of the projects in Cluster B: like T-SPARC, PiP has developed a system based on SharePoint, and like PALET it has used Six Sigma and Lean methodologies.

The team have experimented extensively with a variety of modelling approaches for their base-lining models, from UML and BPMN (via a quick detour exploring Archi) to now adopting Visio and the Six Sigma methodology. The real value of modelling is nearly always in the conversations the process stimulates, and the team have noticed a perceptible change within the institution in attitudes towards, and recognition of the importance of, understanding and sharing core business processes. The project’s process workflow diagram is one I have found very useful for representing the complexity of course design and approval systems.

The team now have a prototype system, C-CAP, built on SharePoint, which is currently being trialled. The team are reflecting on the feedback so far via the project blog. This recent post outlines some of the divergent information needs within the course design and approval process. I’m sure many institutions could draw parallels with these thoughts, and I’m sure the team would welcome feedback.

In terms of the development of the expert system, the team has had to deal with a number of challenges arising from the lack of integration between institutional systems. SharePoint was a common denominator, and so an obvious place to start. However, over the course of the past few years there has been a re-think about development strategies. Originally it was planned to build the system using a .Net framework approach; over the past year the decision was made to change to an InfoPath approach. In terms of sustainability the team see this as being far more effective, and hope to see a growing number of power users as opposed to the specialist developers the .Net approach would have required. The team will be producing a blog post sharing the developers’ experience of building the system through the InfoPath approach.

Although the team feel they have made inroads on many issues, they do still see issues institutionally, particularly around data collection. There is still ambiguity between faculties in the use of terms such as course, module and programme. Although there is more interest in data collection from senior management in 2012 than in 2008, there is still some work to be done around the importance of, and need for, consistency of use.

So from this cluster we have a robust set of tools for engaging practitioners, with resources to help kick-start the (re)design process, and a working prototype for moving from paper-based resources into formal course approval documentation.

A Conversation Around the Digital University – Part 4

Continuing our discussions (introduction, part 2, part 3) around concepts of a Digital University, in this post we are going to explore the Curriculum and Course Design quadrant of our conceptual model.

To reiterate, the logic of our overall discussion starts with the macro concept of Digital Participation, which provides the wider societal backdrop to educational development. Information Literacy enables digital participation and, in educational institutions, is supported by Learning Environments which are themselves constantly evolving. All of this has significant implications for Curriculum and Course Design.

Observant readers will have noticed that we have “skipped” a quadrant. This is more down to my not yet having written the learning environment section, and Bill having completed this section first 🙂 However, we hope that this does illustrate the iterative and cyclical nature of the model, allowing for multiple entry points.

MacNeill, Johnston Conceptual Matrix, 2012

Curriculum
Participation in university education, digital and otherwise, is normally based on people’s desire to learn by obtaining a degree, channelled in turn by their motivations, e.g. school/college influences, improved career prospects, peer behaviour, family ambitions and the general social value ascribed to higher education. This includes adult returners taking Access routes, postgraduates and a variety of people taking short courses and accessing other forms of engagement.

All of these diverse factors combine to define the full nature of curriculum in higher education and argue for a holistic view of curriculum embracing “…content, pedagogy, process, diversity and varied connections to the wider social and economic agendas…” (Johnston 2010, p. 111). Such a holistic view fits well with the participation aspect of our matrix, since it encompasses not only actual participants but also potential participants, as befits modern notions of lifelong and life-wide learning, whilst also acknowledging the powerful social and political forces that canalize the nature and experience of higher education. These latter forces have been omnipresent over the last 30 years in the near-universal assumption that the overriding point of higher education is to provide ‘human capital’ in pursuit of economic growth.

University recruitment and selection procedures are the gateway to participation in degree courses and, on admission, initiate student transition experiences, for example the First Year Experience (FYE). Under present conditions, with degrees mainly shaped by disciplinary divisions, subject choice is the primary curriculum question posed by universities, with all other motivations and experiences constellated around the associated disciplinary differences in academic traditions, culture, departmental priority, pedagogy and choice of content. Other candidates for inclusion – employability skills, information literacy, even ethics and epistemological development – have tended to be clearly subordinate to the power of disciplinary teaching.

Course Design
Despite 30 years of technological changes, the appearance of new disciplines, and mass enrolments, the popular image of a university degree ‘course’ has remained remarkably stable. Viewed from above we might see thousands of people entering buildings (some medieval, some Victorian, some modern), wherein they ‘become’ students, organized into classes/years of study and coming under the tutelage of subject-expert lecturers. Lectures, tutorials and labs, albeit larger and more technologically enhanced, can look much as they would have done in our grandparents’ day. Assuming our grandparents participated, of course.

Looking at degrees in this rather superficial way, we could be accused of straying into the territory recently criticized by Michael Gove, whose attacks on ‘Victorian’ classrooms and demands for change and ‘updating’ of learning via computers and computer science have been widely reported and critiqued.

Our contention is that Gove and others like him have fallen into the trap of focussing on some of the contingent, surface features of daily activity in education and mistaking them for a ‘course’. Improvement in this universe is typically assumed to involve adoption of the latest technology linked to more ‘efficient’ practices. John Biggs (2007) has provided a popular alternative account of what constitutes a good university education by coining the notion of ‘constructive alignment’, which combines the key general structural elements of a course – learning objectives, teaching methods, assessment practices and overall evaluation – with advocacy of a form of teaching for learning distilled here as ‘social constructivism’. This form of learning emphasises the necessity of students learning by constructing meaning from their interactions with knowledge and other learners, as opposed to simply soaking up new information like so many inert, individual sponges. In this view, improving education is more complex and complicated than any uni-dimensional technological innovation, and involves the alignment of all facets of course design in order to entail advanced learning. Debate is often focussed by terms like ‘active learning’ and ‘inquiry-based learning’, accompanied by trends such as in-depth research and development of specific course dimensions, assessment in particular.
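One way to make the structural point concrete is to read constructive alignment almost as a consistency check over a course: every objective should be addressed by at least one teaching activity and one assessment. A toy illustration (the course data here is entirely invented, not Biggs’ own formalism):

```python
# A toy reading of "constructive alignment" as a consistency check:
# every objective should be both taught and assessed. Data is invented.
course = {
    "objectives": ["analyse sources", "construct an argument"],
    "activities": {
        "seminar debate": ["construct an argument"],
        "archive workshop": ["analyse sources"],
    },
    "assessments": {
        "essay": ["analyse sources", "construct an argument"],
    },
}

def unaligned_objectives(course):
    """Return objectives not covered by both an activity and an assessment."""
    taught = {o for objs in course["activities"].values() for o in objs}
    assessed = {o for objs in course["assessments"].values() for o in objs}
    return [o for o in course["objectives"] if o not in (taught & assessed)]

print(unaligned_objectives(course))  # [] means every objective is aligned
```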

Whilst one can debate Biggs’ approach, and we assume some of you will, his work has been influential in university educational development, lecturer education and quality enhancement over several decades. From our perspective, his approach is useful in highlighting the critical importance of treating course design (and re-design) as the key strategic unit of analysis, activity and management in improving the higher education curriculum, as opposed to the more popular belief that it is, for example, the academic qualifications and classroom behaviour of lecturers, or the adoption of particular technologies, which count most. The current JISC-funded Institutional Approaches to Curriculum Design Programme is providing another level of insight into the multiple aspects of curriculum design.

Connections & Questions
Chaining back through our model/matrix, we can now assert:

1. That strategic and operational management of the learning environment must be a function of course design/re-design and not a separate specialist function within university organizations. This means engaging all stakeholders in the ongoing re-design of all courses to an agreed plan of curriculum renovation.

2. That education for information literacy must be entailed in the learning experiences of all students (and staff) as part of the curriculum, and must be grounded in modern views of the field – which is precisely what JISC is encouraging and supporting through its current Developing Digital Literacies Programme.

3. That participation, in all its variety and possibility, is a much more significant matter than the simple selection/recruitment of suitably qualified people to existing degree course offerings. The nature of a university’s social engagement is exposed by the extent to which the full range of possible engagements and forms of participation is taken into account. For example, is a given university’s strategy for participation mainly driven by the human capital/economic growth rationale of higher education, or are there additional/alternative values enacted?

As ever, we’d appreciate any thoughts, questions and feedback you have in the comments.


Learning Analytics, where do you stand?

For? Against? Not bovvered? Don’t understand the question?

The term learning analytics is certainly trending in all the right ways on all the horizon scans. As with many “new” terms there are still some misconceptions about what it actually is, or perhaps more accurately what it actually encompasses. For example, whilst talking with colleagues from the SURF Foundation earlier this week, they mentioned the “issues around using data to improve student retention” session at the CETIS conference. SURF have just funded a learning analytics programme of work which closely matches many of the examples and issues shared and discussed there. They were quite surprised that the session hadn’t been called “learning analytics”. Student retention is indeed a part of learning analytics, but not the only part.

However, back to my original question and the prompt for it. I’ve just caught up with the presentation Gardner Campbell gave to the LAK12 MOOC last week, titled “Here I Stand”, in which he presents a very compelling argument against some of the trends beginning to emerge in the field of learning analytics.

Gardner is concerned that there is a danger that the more reductive models of analytics may actually force us backwards in our models of teaching and learning. He draws an analogy between M-theory – in particular Stephen Hawking’s description of there being not one M-theory but a “family of theories” – and how knowledge and learning actually occur. He is concerned that current learning analytics systems are based too much on “the math” and don’t actually show the human side of learning and the bigger picture of human interaction and knowledge transfer. As he pointed out, “student success is not the same as success as a student”.

Some of the rubrics we might be tempted to use to build learning analytics systems (and in some cases already do) reduce the educational experience to a simplistic management model. Typically, systems look for signs pointing to failure, not for the key moments of success in learning. What we should be working towards are systems that are adaptive, allow for reflection, and can learn themselves.
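To make that contrast concrete, here is a deliberately simple sketch of the difference between deficit-only rules and rules that also surface success. The event counts and thresholds are invented purely for illustration.

```python
# A deliberately simple contrast between deficit-only analytics rules and
# rules that also surface success. Counts and thresholds are invented.
def classify(student):
    """Return signals for a record of weekly login counts."""
    signals = []
    # deficit-style rule: flag low total activity as risk
    if sum(student["logins"]) < 5:
        signals.append("risk: low engagement")
    # success-oriented rule: flag sustained improvement, not just deficit
    if student["logins"][-1] > 2 * max(student["logins"][0], 1):
        signals.append("success: engagement rising week on week")
    return signals or ["no signal"]

print(classify({"logins": [1, 2, 3, 6]}))
# -> ['success: engagement rising week on week']
```

A purely deficit-driven system would report nothing at all for this student; a system that also looks for moments of success has something useful to say.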

This did make me think of the presentation at FOTE11 from IBM about their learning analytics system, which certainly scared the life out of me and many others I’ve spoken to. It also raised a lot of questions from the audience (and the twitter backchannel) about the educational value of the experience of failure. At the same time I was reflecting on the whole terminology issue again. Common understandings – why are they so difficult in education? When learning design was the “in thing”, I think it was John Casey who pointed out that what we were actually talking about most of the time was “teaching design”. Are we in danger of the same thing happening here – the learning side of learning analytics being hi-jacked by narrower, or perhaps to be fairer, more tightly defined, management- and accountability-driven analytics?

To try and mitigate this we need to ensure that all key stakeholders are starting to ask (and answer) the questions Gardner raised in his presentation. What are the really useful “analytics” which can help me as a learner, teacher, administrator, etc.? Which systems provide that data just now? How can/do these stakeholders access and share the data in meaningful ways? How can we improve and build on these systems in ways which take into account the complexity of learning? Or, as Gardner said, how can we start framing systems and questions around wisdom? But before we can do any of that, we need to make sure that our stakeholders are informed enough to take a stand, and not just have to accept whatever system they are given.

At CETIS we are about to embark on an analytics landscape study, which we are calling an Analytics Reconnoitre. We are going to look at the field of learning analytics from a holistic perspective, review recent work and (hopefully) produce some pragmatic briefings on the who, where, why, what and when of learning analytics, pointing to useful resources and real-world examples. This will build on and complement work already funded by JISC, such as the Relationship Management Programme, the Business Intelligence Infokit and the Activity Data Programme synthesis. We’ll also be looking to emerging communities of practice, both here in the UK and internationally, to join up on thinking and future developments. Hopefully this work will contribute to the growing body of knowledge and experience in the field of learning analytics, as well as raising some key questions (and hopefully some answers) around its many facets.

My open education week

As you are no doubt aware, this week is Open Education Week. I’ve been enjoying following various activities, including some great contributions and thoughts from JISC colleagues. But on a more personal level, I was delighted to get a confirmation email yesterday that the Stanford open course on Natural Language Processing is starting next week.

I signed up for this course as I thought their first course, on AI (see Adam’s interview with Seb Schmoller for more details), would be beyond my capabilities. I’m also becoming more and more interested in NLP, through conversations with my colleague David Sherlock around techniques for getting more analysis and visualisations from the data in our PROD database, and from Adam’s recent presentation and blog post around his experiments with Cetis staff blogs.
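As a taste of the sort of first-steps analysis David and I have been discussing, here is a minimal term-frequency sketch using only the standard library. The sample texts are invented stand-ins for PROD project descriptions, not real PROD data.

```python
# A first-steps NLP sketch: simple term frequencies over short texts.
# The sample texts are invented stand-ins for PROD project descriptions.
import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "a", "to", "in", "for", "is", "on"}

def term_frequencies(texts, n=5):
    """Tokenise, drop stopwords, and return the n most common terms."""
    counts = Counter()
    for text in texts:
        for token in re.findall(r"[a-z']+", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return counts.most_common(n)

sample = [
    "A prototype system for course approval built on SharePoint",
    "Workshops supporting curriculum design and course approval",
]
print(term_frequencies(sample))  # e.g. [('course', 2), ('approval', 2), ...]
```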

I’ve just watched the introductory video, which simultaneously excited and slightly scared me, as there are weekly programming tasks – looks like I’ll need to catch up on my Codecademy lessons too.
