Socially favoured projects, real measures of engagement?

Martin Hawksey has been doing a bit of playing around with JISC project data lately and has now created a spreadsheet of the top “socially favoured” JISC-funded projects.

As a large part of my job involves supporting and amplifying the work of JISC programmes, I’m always looking for ways to keep in touch with projects between official programme meetings and report feedback. Over the past few years I have personally found Twitter quite revolutionary in that regard: it gives me a flexible, “lite” way to build relationships and to monitor and share project developments. I’ve also noted how Twitter is becoming a key dissemination tool for projects, and indeed for programmes. So I was fascinated to see Martin’s table and the sources he had used.

Like many others, I’m becoming increasingly interested in the numerous ways that social services such as Facebook, Twitter and Google+ can be used and analysed. I’ve checked my PeerIndex and my Klout score – even this morning I had a look at Twtrland to see what that service made of me. But I take all of these with a pinch of salt: they give an indication of things, not the whole picture.

For this exercise Martin used several sources of data, including Twitter, Facebook, LinkedIn, Google+, Buzz, Digg, Delicious and StumbleUpon (see Martin’s post on how he did it). A number of things struck me on first looking at the spreadsheet. The top projects seemed to be related to “big” collections and to be repository-focused. There wasn’t much from the teaching and learning side of things until around the mid-twenties, where the Open Spires project appears – though that too is very much a content-related project. The top projects also all had high scores on the bookmarking sites. Facebook and LinkedIn use seemed to be limited, but again the top projects all had relatively high scores. Twitter seemed to be the most consistently used service across the board. And perhaps most strikingly, after the top twenty or so, use of all the services drops off dramatically.
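As a rough illustration of the kind of per-service totalling such a spreadsheet implies – and this is purely a sketch, not Martin’s actual method (his post has the real details) – a combined score for each project might be computed along these lines. The service list mirrors the sources above; the sample rows and column names are hypothetical.

```python
# Illustrative only: NOT Martin's actual method (see his post for that).
# Sketches the kind of per-service totalling the spreadsheet implies.
# The sample rows and project names below are hypothetical.

SERVICES = ["twitter", "facebook", "linkedin", "googleplus",
            "buzz", "digg", "delicious", "stumbleupon"]

def social_score(row):
    """Sum the share/bookmark counts recorded for each service."""
    return sum(row.get(service, 0) for service in SERVICES)

# Hypothetical rows standing in for per-project counts from the spreadsheet.
projects = [
    {"project": "Project A", "twitter": 120, "delicious": 40, "facebook": 15},
    {"project": "Project B", "twitter": 30, "stumbleupon": 5},
]

ranked = sorted(projects, key=social_score, reverse=True)
for row in ranked:
    print(row["project"], social_score(row))
```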

So what does this all mean? Does the fact that the top-ranked projects have high bookmarking scores mean that those projects actively encourage sharing in this way – or is it down to the already web-savvy habits of their users? Checking the first couple of projects, it’s hard to tell. The first two don’t have any obvious links or buttons to any of the “ranked services”, but the third has a Google sharing app on its front page, and others have obvious links to Facebook, Twitter and so on. I think there would almost need to be a follow-up mini-report from each project on their assessment of the impact of these services before any informed comment could be made. What impact does using social services have on sustainability? Does having a Facebook page make a project more likely to maintain an up-to-date website, as required by its grant funding (see Martin’s post on this too)? Another point of note is that the links for a number of the top-ranked projects go to generic rather than project-specific websites.

I’m not sure I’ve come to any conclusions about this. As with any data-collection exercise, it has raised more questions than it has answered, and the ranking it provides can’t be judged in isolation. For me, it would be interesting to enhance the data to identify which programmes the projects were funded from, and then to start exploring the evidence around the effectiveness of each of the social channels. However, it is fascinating to see another example of the different ways people can now start pulling “social” statistics together. Thanks Martin!
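If the data were enhanced along those lines, the enrichment step might look something like the sketch below: attach a funding programme to each project row, then total each channel per programme. The lookup table, project names and counts here are all hypothetical – this is just to show the shape of the idea.

```python
# Illustrative sketch of the enhancement suggested above: attach the funding
# programme to each project row, then total each social channel per programme.
# The lookup table, project names and counts are all hypothetical.
from collections import defaultdict

programme_lookup = {            # hypothetical project -> programme mapping
    "Project A": "Digitisation",
    "Project B": "eLearning",
}

rows = [                        # hypothetical per-project channel counts
    {"project": "Project A", "twitter": 120, "delicious": 40},
    {"project": "Project B", "twitter": 30, "facebook": 8},
]

per_programme = defaultdict(lambda: defaultdict(int))
for row in rows:
    programme = programme_lookup.get(row["project"], "Unknown")
    for channel, count in row.items():
        if channel != "project":
            per_programme[programme][channel] += count

for programme, channels in per_programme.items():
    print(programme, dict(channels))
```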

6 thoughts on “Socially favoured projects, real measures of engagement?”

  1. Hi Sheila – like you I ended up wondering ‘So what does this all mean?’ and concluded that, with this set of data, nothing really concrete. Tracking a single project URL gives you very little information about the impact the projects are having overall.

    In PostRank Analytics, which collects social shares and more, it’s interesting how they put a suggested $ value on engagement, weighting comments, likes/tweets and bookmarks differently. For example, in PostRank terms this comment is worth 10 ‘engagement points’ and the tweet I did 7 points. At a suggested $0.25/pt, that’s $4.25 I’ve earned you 😉

    If you total all the social engagement actions recorded in the spreadsheet, the project home pages have generated over $30k in ‘engagement value’. With the focus on value for money, I wonder if this type of data will be used to evidence benefit.

    If anyone is interested in how I generated the spreadsheet, this post http://mashe.hawksey.info/2011/08/and-the-most-engaging-jisc-project-is/ has more detail.

    Thanks for sharing my work

    Martin

  2. Ah, the “show me the money” bit – that is interesting; as you say, it will be interesting to see if it could be used as evidence. Actually, it might be more useful for a service like CETIS as another measure of (cost-)effective networking for JISC.
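To make the arithmetic in Martin’s comment concrete, here is a minimal sketch of how PostRank-style engagement points might translate into a dollar figure. The 10-point comment, 7-point tweet and $0.25/point rate come from his comment above; everything else is an assumption about how such a total might be computed, not PostRank’s actual method or API.

```python
# Minimal sketch of the PostRank-style arithmetic described in the comments:
# each engagement action earns points, and points carry a suggested dollar value.
# The comment (10 pts), tweet (7 pts) and $0.25/pt figures come from Martin's
# comment; the helper below is purely illustrative.

POINTS = {"comment": 10, "tweet": 7}   # points per action type (from the comment)
DOLLARS_PER_POINT = 0.25               # suggested rate (from the comment)

def engagement_value(actions):
    """Total dollar value for a dict of {action_type: count}."""
    points = sum(POINTS.get(action, 0) * count for action, count in actions.items())
    return points * DOLLARS_PER_POINT

# One comment plus one tweet -> (10 + 7) * $0.25 = $4.25
print(engagement_value({"comment": 1, "tweet": 1}))
```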
