Twitter Captioning

As part of our support for the remote audience at UKOLN’s recent IWMW 2010 event we provided a live video stream of the plenary talks.  The videos have now been uploaded to the Vimeo video streaming service.  In addition we have made use of the Twitter captioning tool developed by Martin Hawksey based on an idea originally suggested by Tony Hirst.

Since it was launched there have been a number of developments to the Twitter captioning service, which, incidentally, is described in Wikipedia.  I was particularly pleased with the developments Martin implemented which we could use during the IWMW 2010 event.  At the event Owen Stephens converted the video of the welcome talk I gave at the opening from its original file format and uploaded it to Vimeo. Right at the very end of the closing session Owen demonstrated how the talk had been captioned using the tweets with the event hashtag.  Owen also demonstrated the search capability which Martin had developed, which allows the user to search the Twitter stream. Once a search term has been selected the video jumps directly to the appropriate point.  This provides us with an example of a mashup of a Twitter stream and a video with a crowd-sourced bookmarking capability. I think this is very impressive!

Twitter Captioning of IWMW 2010 Talks

Twitter captioning for video of talk on HTML5

Most of the other plenary talks have now been processed in a similar fashion. This technology prototype does require an HTML5 browser which can display MP4 videos (such as Safari, Chrome or IE with the Chrome Frame plugin).  If you do not have access to such a browser the accompanying image illustrates how the service works.  The Twitter stream is synchronised with the video and displayed as a caption overlaying the video, as shown.  The full Twitter stream for the period is displayed beneath the video. A search box allows the user to search the Twitter stream. In the image shown I have searched for “validation” and the hits are immediately displayed. Clicking on one of the hits will jump to the appropriate point in the video.

If you have access to one of these browsers you can gain a better understanding of the capabilities of the service by viewing the following captioned videos:

Note that in the session on “Doing the day job” a captioned video is available for the talk on “Replacement CMS – Getting it right and getting the buy-in”.  Due to technical problems a full video of the talk on “StudentNET Portal” was not produced, although a very brief captioned video is available.

Building on Twapper Keeper

The Twitter captioning service processes the Twitter archive provided by the Twapper Keeper Twitter archiving service.  Recent developments to Twapper Keeper have been funded by the JISC (and I am the project manager for this work).  I was particularly pleased with this example which illustrates the benefits that can be gained by providing APIs to a service (such as Twapper Keeper) which can then be exploited by other applications (such as the Twitter captioning service).  This approach allows John O’Brien, the Twapper Keeper developer, to focus on backend developments (e.g. migrating the authentication to OAuth) whilst Martin Hawksey, developer of the Twitter captioning service, can focus on enhancements to the end user service (such as the search function).

Advice to Others

If you are thinking of doing something similar for your event here are some suggestions.

Creating the Twitter Stream

  • Create a Twapper Keeper archive for your event hashtag.
  • Consider providing an official event Twitterer who can ensure that the key points in the talks are recorded.
  • Provide clear ways of identifying the start and end of the talks. For example we used the hashtag #P1 for the first plenary talk.  Tweeting “#P1 #start of talk by Chris Sexton” and “#P1 #end of talk by Chris Sexton” enables the start and end of the talks to be easily identified, and this syntax is also understandable by people reading the Twitter stream.
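The #start/#end convention above lends itself to simple automated processing. As a minimal sketch (the CSV layout, column names and `find_talk_boundaries` function are my own illustrative assumptions, not part of Twapper Keeper or the captioning service), this is how the official timestamp tweets could be picked out of an archive:

```python
import csv
import io

def find_talk_boundaries(archive_csv, talk_tag):
    """Scan a tweet archive (hypothetical CSV layout: timestamp,text)
    for the official '#start' and '#end' tweets bracketing a talk
    tagged with e.g. '#P1'."""
    start, end = None, None
    for row in csv.DictReader(io.StringIO(archive_csv)):
        text = row["text"].lower()
        if talk_tag.lower() in text:
            if "#start" in text:
                start = row["timestamp"]
            elif "#end" in text:
                end = row["timestamp"]
    return start, end

# Illustrative archive excerpt (invented tweets and times)
archive = """timestamp,text
2010-07-12 10:00:12,#P1 #start of talk by Chris Sexton
2010-07-12 10:05:44,Great point about shared services #iwmw10
2010-07-12 10:45:03,#P1 #end of talk by Chris Sexton
"""

print(find_talk_boundaries(archive, "#P1"))
# -> ('2010-07-12 10:00:12', '2010-07-12 10:45:03')
```

The same human-readable syntax thus serves both audiences: people following the stream and scripts synchronising the video.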

Synchronising the Video and the Twitter Stream

  • The service uses GMT so if BST is in operation (as was the case during the IWMW 2010 event) you will need to bear this in mind when providing the time of the start and finish of the talks.
  • You can fine-tune the time to ensure that you include the official tweets which provide the time stamps.
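The GMT/BST offset is easy to get wrong when entering talk times by hand. A small sketch of the conversion (the function name and time format are my own, for illustration only):

```python
from datetime import datetime, timedelta, timezone

# The captioning service uses GMT; during BST local clocks run
# one hour ahead of GMT, so the wall-clock time must be shifted back.
BST = timezone(timedelta(hours=1))

def bst_to_gmt(local_time_str):
    """Convert a BST wall-clock time string to GMT."""
    local = datetime.strptime(local_time_str, "%Y-%m-%d %H:%M")
    return local.replace(tzinfo=BST).astimezone(timezone.utc).strftime("%Y-%m-%d %H:%M")

print(bst_to_gmt("2010-07-12 10:00"))
# -> 2010-07-12 09:00
```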

What Next?

In a recent blog post on “The Backchannel” Chris Sexton described how she valued the way in which Twitter could be used to provide feedback when she gives talks at events such as IWMW 2010. Chris added that “You do wonder sometimes, what you’ve said to provoke this, halfway through the talk: ‘Odd feeling. In one moment inspired. Then, deflated’”.  Chris need wonder no longer: she can simply search for “Odd feeling” and then skip back to see what she said which perhaps provoked that remark (although, of course, the remark may have been made in response to another tweet, a comment received via email or even a real world event).

But what else could be done to make this service even better? Some suggestions from me:

  • Search across a group of related captioned videos (e.g. all videos from IWMW 2010).
  • A RESTful interface, so I can provide a URL for the point in the video when a user tweeted “Odd feeling“.
  • Support for legacy browsers, so that captioned videos with search capabilities can be provided in a much greater range of browsers (although I recognise that this problem is due to the complexities of video codecs and browser support and can’t be fixed by the Twitter captioning service!).

I wonder how achievable these suggestions are? And do others have additional suggestions?