Thursday 18 October 2012

New Scientist takes note of our prototype for User-Generated Content

Our colleagues at CWI, who were part of the development team for our prototype system for the automatic compilation of videos from user-generated content, recently chatted with New Scientist magazine to tell them a little more about it. Here's the article: http://www.newscientist.com/article/mg21528835.800-video-mashups-give-you-personalised-memories.html

The narrative engine developed by the Narrative and Interactive Media (NIM) group at Goldsmiths works from an authored symbolic representation of the story surrounding a social event - in the case of our prototype, a school concert. Through a web-based app, users could interact with a personalised movie compiled from the video footage shot and uploaded by the friends and family who attended and filmed the concert.
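The paper mentioned below goes into the details, but to give a flavour of what an authored symbolic representation might look like, here is a much-simplified Python sketch. The class names, fields and the "prefer clips featuring the viewer's child, then take the longest" selection rule are illustrative placeholders of my own, not the actual NSL schema or the engine's selection logic.

    # Much-simplified sketch of a symbolic story model for a school
    # concert. Names and structure are illustrative placeholders,
    # not the actual NSL schema.
    from dataclasses import dataclass, field

    @dataclass
    class Clip:
        contributor: str      # who uploaded the footage
        start: float          # offset into the event timeline, in seconds
        duration: float       # length of the clip, in seconds
        subjects: list = field(default_factory=list)  # who appears in it

    @dataclass
    class StoryEvent:
        label: str            # e.g. "opening", "solo", "applause"
        clips: list = field(default_factory=list)

    # The authored representation fixes the order of events in the
    # concert; the engine then chooses clips within each event to
    # suit the individual viewer.
    story = [StoryEvent("opening"), StoryEvent("first song"),
             StoryEvent("solo"), StoryEvent("finale")]

    def personalise(story, viewer_child):
        """Prefer clips featuring the viewer's child; else any clip."""
        edit = []
        for event in story:
            preferred = [c for c in event.clips if viewer_child in c.subjects]
            pool = preferred or event.clips
            if pool:
                edit.append(max(pool, key=lambda c: c.duration))
        return edit

    story[2].clips.append(Clip("dad_cam", 120.0, 15.0, subjects=["Alice"]))
    story[2].clips.append(Clip("school_cam", 118.0, 30.0, subjects=["choir"]))
    print([c.contributor for c in personalise(story, "Alice")])  # ['dad_cam']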

I am about to present a paper - 'Interactive Video Stories from User Generated Content: a School Concert Use Case' - about building the algorithm for the system at the upcoming International Conference on Interactive Digital Storytelling (ICIDS) in San Sebastian, Nov 12-15. In the paper I go into more detail about how the notion of a 'sequence' was borrowed from existing TV editing practice and used to create a flexible computer-based story model built on a previous NIM development, Narrative Structure Language (NSL). NSL itself is described in a publication in the ACM Multimedia Systems Journal.
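A 'sequence' here means a run of shots covering one continuous piece of action, with cuts between the available cameras. As a very rough sketch of that idea (the fixed cut interval and round-robin camera alternation are placeholder assumptions of mine, not the rules described in the paper):

    # Rough sketch of a 'sequence': cover one continuous piece of
    # action, cutting between the available cameras every few seconds.
    # The fixed cut interval and round-robin alternation are
    # placeholder assumptions, not the rules in the paper.
    def build_sequence(cameras, event_start, event_end, cut_every=4.0):
        """Return (camera, start, end) segments covering the event."""
        if not cameras:
            return []
        timeline, t, i = [], event_start, 0
        while t < event_end:
            segment_end = min(t + cut_every, event_end)
            timeline.append((cameras[i % len(cameras)], t, segment_end))
            t, i = segment_end, i + 1
        return timeline

    # Example: three relatives filmed the solo from 120s to 135s.
    print(build_sequence(["mum_cam", "dad_cam", "gran_cam"], 120.0, 135.0))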
