Thursday 27 September 2012

Exploring orchestrated performance @ University College Falmouth


We recently went down to visit the excellent facilities at University College Falmouth (UCF) as part of our ongoing work in the VConect project. The University and its research wing are just one of the institutions benefiting from the rollout of superfast broadband in Cornwall, currently being achieved with the help of BT. Such an infrastructure of course lends itself well to the requirements of the VConect project and its vision of incorporating high-quality, intelligent video into social networking experiences.

You may be familiar with some of our work on orchestration in TA2. The VConect system builds on that project's research into video conferencing for the home, extending it to two different use cases – a socialisation use case based on computers in the home, and a performance use case based on creating linked performance spaces using video-mediated communication. UCF are exploring the requirements of such a platform using their existing infrastructure, with the help of dance and acting students from their courses.

When we visited we were treated to a demonstration of work in progress. This particular experiment looked at the possibility of creating the impression of being ‘side by side’ in the same space. The future role of orchestration will be to decide how best to use one or more screens and projectors given the multiple camera feeds available. The challenge will become even greater towards the end of the project as the two strands – performance and home – are merged. How best to represent a mediated performance such as this to an audience of many people sitting at their computers at home?

Wednesday 5 September 2012

NIM’s work in the recently completed TA2 project was featured in The Register.

In TA2, we hypothesised that the use of multiple cameras in each location could facilitate such communication experiences. However, this generated a new problem: how should content from multiple cameras be shown on each particular screen? One answer is to have an automatic decision-making process that is aware of the conversation flow and able to represent it by controlling the cameras and mixing their content onto the screen of each location. This is similar to what a TV director would do to cover a live event, but more complex, as each room effectively needs its own dedicated director choosing from the shots available in the other rooms. We called this process orchestration. The overriding assumption was that, if we could build an intelligent system capable of orchestration, then a vibrant and engaging representation of the shared activity would be achieved on each screen, ultimately leaving the participants immersed in the communication experience, unaware of the technology and devices through which it takes place.
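To give a flavour of what such a per-room director might look like, here is a minimal, hypothetical sketch in Python – not the TA2 implementation, whose decision logic was far richer – in which each room's director picks the remote camera with the most vocal activity and holds the shot briefly to avoid restless cutting. All names and parameters here are illustrative.

```python
import time
from dataclasses import dataclass

# Hypothetical sketch only: one "director" per room picks which remote
# camera feed to show on that room's screen, driven by voice-activity
# scores, with a minimum hold time so it does not cut on every utterance.

@dataclass
class Director:
    room: str                      # room whose screen this director drives
    min_hold: float = 2.0          # seconds to hold a shot before cutting again
    current_shot: str | None = None
    last_cut: float = 0.0

    def choose_shot(self, activity: dict[str, float], now: float | None = None) -> str | None:
        """Pick the remote camera with the strongest speaker.

        `activity` maps camera ids in *other* rooms to voice-activity
        scores in [0, 1]; this room's own cameras are excluded upstream.
        """
        if now is None:
            now = time.monotonic()
        if not activity:
            return self.current_shot
        best = max(activity, key=activity.get)
        # Cut only if the shot changes and the hold time has elapsed
        # (or no shot has been chosen yet).
        if best != self.current_shot and (
            self.current_shot is None or now - self.last_cut >= self.min_hold
        ):
            self.current_shot = best
            self.last_cut = now
        return self.current_shot

# One director per room, each choosing from the other rooms' feeds.
director = Director(room="room_A")
print(director.choose_shot({"cam_B1": 0.8, "cam_B2": 0.1}, now=0.0))  # cam_B1
print(director.choose_shot({"cam_B1": 0.2, "cam_B2": 0.9}, now=1.0))  # cam_B1 (held)
print(director.choose_shot({"cam_B1": 0.2, "cam_B2": 0.9}, now=3.0))  # cam_B2
```

The hold time is what keeps the output from feeling jittery; a real orchestrator would also weigh shot variety, framing and other conversational cues, but the basic shape is a per-room shot-selection policy along these lines.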

The Register seemed to agree, describing our system as being a bit like Minority Report.
To find out more about our research into orchestration, check out our latest public deliverable on the subject.