
Dual Screen Experiences: the Secret Fortune Pilot and the Sync API


Steve Jolly | 10:30 UK time, Tuesday, 18 October 2011

Our team here at BBC R&D has spent the last couple of years working on ways to take advantage of the increasing number of smartphones, tablets and laptops in the homes of BBC audience members to enhance our TV and radio programmes and the way people interact with them. I've blogged previously about this area of work, including the Universal Control API, and the ways in which it could replace remote controls and enable what my colleague Jerry Kramskoy has called "orchestrated media" experiences, which consist of media presented on multiple devices that are synchronised to one another.

In that earlier blog post, I mention that technologies exist that are already being used to synchronise media on mobile devices to television programmes, such as audio watermarking and delivering synchronisation information via the Internet. The advantage of these solutions is that they require no modifications to the set-top box or television. A common disadvantage is that content on other devices can only follow what happens on the television, and not vice versa. In the longer term, we believe that a technology like Universal Control offers very significant advantages in this regard, but we recently took advantage of an opportunity to work with colleagues from across the BBC to investigate some of these existing methods of synchronisation, to see what kinds of "dual screen" experience might be possible today.

Our most significant contribution to the work has been an API to permit the developers of dual-screen applications to ignore the details of specific synchronisation technologies. It provides a standard interface, behind which any number of information sources may be working (individually or together) to provide the application with information about what programme the user is watching (if any), and what events are occurring in it that might trigger synchronised behaviour. This approach helps the BBC avoid getting locked into using the technology of a specific supplier, and helps "future-proof" applications: as new synchronisation technologies become available, little or no extra effort is likely to be required for existing applications to be able to make use of them. (Of course, one of the sources of information could be a Universal Control server on the set-top box...)
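To make that abstraction concrete, here is a minimal sketch in TypeScript of what such an interface might look like. It is illustrative only, not the real library: the names and shapes (SyncEvent, SyncSource, SyncClient, programmeId) are assumptions.

```typescript
// Illustrative sketch only: the names and shapes here are assumptions, not the real API.

// A single piece of synchronisation information, whatever its origin.
interface SyncEvent {
  programmeId: string; // which programme the user appears to be watching
  eventName: string;   // e.g. "question-1-start"
  confidence: number;  // how sure the source is, from 0 to 1
}

// Anything that can tell us about the programme: an audio watermark detector,
// Internet-delivered messages, or (one day) a Universal Control server.
interface SyncSource {
  name: string;
  start(onEvent: (event: SyncEvent) => void): void;
  stop(): void;
}

// The application only ever talks to this aggregator, never to a source directly.
class SyncClient {
  private sources: SyncSource[] = [];
  private listeners: Array<(event: SyncEvent) => void> = [];

  addSource(source: SyncSource): void {
    this.sources.push(source);
    source.start((event) => this.listeners.forEach((listener) => listener(event)));
  }

  onEvent(listener: (event: SyncEvent) => void): void {
    this.listeners.push(listener);
  }

  stop(): void {
    this.sources.forEach((source) => source.stop());
  }
}
```

Supporting a new synchronisation technology then amounts to writing another SyncSource; the application code that reacts to events does not change.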

A library implementing the sync API can be integrated into BBC apps and websites to provide them with event triggers and programme identification information from multiple possible sources. At present, those sources are the audio stream of a television programme or the Internet, but others are anticipated in the future. The library also has an internal timer that can trigger events at fixed intervals after those received from external sources.

A high-level overview of the sync API. A library implementing the API can be embedded into BBC apps and websites to provide them with information about the programme a user is watching on their television and trigger events at predefined points within that programme.
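The internal timer mentioned above can be illustrated with a similarly hypothetical sketch: once an external source has fired an event, follow-on events are triggered at fixed offsets without any further external input. The offsets and event names below are invented for the example.

```typescript
// Sketch of the "internal timer" idea: an externally triggered event schedules
// further events at fixed offsets, with no new watermark or message required.

type EventHandler = (eventName: string) => void;

class OffsetTimer {
  private pending: ReturnType<typeof setTimeout>[] = [];

  // Schedule follow-on events a fixed number of milliseconds after "now".
  scheduleAfter(offsets: Array<{ name: string; delayMs: number }>, handler: EventHandler): void {
    for (const { name, delayMs } of offsets) {
      this.pending.push(setTimeout(() => handler(name), delayMs));
    }
  }

  cancelAll(): void {
    this.pending.forEach(clearTimeout);
    this.pending = [];
  }
}

// Example: a detected "question-start" trigger schedules the answer reveal
// 20 seconds later, even if no further external trigger is received.
const timer = new OffsetTimer();
timer.scheduleAfter(
  [{ name: "question-answer-reveal", delayMs: 20_000 }],
  (name) => console.log(`fire: ${name}`),
);
```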

This autumn, we've been testing our API design as part of a closed BBC pilot accompanying the National Lottery show Secret Fortune. Up to two hundred viewers have been taking part, playing along with the quiz on their smartphone or tablet devices to see whether they can do better than the people playing the game in the studio.

Although we have learned many things from the pilot about the practicalities of this kind of dual screen experience and how it can improve the programmes it accompanies, from our R&D perspective the most important lessons have been regarding the performance of the synchronisation technologies and the usefulness of our API. For this pilot, we chose to test two technologies: an audio watermarking system from a commercial supplier, and an XMPP-based system for delivering synchronisation information via the Internet that we developed ourselves, written by Duncan Robertson of the BBC R&D Prototyping team.

A mobile phone showing a question screen from the Secret Fortune companion application.  In the background, a television showing the programme can be seen.

The Secret Fortune dual-screen application in action.
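We haven't described the format of the Internet-delivered messages here, so the sketch below simply assumes a JSON payload; the field names are illustrative. The point is that however a message arrives, it is mapped into the same kind of event that the watermark source produces, so the application never needs to know which source it came from.

```typescript
// Sketch of an Internet-delivered synchronisation source. The payload format is
// an assumption: the messages are only known to have been carried over XMPP.

interface IncomingMessage {
  body: string; // e.g. '{"programmeId":"secret-fortune-ep4","eventName":"question-1-start"}'
}

// Map an incoming message into the same event shape the rest of the API uses.
function toSyncEvent(message: IncomingMessage): { programmeId: string; eventName: string } | null {
  try {
    const parsed = JSON.parse(message.body);
    if (typeof parsed.programmeId === "string" && typeof parsed.eventName === "string") {
      return { programmeId: parsed.programmeId, eventName: parsed.eventName };
    }
  } catch {
    // Malformed message: ignore it rather than break the experience.
  }
  return null;
}
```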

Even just from this technical perspective, we learned a number of useful things from the pilot. For example, audio watermarks take a finite amount of time to detect, which prevented us from implementing any watermark-driven synchronised behaviour at the very beginning of a programme (or programme segment). Also, while it is obvious that signalling events in programmes from a central server on the Internet can only work for people watching the programme as it is broadcast, it is perhaps less obvious that people watching on different broadcast platforms (eg Freeview, Freesat or analogue TV) see a given part of the programme at slightly different times, which reduces the synchronisation accuracy accordingly.
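A rough back-of-the-envelope illustration of that second point, with invented figures rather than measurements from the pilot: if a single server-side trigger has to serve viewers on several platforms, the residual error is at best half the spread between the fastest and slowest platform.

```typescript
// Illustrative arithmetic only: the delays below are assumptions, not measured values.
// A server-triggered event fires at the same instant for everyone, but viewers on
// different broadcast platforms see the corresponding picture at slightly different times.

const PLATFORM_DELAY_MS: Record<string, number> = {
  freeview: 0,    // take one platform as the baseline
  freesat: 800,   // assumed extra delay relative to the baseline
  analogue: -400, // assume analogue is slightly ahead of the baseline
};

// If the server fires once and aims at the midpoint of the spread, the residual
// error is half the gap between the fastest and slowest platform.
function residualSyncErrorMs(delays: Record<string, number>): number {
  const values = Object.values(delays);
  return (Math.max(...values) - Math.min(...values)) / 2;
}

console.log(residualSyncErrorMs(PLATFORM_DELAY_MS)); // 600ms with these assumed figures
```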

From the perspective of our own work, we have been very happy with the performance of our prototype API implementation. The team developing the play-along smartphone and tablet app found it extremely straightforward to integrate, and the ability to turn synchronisation sources on and off by changing a web-based configuration file has been critical to testing those different sources without requiring members of the pilot to install new versions each week.
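The real configuration format isn't described here, but a web-hosted file along these lines would be enough to switch sources remotely; the URL, field names and fetch-based loading in this sketch are assumptions.

```typescript
// Sketch of a web-hosted configuration controlling which synchronisation sources
// are active. Everything here is illustrative rather than the real format.

interface SyncConfig {
  sources: {
    audioWatermark: boolean;
    internetMessages: boolean;
  };
}

async function loadConfig(url: string): Promise<SyncConfig> {
  const response = await fetch(url);
  return (await response.json()) as SyncConfig;
}

// Editing the hosted file, e.g. {"sources":{"audioWatermark":false,"internetMessages":true}},
// switches sources for every participant without shipping a new app version.
loadConfig("https://example.org/sync-config.json").then((config) => {
  console.log("watermark source enabled:", config.sources.audioWatermark);
});
```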

In addition, by using the abstraction inherent to the API to decouple the times at which "events" (such as the start of a question) are triggered from the times at which the audio watermarks are detected or the XMPP messages are received, we have been able to improve the robustness of the experience (by allowing an event to be triggered even if certain watermarks or messages are missed). At the same time, this gave us the opportunity to fine-tune the timing of events right up to the point of broadcast.
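A simplified sketch of that decoupling, with an invented event list: whichever synchronisation point arrives (a detected watermark or an XMPP message) establishes the current programme position, and the remaining events are scheduled relative to it, so a missed watermark does not mean a missed event.

```typescript
// Sketch of decoupling event times from sync-point times. The event list is
// invented; in practice it could be edited right up to the point of broadcast.

interface TimedEvent {
  name: string;
  atMs: number; // position in the programme timeline
}

const eventList: TimedEvent[] = [
  { name: "question-1-start", atMs: 60_000 },
  { name: "question-1-reveal", atMs: 80_000 },
  { name: "question-2-start", atMs: 120_000 },
];

// Any sync point tells us where we are in the programme; everything still to come
// is scheduled from that position rather than from its own watermark or message.
function scheduleFromSyncPoint(
  programmePositionMs: number,
  events: TimedEvent[],
  fire: (name: string) => void,
): void {
  for (const event of events) {
    const delay = event.atMs - programmePositionMs;
    if (delay >= 0) {
      setTimeout(() => fire(event.name), delay);
    }
  }
}

// A sync point tells us we are 70s into the programme; the reveal for question 1
// and everything after it still fires at the right moment.
scheduleFromSyncPoint(70_000, eventList, (name) => console.log(`fire: ${name}`));
```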

Our existing API is inherently asymmetric: it only delivers event information in one direction, from the television to the mobile device. Over the course of the next few months we will be identifying technologies, protocols or new APIs that could bring more 'symmetric' experiences to the user - look out for further blog posts on this subject. We will also be working on documenting our work and the knowledge we've gained from the pilot - just in case the BBC decides to take this idea further...

Comments

  • Comment number 1.

    Hey Steve,

    Great article!! We too have been working on this real-time, bi-directional second screen experience tied to broadcast events. Technology is moving extremely fast in this area - it's fantastic to watch the progress.

    I can't wait to read more about further advances you have made with your API. We talked about that in the past; however, you and the BBC have a different perspective to a private organization such as ours.

    PS. you can see our system at work on www.HyperFantasyFootball.com
