Designing for second screens: The Autumnwatch Companion
R&D Prototyping recently teamed up with Autumnwatch, a popular ³ÉÈËÂÛ̳ Nature programme, to test a live second screen concept called the TV Companion. It is a real-time web application that explores how second screens might enhance live factual TV shows with additional web content and external links.
The one-off live experiment ran during a single broadcast with three hundred viewers, which gave us lots of useful feedback. In November Tristan Ferne wrote a blog post that gives an overview of the project. In this post I look at the design considerations raised by the prototype. The following video demonstrates it in action.
The design challenge
Lots of people watch TV while using a PC or mobile to pursue unrelated activities like shopping or chatting on Facebook. A proportion actively searches the web for information related to what they're watching. Our design challenge was to support that desire for more information and to gather feedback from a significant user group. Autumnwatch has a particularly loyal and active audience, and peaks of activity on the website during broadcasts tell us they're already second-screening for related nature content. The show aims to involve the whole family, which also gave us scope to study the shared viewing experience.
Our research interests
By building this prototype we aimed to answer a number of questions:
- How does a second screen affect a viewer's attention and enjoyment of a show?
- Can second screens support in-room social interaction?
- How much content is appropriate for the second screen, and what types of content and presentation are useful?
- What effort is required to generate the second screen content?
- Are there opportunities for new programme formats?
The interaction model
The Companion delivered content in a linear sequence so that it stayed relevant to the subject matter throughout the show. It was manually controlled from the Bristol studio, where we sat on the night. After the show the Companion morphed into a menu for revisiting the content and a springboard of web links for exploring subjects in more depth.
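To make that push model concrete, here is a minimal sketch of how a studio operator could broadcast the next item to every connected second screen at the same time. It assumes a Node.js server and Server-Sent Events; the Companion's actual transport and content schema aren't described in this post, so the names and fields below are illustrative only.

```typescript
import { createServer, ServerResponse } from "http";

// Illustrative shape for one piece of Companion content.
interface CompanionItem {
  chapter: string;   // chapter title the item belongs to
  title: string;
  body: string;      // short, glanceable copy
  links?: string[];  // external links surfaced after the show
}

const clients = new Set<ServerResponse>();

// Called by the studio operator to advance the linear sequence:
// every connected second screen receives the same item at the same time.
function pushItem(item: CompanionItem): void {
  const frame = `data: ${JSON.stringify(item)}\n\n`;
  for (const res of clients) res.write(frame);
}

createServer((req, res) => {
  if (req.url === "/stream") {
    res.writeHead(200, {
      "Content-Type": "text/event-stream",
      "Cache-Control": "no-cache",
      Connection: "keep-alive",
    });
    clients.add(res);
    req.on("close", () => clients.delete(res));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);

// Example: push the next item when the show moves on to a new subject.
pushItem({
  chapter: "Migration",
  title: "Whooper Swans",
  body: "Short, glanceable copy goes here.",
  links: ["https://www.bbc.co.uk/nature"],  // placeholder link
});
```

In the browser, an EventSource pointed at /stream would receive each item as it is pushed and render it in place of the previous one.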
Gathering feedback
The Companion was built on the findings from two earlier prototypes that tested the concept and the delivery technology. The early focus groups confirmed that the audience understood and enjoyed the concept; one participant described it as part of a broadcast continuum: "the TV is step one, this is step two and the website is step three". These users were invited back as our 'advanced' user group, along with several hundred others recruited for the purpose. On the night all users were asked to rate the content during the show and to complete a questionnaire afterwards. The image below shows an example of diagrammatic content with the rating buttons at the top right of the screen.
The Big Issues
Distraction and anxiety
A synchronised service was a novel experience for our test group. People enjoyed the content, but dividing attention between screens left users feeling distracted from the programme and anxious that they were missing out. However, feedback from the advanced users suggests that with previous experience viewers feel more at ease, treating the second screen as an optional enhancement and a useful resource for later.
Signposting the content sequence
Second screens present a new challenge for programme makers: the addition of other screens means that a single narrative is shared between devices. An integrated second screen programme model would include presenter calls to action and TV overlay graphics. As a small experiment we couldn't influence the action on the TV, so we opted for second screen chapter titles and a message panel to signal the progression of content. The image below shows the sequence from chapter title, to content, to a message introducing the next chapter title.
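As a sketch of that signposting structure, the sequence could be modelled as a small set of event types: a chapter title card, the content items within it, and a closing message pointing to the next chapter. This is illustrative only; the event names and fields are assumptions, not the prototype's actual data model.

```typescript
// Illustrative event types for the chapter-title / content / message sequence.
type SignpostEvent =
  | { kind: "chapter"; title: string }
  | { kind: "content"; title: string; body: string }
  | { kind: "message"; text: string };

// One chapter's worth of events, pushed out in order during the show.
const exampleChapter: SignpostEvent[] = [
  { kind: "chapter", title: "Migration" },
  { kind: "content", title: "Why birds migrate", body: "Short, glanceable copy." },
  { kind: "message", text: "Coming up next: Habitats" },
];

// Simple renderer showing how each event kind is presented differently.
function render(event: SignpostEvent): string {
  switch (event.kind) {
    case "chapter":
      return `== ${event.title} ==`;
    case "content":
      return `${event.title}\n${event.body}`;
    case "message":
      return `(${event.text})`;
  }
}

exampleChapter.forEach((e) => console.log(render(e)));
```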
Content creation
Live television is difficult to pin down when topics, story angles and the running order are in flux until the moment of broadcast. Putting a Multiplatform interactive representative inside the core production team is a must when you need to keep in touch with filming events and editorial changes. We accounted for these issues by preparing content generic enough to fit broad themes yet detailed enough to appeal to the widest audience.
Appropriate types of content
The content in the Companion was designed for universal appeal, whatever the participants' subject knowledge and level of interest. We wanted a playful interface with tactile interactions that would particularly appeal to touch screen users. The feedback from the focus groups and the Autumnwatch production team guided us towards lightweight, glanceable content, so video, sound and long form text were deemed unsuitable in the second screen context.
We were interested to see how much of the existing online content could be reused, which was easy for the species and habitat information drawn from the ³ÉÈËÂÛ̳ Wildlife Finder, but articles and blog posts had to be simplified. This gave us the opportunity to experiment with presentation formats and interactions, including diagrams, image galleries and animations. The diagram below shows our thinking.
Content mockups
We created demo videos to explore the potential for playful interaction. We find video demos an efficient method of investigation, helping us to express our ambition for the user experience while giving solid examples from which to assess technical feasibility. The video below describes two tactile interactions: a Top Trumps style 'leaderboard', based on presenter Chris Packham's bird ratings, and a user controlled 'action replay' to show an unusual bird behaviour in slow motion.
The leaderboard shown in the video was appreciated for its tactile qualities, but was thought too distracting during a programme.
Findings
The experiment ran successfully with few technical hitches, and was likened to an encyclopedia being opened at the right page. Participants described a closer connection to the show and found the content most useful when it reflected their interests or filled less engaging moments in the show.
92% of the viewers also reported an increased understanding of the topics covered. Inevitably some viewers wanted more detail, whereas 60% felt that it was pitched about right. There was a fifty-fifty split between those who chose to view the content during the show and those who would prefer to use the Companion afterwards, with 70% inspired enough to follow the external website links.
From the rating button data we saw that the diagrams, location maps, how-to guides and photo galleries were the most relevant and enjoyable content, whereas the weather report, presenter biography and habitat information were the least rewarding. Interestingly, the presenter's reference to Whooper Swans at the end of the show resulted in the most visited external link after the show, demonstrating the power of on-air cues.
Recommendations
Self contained pages
The Companion is intended to be a hand-held source of glanceable information viewed at a comfortable distance. However, the shift of focal distance between the screens did cause eyestrain for some users.
Our design and layout principles suggest short pages that remove the need for scrolling, along with simple interactions and navigation that give the user control. Full screen views can be expected on devices like an iPad, where it's typical for one application to be visible at a time. This affords a definable space in which to design cohesive layouts, without additional screen furniture to distract from the content. Our prototype ran in a browser window that required the user to stay on the same page throughout the show. This 'locking out' from other activities was negatively received, but as a native app it would not have been perceived as an issue.
Suitable navigation
The debate between broadcaster-directed and user-determined content navigation isn't resolved here, as this depends on the genre and the role of the second screen. The mixed feedback about the time allowed to view the Companion content could be resolved by letting the user roam between the content more freely. Whichever model is employed, the need for a stronger connection between the action on the TV and the second screen device is clear. If and when a system like this becomes an everyday service, audiovisual cues on the TV will go a long way towards easing the problems of content awareness and distraction.
It's all about timing
The timing of content is critical and further work is required to refine a synchronised model. The Companion served new content every three to five minutes. Two thirds of our respondents felt the speed was about right, whereas a quarter felt it was too slow. Second screen applications should balance the frequency of updates to retain the user's interest without overwhelming them. Our rule of thumb: don't distract the user when there's drama on the TV. It's better all round to wait for the quiet points and natural breaks in the show, when the second screen can help retain viewers who might need the additional stimulation.
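That rule of thumb could be expressed as a simple pacing check on the studio side, as in the hedged sketch below. The three-to-five-minute window comes from the prototype; the 'drama on screen' flag and the function names are assumptions for illustration.

```typescript
// Pacing constants taken from the prototype's three-to-five-minute rhythm.
const MIN_GAP_MS = 3 * 60 * 1000;
const MAX_GAP_MS = 5 * 60 * 1000;

let lastPushAt = 0;            // timestamp of the last item pushed
let dramaOnScreen = false;     // flipped by the operator watching the show (assumed flag)

// Only push when the TV has reached a quiet point and enough time has passed.
function shouldPushNow(now: number): boolean {
  if (dramaOnScreen) return false;        // never compete with drama on the TV
  return now - lastPushAt >= MIN_GAP_MS;  // wait at least three minutes between items
}

// Prompt the operator if the gap is growing longer than the target window.
function operatorPrompt(now: number): string | null {
  return now - lastPushAt > MAX_GAP_MS ? "Consider pushing the next item" : null;
}
```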
Make content relevant
Responding to the organic TV production process, especially for a live show, was tricky and led us to design content that was broadly in line with the show's themes. Content felt most appropriate when it was a natural extension of the moment, such as an image gallery following an on-air plug, or when it offered different information from that on the main screen.
Realising the potential for second screen content to be integral to the main programme will require a joined-up production workflow, with a balance between bespoke content and automated selection of existing content. For real-world practicality, content could be dynamically aggregated from production system metadata. And of course data based on the viewer's location or interests would make the experience even more personal.
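As a sketch of what that aggregation might look like, suppose each item in the running order carries subject tags and a simple index maps those tags to pre-prepared Companion cards. The tag names, interfaces and index below are assumptions for illustration, not a real production system or ³ÉÈËÂÛ̳ API.

```typescript
// Illustrative running-order and content shapes (not a real production schema).
interface RunningOrderItem { slot: number; subjectTags: string[] }
interface CompanionCard { title: string; body: string }

// Hypothetical index of existing content keyed by subject tag.
const contentIndex = new Map<string, CompanionCard>([
  ["whooper-swan", { title: "Whooper Swan", body: "Species profile reused from existing online content." }],
  ["red-deer", { title: "Red Deer", body: "Species profile reused from existing online content." }],
]);

// Build a per-slot plan of candidate cards from the running order's metadata.
function aggregate(runningOrder: RunningOrderItem[]): Map<number, CompanionCard[]> {
  const plan = new Map<number, CompanionCard[]>();
  for (const item of runningOrder) {
    const cards = item.subjectTags
      .map((tag) => contentIndex.get(tag))
      .filter((card): card is CompanionCard => card !== undefined);
    plan.set(item.slot, cards);
  }
  return plan;
}

// Example: two slots from a running order, each resolved to candidate cards.
console.log(aggregate([
  { slot: 1, subjectTags: ["whooper-swan"] },
  { slot: 2, subjectTags: ["red-deer", "unknown-tag"] },
]));
```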
Added benefits of second screen content
Since the live experiment the Companion has been developed further into a video on demand second screen demo using R&D's Universal Controller API. This lets the user skip between chapters within the video via an interactive timeline on the second screen, as well as displaying the Companion content, so the viewer can consume both screens at their own pace. Second screen services show potential to attract smartphone users on the lookout for rich media experiences that mirror their current multitasking behaviour. And finally, the family participants said that the Companion encouraged quality interactions between family members.
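The chapter-skipping interaction boils down to mapping timeline positions on the second screen to seek points on the main screen. The sketch below illustrates that mapping only; the seek call is a stand-in for whatever the Universal Controller API actually exposes, which isn't documented in this post.

```typescript
// Illustrative chapter markers for the on-demand Companion demo.
interface Chapter { title: string; startSeconds: number }

const chapters: Chapter[] = [
  { title: "Migration", startSeconds: 0 },
  { title: "Habitats", startSeconds: 540 },
];

// Stand-in for the call that tells the TV player to jump to a point in the video.
function seekTo(seconds: number): void {
  console.log(`seek main screen to ${seconds}s`);
}

// Tapping a chapter on the second screen timeline seeks the main screen video.
function skipToChapter(title: string): void {
  const chapter = chapters.find((c) => c.title === title);
  if (chapter) seekTo(chapter.startSeconds);
}

skipToChapter("Habitats");
```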
The Autumnwatch website
The Autumnwatch production team has also expressed an interest in how a Companion-type experience could lend itself to a contextually aware homepage, changing state for broadcast, post-show buzz and catch-up. Furthermore, the second screen content, once absorbed within the website, can offer routes back to its originating broadcast.
Conclusions
The Companion experiment created a working example of a second screen service for factual TV and shows that complementary multi-screen services can add to the viewer's enjoyment. Our understanding of how this content should be presented, and of the appropriate level of user control, will develop with the medium. Next up for R&D, we'll be exploring how to create a more sustainable and automated content model for second screens.
Comment number 1.
At 20th Apr 2011, Martin Ericsson wrote: This sounds very interesting, however unfortunately it's not possible to view the videos since they are "Not available in your area" (my particular area being Sweden). Would it be possible for you to change the geo-blocking on these videos? Thanks!