Research & Development

Posted by Andy Brown and Jake Patterson

Here at BBC R&D we have been exploring the possibilities of immersive forms of content, including 360° video and virtual reality (VR), where the viewer can look in any direction. We have been researching how people direct their attention and how to overlay information, and working with production teams to understand what types of content work well in this new medium. However, 360° and VR content also throw up accessibility challenges. In conventional television we can assume that people are looking directly at a TV screen in front of them, and the subtitles (closed captions) are displayed within the image. So what do we do with the subtitles in a 360° experience, where the viewer can be looking in any direction?

We have been running user tests to find some initial answers to this question. We have been trying out four different ways of placing the subtitles in the scene, each of which we anticipate will have both strengths and weaknesses:

  • Subtitles are placed into the scene in three fixed positions, equally spaced 120° apart around the video and slightly below the eye line. This technique is very easy to implement – indeed, it can be “burnt-in” to the video – and the subtitles are in a known location and visible wherever the viewer looks. However, the text may not be in the optimum position for reading.
  • The subtitles are presented as a ‘head-up display’, always in front of you and slightly below straight ahead. As you turn your head, the subtitle moves with you, always at the same location in the headset display. This technique ensures that the subtitle is always visible and easy to locate, but it potentially covers up an important part of the scene. Anecdotally, the ‘head-up display’ style is said to be uncomfortable, and potentially linked with virtual reality sickness (although our initial results don’t seem to support this).
  • The subtitles follow your gaze around, but only for larger head movements: if you look slightly left or right the subtitle stays in place, but looking further will cause it to catch up with your head orientation (one way of implementing this is sketched after this list). This is a similar approach to the one above, with a lag that is intended to remove the feeling that the text is “fixed to your face”. Potential disadvantages are that users may find the lag unpredictable, and may not be able to read the text very easily while it catches up.
  • Each subtitle is placed in the scene in the direction you are looking at the time it appears, and remains fixed at that location in the scene until it disappears. This approach means that subtitles can be read quickly and then easily disregarded, although they may constrain how freely the viewer looks around the scene while listening to a description.

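To make the third behaviour more concrete, the sketch below shows one way the ‘stay put, then catch up’ logic could work. This is not the code from our prototype (that was built in Unreal, as described later); the dead-zone size, catch-up speed and names are illustrative assumptions.

```cpp
// Minimal sketch of a "lazy follow" subtitle: small head movements are
// ignored, but once the viewer's yaw drifts beyond a dead zone the subtitle
// eases back towards the centre of view. Thresholds and names are
// illustrative, not taken from our Unreal prototype.

#include <algorithm>
#include <cmath>
#include <cstdio>

// Smallest signed difference from one yaw angle to another, in degrees.
static double AngleDelta(double fromDeg, double toDeg)
{
    double d = std::fmod(toDeg - fromDeg, 360.0);
    if (d > 180.0)  d -= 360.0;
    if (d < -180.0) d += 360.0;
    return d;
}

struct LazySubtitle
{
    double yawDeg = 0.0;             // where the subtitle sits in the scene
    double deadZoneDeg = 30.0;       // head movements smaller than this are ignored
    double catchUpDegPerSec = 90.0;  // how quickly the text catches up

    // Call once per rendered frame with the viewer's current head yaw.
    void Update(double headYawDeg, double deltaSeconds)
    {
        const double offset = AngleDelta(yawDeg, headYawDeg);

        // Inside the dead zone: the subtitle stays fixed in the scene.
        if (std::abs(offset) <= deadZoneDeg)
            return;

        // Outside the dead zone: ease towards the head direction, stopping at
        // the edge of the dead zone so the movement trails rather than snaps.
        const double excess = std::abs(offset) - deadZoneDeg;
        const double step = std::min(excess, catchUpDegPerSec * deltaSeconds);
        yawDeg += (offset > 0.0 ? step : -step);
    }
};

int main()
{
    LazySubtitle subtitle;
    // Simulate the viewer turning 90° to the right at 1.5° per frame (60 fps).
    for (int frame = 0; frame <= 120; ++frame) {
        const double headYaw = std::min(90.0, frame * 1.5);
        subtitle.Update(headYaw, 1.0 / 60.0);
        if (frame % 30 == 0)
            std::printf("head %6.1f  subtitle %6.1f\n", headYaw, subtitle.yawDeg);
    }
    return 0;
}
```

The key design choice in this sketch is that the text never snaps: it trails behind the head and stops at the edge of the dead zone, so small glances never disturb it.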

BBC R&D - Enhancing Subtitles

Our aim is to find out how well each technique works for subtitle users, and whether the benefits and problems they identify match our expected advantages and disadvantages.  While we are also interested in exploring how we can position subtitles according to who is speaking, this would have introduced too many variables to test at one time.

Scene from a VR experience with subtitles

Our tests were built using the Unreal game engine, which plays the video through an Oculus Rift headset. It can detect where the viewer is looking, and displays the subtitles according to one of the four behaviours. Our test footage consisted of a set of short 360° video clips covering a range of scenarios, from a Planet Earth video of the Arizona desert with a narrator out of shot to scenes with one or two speakers in view along with other items of interest.

An image of VR user testing being conducted in the R&D lab.

We recruited 24 people who habitually use subtitles when watching television to come into the lab and give us their opinions. Each person viewed the set of four videos, each with a different subtitle behaviour. The subtitle behaviours were rotated around the videos to balance the tests (a simple rotation of this kind is sketched below). We asked them about their experience of each video: how easy they found it to locate and read the subtitles, and how easy it was to follow the clip.
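
For illustration, the sketch below shows the kind of rotation we mean by ‘balanced’: across the group of participants, each behaviour is paired with each video equally often, in a simple Latin-square style. This is not our actual test script, and the video and behaviour labels are placeholders.

```cpp
// Illustrative rotation of four subtitle behaviours across four videos so
// that each behaviour is seen with each video equally often over the group
// of participants. Labels are placeholders, not our real clip names.

#include <array>
#include <cstdio>

int main()
{
    const std::array<const char*, 4> videos = {
        "Desert narration", "Single speaker", "Two speakers", "Busy scene"};
    const std::array<const char*, 4> behaviours = {
        "Fixed 120-degree positions", "Head-locked", "Lagged follow",
        "Appears where gazed"};

    const int participants = 24;  // six complete rotations of the four behaviours
    for (int p = 0; p < participants; ++p) {
        std::printf("Participant %2d:\n", p + 1);
        for (int v = 0; v < 4; ++v) {
            // Shift the behaviour assignment by one position per participant.
            const int b = (v + p) % 4;
            std::printf("  %-18s -> %s\n", videos[v], behaviours[b]);
        }
    }
    return 0;
}
```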

We are now analysing the results and are looking to see if there are one or two of these subtitle behaviours that work for most or all participants. We will also be looking to identify any improvements that can be made, either to the appearance or the behaviour of the subtitles. For our next steps, we hope to make a subtitled clip available on BBC Taster, to get feedback from a wider audience on a longer clip, and to explore how we implement our behaviour of choice so that all BBC 360° content can potentially be offered with subtitles.

Update, Oct 2017: We have now written about the results of our User Testing.


More on Virtual Reality and 360° Video:

BBC Connected Studio - Watch 20 minute talks from our experts on Virtual Reality and 360 Video from #BBCVR day

BBC R&D - 8 Tips for Producing VR Projects

BBC R&D - Factual Storytelling Tips for 360 Video

BBC News Labs - 5 Lessons in VR

YouTube - 360 Video from BBC R&D

BBC R&D - Enhancing 360 Video with Graphics in the Large Hadron Collider

BBC R&D - Unearthed - Interactive 360 Sound and Video in a Web Browser

BBC R&D - Why is BBC R&D interested in Virtual Reality?

About the BBC - Exploring VR and immersive video

BBC R&D - Virtual Reality Sound in the Turning Forest

BBC R&D - A Virtual Reality Fairy Tale Premiered at Tribeca Film Festival
