Psycho-physical aspects of VR need to be further evaluated, according to the EBU (photo by Andrew Robles on Unsplash).
The EBU (European Broadcasting Union) has urged the whole video service industry to unite behind Virtual Reality and avoid the mistakes made with previous attempts to deliver new experiences for users, such as 3D.
On the one hand, broadcasters and pay-TV operators need to respond to the challenge posed by the big Internet players and their heavy investments in VR; on the other, they need to balance costs and challenges against opportunities. The EBU has distilled the views and VR deployment plans of its member broadcasters in a newly published report, Virtual Reality: How are public broadcasters using it?
The report was presented at Observatorio VR in Málaga on 21 July, with speakers from some of the Internet players including Google, Samsung, Nokia, Sony PlayStation, and HTC Vive. The EBU noted, for example, how Facebook has acquired Oculus, while Google has invested significantly in its Daydream and Jump projects, with Samsung, Sony and Nokia also spending heavily. At the same time, broadcasters were weighing in with trials and some content already available.
The greatest early activity has involved 360-degree video which can be seen as a stepping stone towards full VR, with 49% of EBU members having made some such surround content available, a further 9% in trials and 19% at least planning to start doing so. When it comes to full VR, currently 19% of EBU members have made some content available with a further 18% at the trial or planning stage. The third category surveyed by the EBU was Augmented Reality (AR), where 7% of EBU members had made some content available and 21% at the trial or planning stage.
Of course, the EBU had to define these categories in order to sample opinion and struggled a bit over VR itself. It conceded that VR could just be 360° video when the content is primarily video based, but that was confusing given this was regarded as a separate category for the survey. It could also include computer-generated VR (CG), when the content is primarily rendered from a 3D model in real time in the user’s device. It could also be both, while incorporating panoramic 2D or stereoscopic images viewed on head-mounted displays.
The EBU was clearer on AR which it defined as an overlay of CG on images of the real world, but with no interaction between them. The real-world content and the CG content are not able to respond to each other, as the EBU put it. Then just to muddy the waters the EBU discussed Mixed Reality (MR) as also an overlay of synthetic content on the real world, but in this case fully integrated to allow interaction between the CG and real-world images.
One of the biggest challenges for VR is reducing the “motion to photon” latency to 20 ms or less to avoid causing nausea (photo by Samuel Zeller on Unsplash).
There is also the idea of immersive VR, which is surely what VR should be about anyway: creating the illusion of being present in the scene. This involves audio as well, which the EBU regards as just as important for the experience as video. The report argued that emerging object-based audio, linking sounds with visual objects such as cars or helicopters in the scene, is merely a step towards realistic 3D, positional, surround audio. Such VR audio should apply spatialization to present sounds from any direction and give users cues about where to look next, through a natural listening experience. For this to work properly, the ability to track the user's head orientation and position will also be crucial.
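The core of that head-tracked spatialization is a simple coordinate change: a sound's fixed position in the world has to be re-expressed relative to wherever the listener is currently facing. As a minimal sketch (illustrative geometry only, not taken from the EBU report or any particular audio engine), the horizontal-plane case looks like this:

```python
import math

def relative_azimuth(source_pos, head_pos, head_yaw_deg):
    """Angle of a sound source relative to where the listener is facing.

    0 degrees means straight ahead; positive angles are counter-clockwise
    (to the listener's left) in the horizontal plane.
    """
    dx = source_pos[0] - head_pos[0]
    dy = source_pos[1] - head_pos[1]
    # Direction to the source in the fixed world frame.
    world_angle = math.degrees(math.atan2(dy, dx))
    # Subtract the head yaw and wrap the result into [-180, 180).
    return (world_angle - head_yaw_deg + 180) % 360 - 180

# A helicopter "north" of a listener who is facing north sits straight ahead:
print(relative_azimuth((0, 10), (0, 0), 90))  # -> 0.0
# After the listener turns to face "east", the same source is 90 degrees
# off to the left, so the renderer must move the sound accordingly:
print(relative_azimuth((0, 10), (0, 0), 0))   # -> 90.0
```

A real renderer would do this in 3D (pitch and roll as well as yaw) and then apply HRTF filtering per source, but the principle of re-anchoring sources against live head-tracking data is the same.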
This is a way off but meanwhile broadcasters face various challenges to implement VR technology available today, including lack of skills, tools and vision, necessarily short viewing times, no established workflows, technical quality still not good enough, lack of distribution network capacity and unknown ROI (Return on Investment).
One of the biggest specific technical challenges for VR lies in reducing the “motion to photon” latency, the amount of time between an input movement, such as the head turning, and the screen updating in response. Research has shown that a value below 20 ms is needed to produce a realistic experience and avoid making users feel sick. Even at a frame rate of 60 Hz the display is updated only every 16.7 ms, so close to the limit that all the other processing steps involved in updating the display would have to be squeezed into the remaining few milliseconds. Frame rates will probably have to be at least 90 Hz.
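The arithmetic behind that conclusion is easy to make concrete: the frame interval alone eats most of the roughly 20 ms allowance, and what is left over is the budget for tracking, rendering and display processing combined. A quick sketch (the 20 ms figure is from the research cited above; the refresh rates are illustrative):

```python
# Motion-to-photon budget: how much headroom each refresh rate leaves
# once the frame interval itself is accounted for.
BUDGET_MS = 20.0

def frame_interval_ms(refresh_hz):
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 90, 120):
    interval = frame_interval_ms(hz)
    headroom = BUDGET_MS - interval
    print(f"{hz:>3} Hz: frame interval {interval:5.2f} ms, "
          f"headroom for everything else {headroom:5.2f} ms")
```

At 60 Hz only about 3.3 ms remain for everything else, which is why 90 Hz (leaving roughly 8.9 ms) is widely seen as the practical floor for comfortable VR.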
Given these challenges, the EBU advocated cautious investment to avoid over-commitment at this early stage, when the standardization scene still resembles the wild west. All the usual bodies have piled in, including the DVB, the W3C with WebVR, and the ITU-R with Advanced Immersive Audio Visual (AIAV) Systems for Programme Production and Exchange. There is also MPEG, with the Omnidirectional Media Application Format (OMAF) standard as well as Media Orchestration (MORE) for video stitching and encoding, and JPEG, developing JPEG XT (omnidirectional photographs), JPEG XS (low-latency compression formats for VR) and JPEG PLENO (light field video format). DASH-IF is planning tests and trials of VR delivery using DASH technology, while the IEEE has started to define different categories, levels and formats for immersive video, as well as the functions and interactions enabled by these formats. 3GPP is investigating VR for possible use with 5G and looking at VR standardization for wireless mobile services, while considering delivery of VR video content on both current 3G/4G/LTE and 5G systems. On top of that, the Video Quality Experts Group (VQEG) has created an Immersive Media Group (IMG) whose mission is to carry out a "quality assessment of immersive media, including virtual reality, augmented reality, stereoscopic 3DTV, multiview". Finally, the Khronos Group has announced a VR standards initiative that resulted in OpenXR (Cross-Platform, Portable, Virtual Reality), defining APIs for VR and AR applications.
No wonder service providers and broadcasters are feeling somewhat queasy themselves. Surprisingly, the EBU's report failed to compare the situation with the closely related but more tightly confined area of Ultra HD, which was also once likened to the wild west but where some order has now emerged from the chaos, with growing alignment between the Ultra HD Forum, working on the infrastructure, and the UHD Alliance, concerned with content and the viewing devices at either end.
The VR world similarly needs two bodies or preferably just one to take charge as the overall arbiter of VR development, working in turn with all the others to unify the standards, while presenting a more coherent and united front to video service providers.