Psycho-physical aspects of VR need to be further evaluated, according to the EBU (photo by Andrew Robles on Unsplash).
The EBU (European Broadcasting Union) has urged the whole video service industry to unite behind Virtual Reality and avoid the mistakes made with previous attempts to deliver new experiences for users, such as 3D.
On the one hand, broadcasters and pay-TV operators need to respond to the challenge posed by the big Internet players and their heavy investment in VR; on the other, they need to balance costs and challenges against opportunities. The EBU has distilled the views and VR deployment plans of its member broadcasters in a just-published report, Virtual Reality: How are public broadcasters using it?
The report was presented at Observatorio VR in Málaga on 21 July, with speakers from some of the Internet players including Google, Samsung, Nokia, Sony PlayStation and HTC Vive. The EBU noted, for example, how Facebook has acquired Oculus, while Google has invested significantly in its Daydream and Jump projects, and Samsung, Sony and Nokia are also spending heavily. At the same time, broadcasters have been weighing in with trials and some content already available.
The greatest early activity has involved 360-degree video, which can be seen as a stepping stone towards full VR: 49% of EBU members have made some surround content available, a further 9% are in trials and 19% are at least planning to start. When it comes to full VR, 19% of EBU members have currently made some content available, with a further 18% at the trial or planning stage. The third category surveyed by the EBU was Augmented Reality (AR), where 7% of EBU members had made some content available and 21% were at the trial or planning stage.
Of course, the EBU had to define these categories in order to sample opinion, and it struggled a bit over VR itself. It conceded that VR could simply be 360° video when the content is primarily video-based, though this was confusing given that 360-degree video was treated as a separate category in the survey. VR could also mean computer-generated (CG) content, when the content is primarily rendered in real time from a 3D model on the user's device. It could also be a combination of both, incorporating panoramic 2D or stereoscopic images viewed on head-mounted displays.
The EBU was clearer on AR, which it defined as an overlay of CG content on images of the real world, but with no interaction between them: the real-world content and the CG content are not able to respond to each other, as the EBU put it. Then, just to muddy the waters, the EBU discussed Mixed Reality (MR), also an overlay of synthetic content on the real world, but in this case fully integrated so that the CG and real-world images can interact.
One of the biggest challenges for VR is reducing the “motion to photon” latency to 20 ms or less to avoid causing nausea (photo by Samuel Zeller on Unsplash).
There is also the idea of immersive VR, which is surely what VR should be about anyway: creating the illusion of being present in the scene. This involves audio as well, which the EBU regards as just as important for the experience as video. The report argued that emerging object-based audio, which links sounds with visual objects in the scene such as cars or helicopters, is merely a step towards realistic 3D, positional, surround audio. Such VR audio should apply spatialization to present sounds from any direction and give users cues about where to look next, through a natural listening experience. For this to work properly, the ability to track the user's head orientation and position will also be crucial.
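The head-tracking idea can be illustrated with a deliberately minimal sketch. Real VR audio engines use HRTF convolution and full 3D positions rather than the simple constant-power stereo pan shown here, and the function below is hypothetical, but the principle the report describes is the same: the perceived direction of a sound is the source direction relative to where the head is pointing.

```python
import math

def stereo_gains(source_azimuth_deg, head_yaw_deg):
    """Hypothetical constant-power pan of a source relative to head yaw.

    Real spatial audio uses HRTF filtering per ear; this only shows how
    head tracking feeds the direction calculation.
    """
    # Angle of the source relative to where the head is currently pointing,
    # normalized to [-180, +180) degrees
    relative = (source_azimuth_deg - head_yaw_deg + 180) % 360 - 180
    # Map [-90, +90] degrees onto a constant-power pan curve
    pan = max(-1.0, min(1.0, relative / 90.0))   # -1 = hard left, +1 = hard right
    theta = (pan + 1.0) * math.pi / 4.0          # 0 .. pi/2
    return math.cos(theta), math.sin(theta)      # (left gain, right gain)

# A helicopter straight ahead; the listener turns their head 90 degrees
# to the right, so the sound should now come mostly from the left ear.
left, right = stereo_gains(source_azimuth_deg=0, head_yaw_deg=90)
```

Running the update on every tracking sample is what keeps sounds anchored to the scene rather than to the listener's head.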
This is a way off, but meanwhile broadcasters face various challenges in implementing the VR technology available today: lack of skills, tools and vision; necessarily short viewing times; no established workflows; technical quality that is still not good enough; insufficient distribution network capacity; and unknown return on investment (ROI).
One of the biggest specific technical challenges for VR lies in reducing the "motion to photon" latency, the amount of time between an input movement, such as a head turn, and the screen being updated in response. Research has shown that a value below 20 ms is needed to produce a realistic experience and avoid making users feel sick. Even at a frame rate of 60 Hz the display is updated only every 16.7 ms, which consumes most of that budget, so the other processing steps required to update the display would need to be excessively optimized. The frame rate will probably have to be at least 90 Hz.
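The arithmetic behind that conclusion is simple enough to check directly. The snippet below computes the frame interval at a few refresh rates and the headroom left within a 20 ms motion-to-photon budget for everything else in the pipeline (tracking, rendering, scan-out); the constant and function names are illustrative, not from any specific VR SDK.

```python
# Rough motion-to-photon budget check: at a given display refresh rate,
# how much of the 20 ms latency budget remains for tracking, rendering
# and scan-out after one frame interval is spent waiting for the display?
MOTION_TO_PHOTON_BUDGET_MS = 20.0

def remaining_budget_ms(refresh_hz):
    frame_interval_ms = 1000.0 / refresh_hz
    return MOTION_TO_PHOTON_BUDGET_MS - frame_interval_ms

for hz in (60, 90, 120):
    print(f"{hz:>3} Hz: frame interval {1000.0 / hz:5.1f} ms, "
          f"headroom {remaining_budget_ms(hz):5.1f} ms")
```

At 60 Hz the frame interval alone is about 16.7 ms, leaving barely 3 ms for everything else; at 90 Hz the interval drops to about 11.1 ms, which is why the report points to 90 Hz as a likely minimum.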
Given these challenges, the EBU advocated cautious investment to avoid over-commitment at this early stage, where the standardization scene is still like the wild west. All the usual bodies have piled in, including the DVB, W3C with WebVR, and ITU-R with Advanced Immersive Audio Visual (AIAV) Systems for Programme Production and Exchange. There is also MPEG, with the Omnidirectional Media Application Format (OMAF) standard as well as Media Orchestration (MORE) for video stitching and encoding, and JPEG, which is developing JPEG XT (omnidirectional photographs), JPEG XS (low-latency compression formats for VR) and JPEG PLENO (a light field video format). DASH-IF is planning tests and trials of VR delivery using DASH technology, while the IEEE has started to define categories, levels and formats for immersive video, as well as the functions and interactions enabled by these formats. 3GPP is investigating VR for possible use with 5G and looking at VR standardization for wireless mobile services, while considering delivery of VR video content on both current 3G/4G/LTE and 5G systems. On top of that, the Video Quality Experts Group (VQEG) has created an Immersive Media Group (IMG) whose mission is to carry out a "quality assessment of immersive media, including virtual reality, augmented reality, stereoscopic 3DTV, multiview". Finally, the Khronos Group has announced a VR standards initiative that resulted in OpenXR (Cross-Platform, Portable, Virtual Reality), defining APIs for VR and AR applications.
No wonder service providers and broadcasters are feeling somewhat queasy themselves. Surprisingly, the EBU's report failed to compare the situation with the closely related but more tightly confined area of Ultra HD, which was also once likened to the wild west but where some order has since emerged from the chaos, with growing alignment between the Ultra HD Forum, working on the infrastructure, and the UHD Alliance, concerned with content and the viewing devices at either end.
The VR world similarly needs two bodies or preferably just one to take charge as the overall arbiter of VR development, working in turn with all the others to unify the standards, while presenting a more coherent and united front to video service providers.