The Sponsors Perspective: Metaverse - The Next Chapter Of Broadcast

The broadcast industry has been evolving towards greater immersion since its inception. As technology advances, so does the ability of broadcasters and content creators to bring audiences inside their stories.


This article was first published as part of Essential Guide: Broadcast And The Metaverse.

Today’s immersive technologies and methodologies span the entire content pipeline from creation to delivery to consumption. They include 360-video capture, volumetric video capture, and spatial audio in content creation; augmented reality (AR) and virtual reality (VR) headsets for extended reality (XR) content distribution; and speech AI and data-driven visualization for interactivity during content consumption.

These technologies combine to add another layer to the viewing experience, taking broadcast closer to the metaverse - the 3D Internet. The next step in this evolution is a shared virtual world, and that is where audiences are headed; we are already seeing examples in other parts of the media and entertainment industry, from gaming to advertising to film.

Technologies That Enable The Metaverse

Delivering experiences to audiences in a 3D Internet requires using several technologies, many of which are already in use by media companies today.

Accelerated Computing

At its core is accelerated computing to process vast quantities of data generated in creating, delivering and consuming the virtual experience.

The Dell Precision 7865 workstation combined with the NVIDIA RTX 6000 Ada Generation GPU delivers impressive computing capabilities ideal for content creation and dissemination in the metaverse. This computational powerhouse boasts up to 64 cores of processing power via its AMD Ryzen Threadripper PRO CPU and can be equipped with up to two RTX 6000 professional graphics cards. This provides the computational resources to render high-quality, detailed 3D graphics, to train and deploy AI models, and to encode video for multiple platforms, including head-mounted devices. The Precision 7865 comfortably meets the stringent performance requirements of these workflows.

Content creators in this space need workstations that can handle vast amounts of data and provide the processing power required to create immersive virtual environments, avatars, and interactive experiences. The Precision 7865 workstation equipped with the RTX 6000 is more than up to the task. Its fast processing and rendering capabilities enable efficient content delivery, ensuring that users have a seamless and immersive experience.

Extended Reality (XR)

The Precision 7865 workstation supports a full range of professional NVIDIA graphics cards, making it capable of running high-resolution VR software and headsets. To provide customers with the best experience, Dell offers the Ready for VR program, ensuring customers choose a workstation and professional graphics card capable of driving superior VR experiences.

CloudXR, NVIDIA’s streaming technology, delivers VR and AR across 5G and Wi-Fi networks. Built on NVIDIA RTX technology, CloudXR is fully scalable for data center and edge networks.

With the CloudXR SDK, extended reality content from OpenVR applications can be streamed to Android and Windows devices, dynamically adjusting to network conditions for maximum image quality and frame rates. This frees users from traditional VR and AR confines, streaming complex experiences from remote servers across 5G and Wi-Fi networks to any device, wirelessly.
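The idea behind that dynamic adjustment can be illustrated with a short sketch. This is not the CloudXR API - all names below are hypothetical - but it shows the basic adaptive-bitrate logic a streamer applies when network throughput fluctuates: target a fraction of the measured bandwidth, clamped to a usable range.

```python
# Conceptual sketch (NOT the CloudXR SDK): adapting encode bitrate to
# measured network throughput, as adaptive XR streaming does under the
# hood. Function name and parameters are illustrative assumptions.

def choose_bitrate_mbps(measured_throughput_mbps: float,
                        headroom: float = 0.8,
                        floor: float = 10.0,
                        ceiling: float = 100.0) -> float:
    """Target a fraction of measured throughput, clamped to a usable range.

    headroom < 1.0 leaves margin for throughput jitter; floor keeps the
    stream decodable, ceiling caps encoder load on the server.
    """
    target = measured_throughput_mbps * headroom
    return max(floor, min(ceiling, target))

# Example: a congested Wi-Fi link measuring 40 Mb/s
print(choose_bitrate_mbps(40.0))   # 32.0 -> stream at 32 Mb/s
print(choose_bitrate_mbps(5.0))    # 10.0 -> clamped up to the floor
print(choose_bitrate_mbps(200.0))  # 100.0 -> clamped down to the ceiling
```

In a real streamer this decision runs continuously against live network telemetry, trading image quality against latency frame by frame.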

Platform For Metaverse Application

NVIDIA Omniverse Enterprise is a scalable, end-to-end platform enabling enterprises to build and operate metaverse applications. Omniverse is a real-time, large-scale virtual-world simulation engine and a computing platform that enables 3D designers and teams to better connect and build custom 3D content creation pipelines.

Omniverse unlocks the entire scope of today’s 3D workflows – touching every industry, whether for building 3D assets and worlds or operating digital twins. Omniverse Enterprise is open and interoperable – built on Universal Scene Description (USD), an open 3D framework and the foundation of 3D worlds, and on MDL (Material Definition Language). It is easily extensible and customizable – customers can inspect, tweak, customize, and build upon the apps and extensions, which are offered as source.
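To make USD concrete: it is a human-readable (and binary) scene-description format that tools exchange. A minimal illustrative `.usda` file - this exact scene is an example of ours, not from the article - looks like this:

```usda
#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 0.5
        color3f[] primvars:displayColor = [(0.2, 0.5, 0.8)]
    }
}
```

Because every USD-aware application can read and layer files like this, teams in different tools can work on the same virtual world without lossy import/export steps - which is what makes USD a practical foundation for shared 3D pipelines.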

It is scalable from the workstation, including the Precision 7865, to the data center, to the cloud – and Omniverse performance scales with the compute you give it. The platform can be deployed across hybrid infrastructure and will soon be accessible from anywhere.

Avatars

Intelligent, lifelike avatars are a critical component of the broadcast metaverse, heightening audience engagement in this new environment. Examples of interactive avatars in broadcast include virtual news anchors, commentators, and meteorologists delivering the news from a virtual studio; virtual hosts guiding participants through an event; and virtual celebrities and television personalities interacting with fans about what they just watched.

NVIDIA Omniverse Avatar Cloud Engine (ACE) offers a fast and versatile solution for creating interactive avatars and digital human applications at scale. Broadcasters can leverage ACE to animate 2D or 3D characters and to give them the ability to speak and interact with users. The ACE end-to-end avatar development suite enables seamless integration and deployment, allowing broadcasters to build, configure, and deploy avatar applications across any engine in any public or private cloud.

Artificial Intelligence

AI is one of the cornerstones of the broadcast metaverse. AI is woven into many of the applications mentioned above, but it deserves a specific call out. Speech AI, computer vision, recommendation engines, and more come together to accelerate content production, increase the accessibility of content as it gets distributed, and make content more personalized in its consumption. These technologies will allow audiences to navigate through virtual environments, interact with avatars, request the content they want to see or ask for recommendations, and more.

Cindy Olivo - Global Media and Entertainment Marketing Manager - Dell Technologies (left) and Sepi Motamedi - Global Broadcast Industry Marketing and Strategy - NVIDIA (right).


The Time To Build Towards The Metaverse Is Now

With its immersive and interactive nature, the metaverse represents the next chapter for content delivery and consumption in broadcast. With the increasing popularity of virtual and augmented reality and interactive visualizations, viewers are looking for immersive experiences beyond traditional TV and streaming. By creating virtual worlds and interactive avatars, broadcasters can further engage existing audiences, expand their reach to new demographics, and create new revenue streams.

As the metaverse becomes more accessible, it can revolutionize entertainment. Broadcasters must be at the forefront of this change to stay in step with audiences in the ever-evolving media landscape.
