Essential Guide: Immersive Audio Pt 3 - Immersive Audio Objects

July 24th 2019 - 01:00 PM
Paul MacDonald, Writer, Professional Broadcast Audio

Immersive audio transforms the listening environment, delivering a mesmerizing and captivating experience across a wide range of audiences and genres.

Part 3 in our Immersive Audio series examines object audio and the core technology that empowers producers and sound engineers to deliver compelling auditory experiences.

We start by defining the key differences between traditional channel-based mixing and object programming. Spatially defined objects must be described with metadata to fully create the immersive experience, so understanding how that metadata is recorded and expressed is essential. We investigate the methods available for doing so.
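
As an illustration of what such spatial metadata can look like, here is a minimal sketch in Python. The field names and values are hypothetical and deliberately simplified; real-world productions carry this information in formal models such as the Audio Definition Model (ITU-R BS.2076), but the principle is the same: each audio object travels with data describing where, when and how loudly it should be rendered, leaving the final channel layout to the renderer.

from dataclasses import dataclass

@dataclass
class AudioObjectMetadata:
    # Hypothetical, simplified fields for illustration only
    object_id: str        # the mono or stereo stem this metadata describes
    azimuth_deg: float    # horizontal angle relative to the listener
    elevation_deg: float  # vertical angle relative to the listener
    distance: float       # normalized distance from the listener (0.0 to 1.0)
    gain_db: float        # level adjustment applied at render time
    start_time_s: float   # when this metadata block becomes active
    duration_s: float     # how long it remains valid before the next update

# Example: a fly-over described as a sequence of metadata updates rather than
# being baked into fixed loudspeaker channels.
flyover = [
    AudioObjectMetadata("helicopter", -90.0, 30.0, 1.0, -6.0, 0.0, 2.0),
    AudioObjectMetadata("helicopter",   0.0, 45.0, 0.5, -3.0, 2.0, 2.0),
    AudioObjectMetadata("helicopter",  90.0, 30.0, 1.0, -6.0, 4.0, 2.0),
]

for block in flyover:
    print(f"{block.object_id}: az={block.azimuth_deg} deg, "
          f"el={block.elevation_deg} deg, gain={block.gain_db} dB "
          f"at t={block.start_time_s}s")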

Using the blockbuster film Gravity as an example, we analyze the object-based methods used within the production context. Listener orientation is a key component in creating an effective mix, and its psychological impact is considered further.

In their case study, Sennheiser provide an outstanding description of how to deliver location recording and mixing for production. They discuss the specialist microphones needed to truly enhance the immersive experience by capturing the best possible object audio.

With an in-depth description of object reproduction, Genelec Senior Technologist Thomas Lund uncovers the best strategies and requirements for loudspeaker placement to deliver accurate immersive audio. He digs deep into the standards and answers the age-old question “can I monitor using headphones?”

Lawo’s Christian Scheck discusses the functions available for immersive audio production. He looks at advances in technology and what we should expect in the future. Scheck goes on to discuss new approaches to the user interface and how object monitoring solutions are being designed to deliver the best immersive sound possible.

This Essential Guide, part 3 of the series, continues our journey through immersive audio and object sound, and its applications in broadcast television.

Download this Essential Guide now to better understand immersive audio and object sound.

