Immersive audio transforms the listening environment, delivering a mesmerizing and captivating experience across a wide range of audiences and genres.
Part 3 in our Immersive Audio series examines object audio and the core technology that empowers producers and sound engineers to deliver compelling auditory experiences.
We start by defining the key differences between traditional channel-based mixing and object-based programming. Spatially defined objects must be described with metadata to create a fully immersive experience, so we examine how that metadata is recorded and expressed, and investigate the methods available for doing so.
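To make the idea of object metadata concrete, here is a minimal sketch in Python. The field names and scene contents are illustrative assumptions, not taken from the guide; real object-based systems such as the Audio Definition Model (ITU-R BS.2076) carry comparable per-object parameters (azimuth, elevation, distance, gain), typically varying over time.

```python
from dataclasses import dataclass

@dataclass
class AudioObject:
    """One spatially defined audio object with positional metadata.

    Illustrative fields only; production formats define these
    (and their time variation) in far more detail.
    """
    name: str
    azimuth_deg: float    # horizontal angle: 0 = front, positive = left
    elevation_deg: float  # vertical angle: 0 = ear level
    distance: float       # normalized distance from the listener
    gain_db: float        # object level relative to unity

# A hypothetical scene: dialogue locked front-center,
# a helicopter effect up and to the right.
scene = [
    AudioObject("dialogue", azimuth_deg=0.0, elevation_deg=0.0,
                distance=1.0, gain_db=0.0),
    AudioObject("helicopter", azimuth_deg=-45.0, elevation_deg=60.0,
                distance=0.8, gain_db=-6.0),
]

for obj in scene:
    print(f"{obj.name}: az={obj.azimuth_deg}, el={obj.elevation_deg}")
```

Because position is carried as metadata rather than baked into channels, the renderer at the playback end can place each object correctly for whatever loudspeaker layout (or headphone virtualization) is actually present.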
Using the blockbuster film Gravity as an example, we analyze the object-based methods used in its production. Listener orientation is a key component of an effective mix, and we also consider its psychological impact.
In their case study, Sennheiser provide an outstanding description of how to deliver location recording and mixing for production. They discuss the specialist microphones needed to truly enhance the immersive experience by recording optimal object audio.
With an in-depth description of object reproduction, Genelec Senior Technologist Thomas Lund uncovers the best strategies and requirements for loudspeaker placement to deliver accurate immersive audio. He digs deep into the standards and answers the age-old question: “Can I monitor using headphones?”
Lawo’s Christian Scheck discusses the functions available for immersive audio production. He looks at advances in technology and what we should expect in the future, then turns to new approaches to user interface design and how object monitoring solutions are being engineered to deliver the best immersive sound possible.
This Essential Guide, part 3 of the series, continues our journey through immersive audio and object sound, and its applications in broadcast television.
Download this Essential Guide now to better understand immersive audio and object sound.