Genelec Monitors Power Research At Huddersfield’s Applied Psychoacoustics Lab

The Applied Psychoacoustics Lab (APL) at the UK’s University of Huddersfield is an experimental hub created to advance our knowledge of the mechanisms of human auditory perception and to provide perceptually motivated solutions to audio engineering problems. Opened in 2013, its critical listening room has always relied on Genelec, and a recent upgrade to Genelec’s ‘The Ones’ Smart Active Monitors is supporting further advances in research.

“Recently we've been focusing on virtual acoustics for extended reality applications,” explains Prof Hyunkook Lee, Founder/Director of the APL. “We've worked on a project that developed a six degrees of freedom audio augmented reality processing engine, which led us to develop binaural renderers, such as Virtuoso, which we just released.

“We're also conducting lots of experiments using VR headsets and also display systems to look into the interaction between audio cues and visual cues,” he continues. “That's highly relevant for creating immersive experiences. It's not just audio that gives you an immersive audio experience – because we see things in real life. We're investigating how we perceive the immersive experience, how we can enhance it while we're watching films or listening to music, and what kind of perceptual parameters actually provide this kind of experience.”

For the last decade, 24 Genelec 8040 monitors combined with a pair of 7070 subwoofers have been used to reproduce audio in APL’s critical listening room. However, a recent upgrade has seen 15 of the 8040s replaced with 8341 three-way coaxial monitors from The Ones series.

“There were two reasons basically,” recalls Lee, discussing the decision for the upgrade. “The Ones provide excellent tonal consistency wherever you sit in the room, which is very important when you have a lot of people in this space. When we hosted a recent AES International Conference on Spatial and Immersive Audio, we had 21 people in this room. And wherever they sat, they had an excellent experience. The tonal balance was very consistent across the room, which was very important for this kind of demonstration situation.

“The second reason was for our research,” he continues. “We needed coaxial monitors because when you do localisation tests, the acoustic centre position is always important. With the 8040s, you have to take the average between the tweeter and woofer. But now with The Ones series, we know exactly where the acoustic centre is.”

The new setup allows APL to create a Dolby Atmos 9.1.6 space, while the remaining 8040s mean it can be expanded to cover higher channel-count formats such as NHK’s 22.2 standard. In the expanded configuration, nine of the 8040s are deployed in the floor, height and rear centre positions.

A further advantage APL has found since upgrading is the speed and simplicity of room calibration and setup switching offered by Genelec’s GLM software. “We can tune the whole room with the 9.1.6 system in less than five minutes and that was a big factor,” says Lee. “GLM makes a huge difference, especially with immersive audio. Of course, you get a very significant difference with stereo as well. But with a 9.1.6 system with so many monitors working together, the fact that we can actually tune the entire system to the room is a great advantage.”

With the new system in place, APL is continuing its efforts to help improve our understanding of immersive audio environments. “Recently, we've been focusing on binaural audio for virtual monitoring and extended reality applications,” explains Lee. “And my current research focuses on what kind of roles audio plays in providing an immersive experience. And for that, it's all about understanding what content producers really think about immersive audio, and what kind of experience users expect from these immersive systems. We need to understand each other and try to narrow the gap and work together in a collaborative environment. Composers, producers, engineers, researchers and developers all need to get together to discuss what really makes spatial audio truly immersive.”
