Calrec’s ImPulse is an audio processing and routing platform that includes native IP connectivity to support AoIP networking.
Veteran audio console maker Calrec Audio has been busy developing new “IT-friendly” products that it says ease the way for professionals looking to implement remotely controlled live productions for live sports and entertainment. The company released ImPulse, an audio processing and routing engine with support for AES67 and SMPTE ST 2110 audio networking over an IP connection, in the fall of 2018, and the system has continued to evolve ever since. Among its many features, ImPulse provides increased DSP power, allowing for the creation of immersive audio content in formats such as 5.1.4 and 7.1.4.
As Pete Walker, Senior Product Manager at Calrec explains it, the signal processing engine is fully compatible with existing Apollo and Artemis control surfaces (providing familiar control) and offers an upgrade path for existing Calrec customers as they transfer to IP infrastructures. With control connectivity via IP, surfaces can be physically remote and connected over standard networks using COTS hardware.
The Broadcast Bridge: What was the thinking behind ImPulse?
Walker: We recognised the demand to use open standards for the exchange of both audio and control data, along with increasing DSP requirements to produce next generation immersive content. ImPulse allows us to move to a more virtualized environment, supporting multiple productions simultaneously and allowing for processing to be deployed more efficiently when and where it is needed.
ImPulse also provides an upgrade path to AoIP (AES67/SMPTE ST 2110-30), with scalable and field-upgradable DSP and routing capacity, freedom of location to help power remote working scenarios, and immersive audio capabilities to support the changing demands of modern broadcasting. The additional processing overhead also allows us to develop new features to support this demand as we move forward into a new era.
The Broadcast Bridge: How does it work in terms of IP upgrade path?
Walker: Taking the IP upgrade path first, central to the design of ImPulse was the need to embrace open IP standards and COTS IP hardware. Our Hydra2 networking technology is very successful at what it does: removing the hard ties between control room and studio, allowing audio to be shared flexibly, eliminating rigging and teardown time, and enabling easy plug-and-play operation. It provides lots of flexibility and has enhanced many workflows as a result.
However, broadcasters want shared networks that allow for the exchange of audio and video more directly, between devices from different vendors, to streamline workflows and reduce cabling/connectivity formats. Hydra2, being proprietary, means it cannot share audio directly with non-Calrec devices. It also does not pass over standard IT switches; consolidating connections to stage-boxes requires relatively expensive Calrec routers so customers tend to run quite a lot of dedicated fibre on larger systems.
To reduce the amount of fibre being run for passing proprietary data, many broadcasters leverage CWDM technology, multiplexing signals by using a different wavelength of light for each so they can travel down the same piece of fibre. This alleviates some of the installation overhead and costs, but it still requires format conversion and interfacing to pass media between kit from different vendors. The use of open IP standards for transporting media massively reduces the cost and physical space required for the multitude of connectivity formats that are in use for audio and video today.
Ideally, customers want to be able to use the office LAN and even public internet and Wi-Fi, so that devices can be conveniently connected from anywhere. This will come, but critical live broadcast demands low latency, which requires a well-managed network configured for ST 2110/AES67 use to safeguard content while allowing for flexible workflows for day-to-day operational needs. Video presents a huge amount of data compared with audio, but that data only needs refreshing 60 times per second for high quality, whereas each audio channel needs to be processed at least 48,000 times per second to present a quality output.
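To put rough numbers on that comparison, the back-of-envelope sketch below contrasts the bit rate and refresh rate of an uncompressed video feed with a single audio channel. The figures (1080p60 video with 10-bit 4:2:2 sampling, 48 kHz/24-bit audio) are illustrative assumptions on my part, not Calrec specifications:

```python
# Back-of-envelope comparison of video vs audio data rates
# (illustrative figures, not from Calrec).

def video_bitrate_bps(width, height, fps, bits_per_pixel):
    """Uncompressed video bit rate in bits per second."""
    return width * height * fps * bits_per_pixel

def audio_bitrate_bps(sample_rate, bit_depth):
    """Uncompressed bit rate for one audio channel."""
    return sample_rate * bit_depth

# 1080p60 with 10-bit 4:2:2 sampling averages 20 bits per pixel
video = video_bitrate_bps(1920, 1080, 60, 20)
# One 48 kHz / 24-bit audio channel
audio = audio_bitrate_bps(48_000, 24)

print(f"video: {video / 1e9:.2f} Gbit/s, refreshed 60 times/s")
print(f"audio: {audio / 1e6:.3f} Mbit/s, refreshed 48,000 times/s")
```

The video stream carries vastly more data overall, but audio is touched sample by sample, 48,000 times per second, which is why low, deterministic network latency matters so much for live audio mixing.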
Calrec’s Bluefin2 platform, in conjunction with Hydra2 audio networking, has served broadcasters all over the world. ImPulse contains the next iteration of DSP, known as “Bluefin3,” which is modular and scalable, allowing users to expand the system when they need to.
A key point of ImPulse is that it can be implemented in stages. Broadcasters do not need to jump into IP with both feet overnight. Many of our customers have made a significant investment in Hydra2-based Calrec hardware over the last decade or so. Open standards prevent customers from having their purchasing decisions forced by their previous investment, allowing for choice in each control room while maintaining facility-wide connectivity, but we do not want to write off existing investment in hardware.
As a broadcast vendor, it is very important that we both maintain existing systems and support our customers to migrate at a speed at which they feel comfortable.
The Broadcast Bridge: What about the increased DSP aspect? Why is it so powerful?
Walker: Once a show starts rehearsals, production ambitions usually increase demand on resources. One of the things Calrec users like is never having to worry about running out of DSP; a Hydra2-based Apollo provides 1,020 channels for audio inputs alone, which is hugely powerful, though we do have a small number of customers who are close to maxing this out.
With support for immersive audio and its wider path widths, we wanted to increase the number of both input channels and buses on ImPulse to allow productions to move to next generation audio without compromising the number of paths they can process.
ImPulse is also designed to support multiple fully independent productions simultaneously. It can run up to four separate mixers, each with up to 1,122 input channels and 336 buses, for a total of 4,488 channels of input processing and over 1,300 bus channels in one ImPulse core. But it’s equally capable of running just a single mixer down to the scale of an Artemis Light. DSP can be upgraded when it’s needed through a combination of hardware modules and software licenses.
The Broadcast Bridge: What about remote working?
Walker: ImPulse fits into this model. It’s not only about 2110/AES67 AoIP, it’s about the whole infrastructure. Obviously, there’s the connection between the control surface and the processing core. Control can be provided from a web UI, but there’s also connectivity with studio management/orchestration systems and remote control from other vendors’ systems. By ensuring that ImPulse has enough data ports and is very configurable, using IT-friendly open standards, it can be located anywhere.
We have customers interested in installing ImPulse on trucks to enable remote control from their broadcast centers, allowing for low-latency monitor mixing at venues using remote production. Conversely, we have customers interested in just having control surfaces in trucks, connecting to processing engines in their facilities, to reduce the cost and weight of vehicles as well as the engineering overheads at venues. Larger broadcasters with multiple facilities want to consolidate and centralize their equipment rooms, servicing productions across the country. Being IT-friendly, ImPulse allows for all of these scenarios as well as the traditional single mixer on a truck or in a control room.
There’s no one single way to structure remote working and we are working across our complete product range to ensure that it’s optimized to handle these situations. The more flexible we can be in terms of where kit is located and how it’s connected, the better it is for our customers. We don’t want any barriers to changing ambitions and requirements. People are used to connecting with anyone from anywhere and broadcast applications should be no different.
The Broadcast Bridge: How does immersive audio affect the design process?
Walker: Over the last couple of years, our users have started migrating to next generation audio and producing Dolby Atmos—among others—by adding channels to each path to add height legs, as well as adding objects to their mix. That’s quite a lot of extra DSP being used, and we need to make sure that we provide enough so there’s no compromise. We’ve added height legs and height panning to provide native immersive input channels, buses, monitoring and metering. People can produce immersive content on regular stereo or 5.1 mixers, but on ImPulse we make it much easier. As well as mono, stereo and 5.1 paths, we now also offer 5.1.2, 5.1.4, 7.1, 7.1.2, and 7.1.4 paths as well as 0.0.2 & 0.0.4 height only paths. All of these formats can coexist and be routed to/from each other with comprehensive up and down-mix parameters.
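The X.Y.Z naming convention above encodes the path width directly: X ear-level channels, Y LFE channels, Z height channels. A minimal sketch of how the total channel count per path follows from the format name (the helper function is my own illustration, not a Calrec API):

```python
# Total channel count ("path width") for each immersive format
# named above. X.Y.Z = ear-level channels . LFE . height channels.

def path_width(fmt: str) -> int:
    """Sum the parts of a speaker-layout string like '5.1.4'."""
    return sum(int(part) for part in fmt.split("."))

formats = ["5.1", "5.1.2", "5.1.4", "7.1", "7.1.2", "7.1.4", "0.0.2", "0.0.4"]
for fmt in formats:
    print(f"{fmt}: {path_width(fmt)} channels")
```

This makes concrete why immersive paths consume more DSP: a 7.1.4 path is twelve channels of processing where a stereo path is two.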
The Broadcast Bridge: How does flexible additional DSP power help customers?
Walker: Customers don’t like to have to define what their maximum usage is going to be. For example, customers might buy a big Apollo, but they really only need that size for one-off major events. They have to order a console to cater to the largest jobs they will do. These are likely to only come along rarely, with day-to-day operations often under-utilizing the system. With an ImPulse core, customers can run up to four independent mixers and each can be a different size, allowing customers to choose which to use for a given production. They can flexibly deploy their DSP where it is needed on a case-by-case basis, rather than being limited to choosing a whole control room/truck for a job. Additionally, if they do run out of processing capabilities on a given mixer, they can quickly unlock more through licensing.
ImPulse is a part of our continued move into a form of virtualization; customers can attach up to four surfaces onto a single ImPulse core, each controlling fully independent mixers within. Customers don’t have to buy a processing core for each control surface that they buy. This provides cost savings, reduces the amount of idle downtime and helps customers deploy DSP power where they need it.
ImPulse can handle a huge number of input channels: up to 4,488, well over four times the capability of a Hydra2-based Apollo. It can run one to four mixers, and each mixer can scale from 256 up to 1,122 input channels.
COTS IP connectivity is a key part of the ImPulse engine, providing flexibility and more freedom in geographic location.
The Broadcast Bridge: What about network control data?
Walker: IP data is not only about standardizing the transport of audio, it’s about standardizing network control data as well. Extending the link between a surface and a core was possible with Hydra2, but it wasn’t standardized. ImPulse is designed from the ground up to use open, standardized IP. It makes it possible to do things like have a surface in New York with the processing core in LA.
The direction in which a lot of our customers are moving is to operate remote equipment rooms, which are turning into server centers. New York is a good example of where real estate is very expensive and they don’t want—or now need—to have equipment rooms on-site there. They can either increase studio space for the same cost or downsize the space they use by moving their equipment rooms to remote locations. It also makes it easier to manage their equipment by centralizing engineering staff. The same technologies also allow our own engineers here at Calrec to better support our customers. We do not need to be physically on-site to change configuration or diagnose problems.
Maybe a customer has a surface in New York that is used for the news from 7-9 a.m. Then once that’s finished, it sits idle for some time. Instead, ImPulse can allow for its DSP to be used by another surface for a production elsewhere in the country. It is designed to allow customers to make their investments more efficient, deploying their processing power when and where it is needed.