Virtual Hype Became Production Reality In 2023

Michael Grotticelli shares his personal perspective on the technology and business trends that have defined 2023 and some observations on where we might be headed into 2024.

Looking back at 2023, the broadcast industry navigated choppy technical and budgetary waters: the landscape for content continued to change while demand for new hardware-based production systems declined. Meanwhile, technologies like AI, virtual reality, remote production and virtual production continued to thrive as part of advanced production workflows, even as the new NextGen TV standard continued its slow rollout to a public mostly unconvinced of its value.

Backing this up, market research firm Devoncroft Partners estimates that the total market value for broadcast and media technology products and services declined by 4% in 2023, to approximately $61.5 billion. The reduced spending reflects production companies making do with the technology they have in place while slowly moving to IP infrastructures that bring flexibility and cost-effectiveness to traditional production workflows.

In addition to improvements in real-time, data-driven graphics (AR and VR) and meta-driven workflows, 2023 will go down as the year artificial intelligence (AI) was universally commercialized and embraced by the industry on many levels, while production workflows were energized by decentralized infrastructures and collaborative access from anywhere in the world.

Artificial Intelligence

AI has been an extremely fashionable topic over the past year, driving innovation and pushing the boundaries of video production technology and advanced workflows. Indeed, generative AI is now being used in newsrooms to draft text and reports while reducing the time spent creating them. Human editors still play an active role in the publication process and must review every piece of content to ensure the integrity and authenticity of the news.

Generative AI tools like ChatGPT have become widely available, with roughly half of all newsrooms worldwide already using them, according to a survey by the World Association of News Publishers (WAN-IFRA).

A recent industry survey found that nearly half of all newsrooms in the world now use some type of AI tool.


Auto-generated, real-time metadata in sports broadcasting is becoming standard practice, enabling broadcasters to extract significantly more information from their sports content than manual tagging allows. Metadata tagging is crucial to content discoverability, user recommendations and cataloguing, as this data links a particular event to a clip stored within a vast content library.

For example, during a football match telecast, AI engines can identify scenes within the match, such as goals, fouls, and celebrations. They analyze the keywords present, such as player names, team names, and stadium names. They can also identify brands visible on players’ jerseys, logos displayed on stadium billboards, famous faces in the crowd, and even product mentions during interviews or sponsorships.

This data is then tagged into the clips as metadata. If it weren’t for AI, these data points would be lost in the vast content library and the discoverability of these clips would be limited.
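As a rough illustration of this kind of workflow, the sketch below shows AI-detected events being tagged onto clips as metadata and then searched. The event names, clip IDs and data structures are hypothetical stand-ins, not any vendor's actual API.

```python
# Minimal sketch of AI-driven metadata tagging for sports clips.
# The "detections" here are hard-coded stand-ins for the output of
# an AI engine; all names and IDs are hypothetical.

clips = {}  # clip_id -> set of metadata tags


def tag_clip(clip_id, detections):
    """Attach detected events, players and brands to a clip as metadata."""
    clips.setdefault(clip_id, set()).update(detections)


def find_clips(keyword):
    """Return the IDs of clips whose metadata contains the keyword."""
    return sorted(cid for cid, tags in clips.items() if keyword in tags)


# Simulated detections from one match telecast.
tag_clip("match42_clip001", {"goal", "player:Smith", "team:Rovers"})
tag_clip("match42_clip002", {"foul", "player:Jones", "brand:Acme"})
tag_clip("match42_clip003", {"goal", "celebration", "player:Smith"})

print(find_clips("goal"))          # clips containing goals
print(find_clips("player:Smith"))  # clips featuring one player
```

Without the tags, a goal from a given match would be indistinguishable from any other clip in the library; with them, a simple keyword lookup surfaces it instantly.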

Large tech companies like Google, Microsoft and OpenAI are interested in the archives of media organizations as training data for their large language models. While granting access to their archives may benefit media organizations, many have expressed concerns over allowing free access to their MAM archives.

Another favored application of AI in a newsroom is the transcription of audio interviews into written content. Unlike manual transcription, which is time-consuming and prone to errors, AI-enabled transcription services present a faster and more accurate alternative. Leveraging AI-driven transcription services makes the content readily available for editing, publishing, or sharing across different platforms, thereby enhancing the newsroom’s ability to repurpose or reference past content.
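To illustrate one step of such a pipeline, the sketch below assembles timestamped segments, of the kind an AI transcription service might return, into publication-ready text grouped by speaker. The segment format and speaker names are assumptions for the example, not a real service's output.

```python
# Sketch: turning timestamped transcript segments into readable text.
# The segment dicts below are hypothetical stand-ins for the output
# of an AI transcription service.

def format_transcript(segments):
    """Group consecutive segments by speaker into readable paragraphs."""
    paragraphs = []  # list of (speaker, [texts])
    for seg in segments:
        if paragraphs and paragraphs[-1][0] == seg["speaker"]:
            paragraphs[-1][1].append(seg["text"])
        else:
            paragraphs.append((seg["speaker"], [seg["text"]]))
    return "\n\n".join(
        f'{speaker}: {" ".join(texts)}' for speaker, texts in paragraphs
    )


segments = [
    {"start": 0.0, "speaker": "Reporter", "text": "What changed this season?"},
    {"start": 2.4, "speaker": "Coach", "text": "We rebuilt the defense."},
    {"start": 4.1, "speaker": "Coach", "text": "It paid off immediately."},
]
print(format_transcript(segments))
```

Once the transcript is in this form, it can be edited, published or indexed alongside the rest of the newsroom's content.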

Expect to see AI making inroads into most aspects of production and distribution as the industry learns to harness the power and usability of the technology without fear of jobs being lost to it.

Remote Production

The emergence of decentralized operations has brought many efficiencies to cost-effective content production, and this area saw many advances in system configuration and productivity. A full move to IP infrastructure is required, but financial realities have seen studios and production companies making the transition in carefully thought-out phases.

Many production companies now operate with a hybrid SDI-IP backbone that features a number of gateways and conversion steps to move back and forth between the SDI and IP technical standards.


Production facilities are not quick to throw away existing hardware systems as they gravitate toward an IP future. As a result, many companies now operate a hybrid SDI-IP backbone with a number of gateways and conversion steps to move between the two technical standards, and hybrid operations will likely remain for several years to come.

At the IBC Show in Amsterdam this year, ARET, an Italian audio and video systems integrator, unveiled a new outside broadcast (OB) trailer for Alamiya Media that features both SDI and IP signal distribution. It’s loaded with Ross Video 12G-SDI hyperconverged video hardware and Lawo IP-based audio solutions in an Ultra HD, high dynamic range (HDR) video workflow at 12 Gbps with AES67 audio networking.

More studios and OB vans now coming online offer the flexibility to do both, in order to handle the widest range of content from a variety of sources. This trend will continue for the next several years.

Virtual Production

This year we also saw a rise in virtual production: TV shows and feature films shot in a studio outfitted with large LED screens on all sides (and even on the floor and ceiling). This approach has helped reduce production costs while facilitating exciting special effects and animated elements combined with live-action characters in ways never seen before.

ESPN’s Studio X includes nearly 38 million pixels of new LED display technology.


For example, ESPN, in Bristol, Conn., has redesigned its flagship “SportsCenter” studio (in celebration of 44 years on the air) and renamed it Studio X. It includes nearly 38 million pixels of LED display, including a massive 48-foot display (the largest on ESPN’s campus), along with a flexible anchor desk. To elevate the show’s production, ESPN turned to a depth monitor to bring dynamic virtual content into the space.

What this and many other virtual production studios around the world have in common is a desire for high-resolution images that make the set more immersive for the viewer.

Customers are looking for hyper-realism and, of course, flexibility for creating different content for a variety of applications. This means the tools must ensure not only a perfectly realistic background scene, but also the ability to include data-driven graphics, control of other hardware such as studio lights, and compatibility with broadcast workflows. These graphics are triggered by incoming data feeds for things like live election results, sports scores and even severe weather events.
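The data-driven triggering described above can be sketched very simply: events arriving on a feed are matched to graphics templates and pushed to a renderer. The feed format, template strings and renderer below are hypothetical placeholders, not any broadcast graphics product's real interface.

```python
# Sketch: data-driven graphics triggered by an incoming feed.
# Feed events and templates are hypothetical stand-ins for a real
# broadcast graphics system's data binding.

def update_graphics(feed_events, render):
    """Push each incoming data event into the matching graphics template."""
    templates = {
        "score": "SCORE BUG: {home} {home_pts} - {away} {away_pts}",
        "weather": "SEVERE WEATHER: {alert} for {region}",
    }
    for event in feed_events:
        template = templates.get(event["type"])
        if template:  # unknown event types are simply ignored
            render(template.format(**event))


# Collect rendered strings instead of drawing to screen.
rendered = []
update_graphics(
    [
        {"type": "score", "home": "Rovers", "home_pts": 2,
         "away": "United", "away_pts": 1},
        {"type": "weather", "alert": "Tornado Warning", "region": "Dallas"},
    ],
    rendered.append,
)
print(rendered)
```

In a real system the renderer would drive an on-air graphics engine and the feed would arrive continuously, but the pattern of binding live data to templates is the same.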

While some believe virtual production and XR are only possible with LED screens, there is no technical reason why virtual content can’t be created using more traditional green-screen virtual sets, LED screens, or both at the same time.

ATSC 3.0 (NextGen TV)

Proponents of the NextGen TV system (ATSC 3.0) continued their quest to get broadcasters excited about a new terrestrial transmission system that requires consumers to buy new compatible TV receivers, and stations themselves to spend extra money, on a technology that has not been wholly embraced by the public.

NextGen TV is struggling to find its footing with U.S. broadcasters.


Those looking for 4K programming will find that nearly everything broadcast right now is just a simulcast of a station’s HD channels.

In October, 70 percent of U.S. homes had a NextGen TV signal available to them, but only 10,000 TV sets capable of receiving features like 4K resolution, HDR and interactive capabilities had been sold across the entire U.S., according to Pearl TV, one of the major groups supporting the new transmission standard. Another hurdle to widespread adoption arose in September, when LG Electronics announced it will not include ATSC 3.0 NextGen TV tuners in its 2024 TV lineup for the U.S. market. The decision came after LG lost a patent infringement lawsuit earlier in the year brought by Constellation Designs, LLC, over technology used in NextGen TV.

What remains to be seen is whether this new over-the-air standard will be enough to stem the tide of the increasing number of OTT services that continue to come online. 2024 will be a make-or-break year for the technology, as time appears to be running out for broadcasters to make it a viable business.

Conclusion

With budgets tight and layoffs continuing to be a concern, one media production professional described the current state of the industry as “healthy but concerned.” There’s lots of content being produced, but it has to reach more outlets in more faraway places than ever before for the myriad business models now being tried to work.

Software-centric systems and distributed architectures that include cloud and other delivery methods will continue to evolve, as they offer the flexibility to do more and change quickly if required. Choosing the right technology has always been the key to success. In 2023, it just got a bit harder to figure out.
