2022 HPA Tech Retreat Returns In Person

After a year of meeting virtually, the Hollywood Professional Association (HPA) is hosting its Technology Retreat conference in person this year, and members could not be happier. The highly anticipated gathering of the industry’s forward-looking technologists, who work at many of the largest companies in the U.S., is being held February 21-24 at the Westin Mission Hills Golf Resort & Spa in Rancho Mirage, Calif.

It’s four days in the sun hobnobbing with some of the country’s most brilliant broadcast, production, and post technology minds.

A wide range of topics will be spotlighted across the four-day event, from virtual production and cloud processing to HDR workflows, AI-based video codecs, Next-Gen TV, edge computing, virtual reality and more. The technology exhibit area will feature a number of live demonstrations and a few novelties: a complete camera (including optical system) the size of a grain of salt, a TV set you can taste, and a media-equipped refrigerator — from 1956.

Be sure to sign up for the event’s daily “roundtables,” each focused on a specific topic, with up to 30 on the agenda. For example, there’s a roundtable on “Broadcasting at the Edge.” But understand that space at the tables is filling up fast.

“This year we’re hosting an in-person-only event with extraordinary health & safety precautions,” said Mark Schubin, "Program Maestro" for the HPA Retreat. “Therefore, the most valuable part of the event might be the in-person networking.”

Indeed, past Retreats have seen the president of Sony Pictures sitting next to a senior member of the Sinclair Broadcast team, so you never know who you’ll meet (or see for the first time since the pandemic started).

As usual, Schubin will present his “Technology Year in Review” report, filled with colorful anecdotes and uncanny comparisons to technology from years ago.

Yet it also covers things up to the last minute, “so I can't tell you everything that will be in it, but it will include some wild -- almost unbelievable -- developments in satellites, cameras, and lenses at the very least. There will also be some revelations about carbon impact that I found quite interesting.”

Tuesday of the conference will feature a full agenda of sessions covering in-camera virtual production for VFX work. These sessions have been coordinated by co-chairman Erik Weaver, whose day job is running the Adaptive and Virtual Production department at the Entertainment Technology Center at USC. He and his team have been responsible for helping create the now standardized Digital Cinema and IMF formats that streamline feature film effects and live production work.

One session, “Introduction to the VAD and Final Pixel” (Tuesday, 9:25-10:15 am), will boast a stellar panel discussing the fundamentals of how a digital world is created using cloud-based Virtual Art Department (VAD) technology and Final Pixel processes that result in photo-realistic images. The workflow uses Unreal Engine to render images in real time.

The popular Technology Exhibits area will return again this year.

“With AI or volumetric video capture, it’s a whole bunch of cameras pointing in at the subject and then they are inserting that into a 360 world,” said Weaver. “This is a real human being on a stage that’s being photographed with all of your visual effects in real time through the camera. And it leverages parallax processing so that as the camera moves, everything moves in proper perspective in a real time nature.”
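
To make Weaver’s parallax point concrete: in-camera systems typically rebuild the virtual camera’s frustum every frame from the tracked camera position relative to the LED wall. The sketch below is a generic, hypothetical illustration of that idea using a standard off-axis (asymmetric) projection; it is not any vendor’s implementation, and the NumPy usage, function name, and wall dimensions are assumptions made only for the example.

```python
import numpy as np

def offaxis_projection(pa, pb, pc, pe, near, far):
    """Perspective projection for a planar screen viewed from a tracked position.

    pa, pb, pc: lower-left, lower-right, upper-left corners of the LED wall (world space)
    pe: tracked camera position (world space)
    """
    # Orthonormal basis of the wall plane
    vr = pb - pa
    vr = vr / np.linalg.norm(vr)            # wall "right"
    vu = pc - pa
    vu = vu / np.linalg.norm(vu)            # wall "up"
    vn = np.cross(vr, vu)
    vn = vn / np.linalg.norm(vn)            # wall normal, pointing toward the camera

    # Vectors from the camera to the wall corners, and distance to the wall plane
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)

    # Asymmetric frustum extents on the near plane
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard off-axis frustum matrix
    P = np.array([
        [2 * near / (r - l), 0.0,                (r + l) / (r - l),            0.0],
        [0.0,                2 * near / (t - b), (t + b) / (t - b),            0.0],
        [0.0,                0.0,                -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0,                0.0,                -1.0,                         0.0],
    ])

    # Rotate the world into the wall's basis, then translate by the camera position
    R = np.identity(4)
    R[0, :3], R[1, :3], R[2, :3] = vr, vu, vn
    T = np.identity(4)
    T[:3, 3] = -pe
    return P @ R @ T

# Example: a 6 m x 3 m wall in the x-y plane, camera tracked 0.5 m right of centre
# and 4 m back. As pe changes frame by frame, the matrix changes with it, which is
# what keeps perspective (parallax) correct through the physical lens.
pa = np.array([-3.0, -1.5, 0.0])
pb = np.array([ 3.0, -1.5, 0.0])
pc = np.array([-3.0,  1.5, 0.0])
pe = np.array([ 0.5,  0.0, 4.0])
print(offaxis_projection(pa, pb, pc, pe, near=0.1, far=100.0))
```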

Panelists include Kristin Turnipseed, Lux Machina Consulting; Felix Jorge, Happy Mushroom; Ben Baker, ETC; Dane Smith, The Third Floor; and Arvind Arumbakkam from Wacom. All are experts in the field and are currently participating in real-world trials of the innovative processes.

Next up, at 1:40 pm on Tuesday, is “Optimizing dvLED Performance for Virtual Production (Or Any In Camera Usage),” presented by Gary Feather, who will take participants on a journey through LED display technologies and the benefits of each type.

The displays used for virtual production or xR are referred to as dvLED (direct-view LED) displays. The system design and performance of the selected dvLED display can enhance or hinder its utility for a production, with the controller being an essential element. The presentation will discuss the optical, electrical, display and physical characteristics that must be considered.
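
Some of those characteristics can be framed with simple optics. As a rough, hypothetical illustration (a thin-lens approximation with numbers chosen only for the example, not material from the session), the sketch below estimates how many camera photosites a single LED pixel spans, one crude indicator of moiré risk when the wall is shot in camera:

```python
def photosites_per_led_pixel(pitch_mm, distance_m, focal_mm, sensor_width_mm, sensor_width_px):
    """Roughly how many sensor photosites one LED pixel spans when the wall is imaged."""
    d_mm = distance_m * 1000.0
    led_pixel_on_sensor_mm = pitch_mm * focal_mm / (d_mm - focal_mm)  # thin-lens magnification
    photosite_mm = sensor_width_mm / sensor_width_px                  # sensor sampling pitch
    return led_pixel_on_sensor_mm / photosite_mm

# Example: 2.8 mm pitch wall, 4 m from the camera, 35 mm lens, 36 mm wide sensor at 4096 px
ratio = photosites_per_led_pixel(2.8, 4.0, 35.0, 36.0, 4096)
print(f"~{ratio:.1f} photosites per LED pixel")
# The smaller this ratio and the sharper the wall is in focus, the more the LED grid
# can beat against the sensor grid and produce moire; a finer pitch, more camera-to-wall
# distance, or keeping the wall slightly defocused all reduce the risk.
```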

Also on Tuesday, from 4:05 pm to 4:30 pm, a session called “Final Sample - On Set Virtual Production Sound Challenges” will look at the acoustical equivalent of virtual production’s final pixel, called “final sample,” which is used to capture and deliver spoken on-set performances directly to audiences with minimal post processing or ADR. Acoustical camera technology visually demonstrates the challenges in 3D, including sound reflections and echoes. The session will be led by Eric Rigney, Executive Vice President of the Media & Entertainment Data Center Alliance, whose day job focuses on on-set virtual production technologies, workflows, and infrastructure.
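
The physics behind those on-set challenges is easy to illustrate. The snippet below is a minimal, hypothetical sketch (the distances are invented for the example, not taken from the session) of why a reflection off a hard surface such as an LED wall arrives only milliseconds after the direct sound, where it colours a “final sample” dialogue recording rather than reading as a distinct echo:

```python
SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at ~20 C

def reflection_delay_ms(direct_path_m, reflected_path_m):
    """Arrival delay of a reflection relative to the direct sound, in milliseconds."""
    return (reflected_path_m - direct_path_m) / SPEED_OF_SOUND_M_S * 1000.0

# Example: boom mic 1 m from the actor; a bounce off a nearby LED wall travels 4 m in total
print(f"{reflection_delay_ms(1.0, 4.0):.1f} ms after the direct sound")  # ~8.7 ms
```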

On Wednesday, February 23rd, 10:30 am – 11:15 am, Mobile TV Group’s Mark Chiolis will lead a panel entitled “Remote, Mobile, and Live Workflow Innovation Updates.” It will look at the effects of Covid on the industry and the many disruptive changes to live production workflows that have emerged. The panel will dive into a number of workflow innovations across sports, esports, entertainment, corporate, concert, awards, and other events that have become, or soon will become, part of accepted workflows. Panelists will share what’s new, what worked, what maybe didn’t, and how the way we all do “live production” will never be the same as it was.

Panelists include Scott Rothenberg, NEP Group; Phil Garvin, Mobile TV Group; Tony Cole, NFL Media; and Wileen Charles, Crown Media Family Networks/Hallmark Channel.

Other topics covered at the HPA Tech Retreat include the notion of systems integration in the cloud. Many have posited that once in the cloud, you no longer need a systems integrator. On Thursday at 11:50 am – 12:05 pm, Dave Van Hoy, president of Advanced Systems Group (San Francisco, Calif.), will host a session on “What Does Systems Integration Look Like in the Cloud.”

Matthew Goldman, now at the Sinclair Broadcast Group, will discuss “Broadcast HDR” at 4:30 pm – 5:00 pm on Wednesday.

Finally, at the close of each day of the conference, at 6:15 pm – 6:30 pm, Annie Chang and Leon Silverman will host “What Just Happened?”, a review of the day.

As in years past, HPA members will come from all parts of the globe to offer their insight. Register quickly: hotel space has already sold out, and registration, which is capped, is sure to fill up.
