Virtual SMPTE 2020 Targets Global Community With Interactive Platform

Like most industry gatherings this year, the 2020 SMPTE show is virtual, running November 10-12, complete with an interactive environment that incorporates a main conference hub, meeting rooms, theater space for sessions and the annual awards gala, and an exhibition hall with private meeting spaces. Many of the events, in addition to the Awards Gala, have been pre-recorded so attendees can view them at their leisure.

Veteran technologist Bruce Devlin, who also serves as Standards Vice President of SMPTE (the Society of Motion Picture and Television Engineers), said going virtual will allow the conference to be truly global, enabling more people from distant places to participate in the annual conference, which has been held in Los Angeles for the past several years.

“We wanted to make certain this was a global show that is a bit different than our physical shows in the past,” he said. “Previously we’ve had attendees from around the world, but the percentage compared to U.S. attendees was always very low. Because this is a virtual event, we wanted to show this inclusivity by making sure that there are sessions available during the night for Europeans and in prime time for the Asian community. So everyone who wants to can participate.”

The show’s creators also wanted to give their online platform a unique look and feel, and the highly interactive interface, with 3D visuals developed by Storycraft Lab in New York from an original design concept by SMPTE staff, straddles the line between an eSports gaming environment and the traditional media world. Attendees can click and move through its various areas to reach the session or vendor demo they want. Content will remain online for 30 days after the 12th for paid attendees.

The online platform features a series of gaming-type portals that lead to various sessions and tutorials.

One of the big topics this year will be remote production, a method of producing content from a distance, including a session that will discuss the required connectivity and what remote production actually means in the real world. Another will highlight how to control latency across the various networks involved in a production.

Devlin himself will be part of a panel discussion on Tuesday, Nov. 10 from 3:15 to 3:45 PM entitled “Are standards still relevant in a post-COVID world?,” along with Thomas Bause Mason. The session will look at the changing face of the media technology industry and explore a future in which open source, standards and cloud fulfill different, but essential, roles in creating an interoperable global ecosystem.

During this opening session Devlin and Mason will also discuss how SMPTE’s future relies on good relations with other technology and standards groups, such as ATSC, IEEE and DVB, in order to develop universal standards and foster collaboration among these organizations.

“The pipeline between the photon going into the camera and the photon coming at you off the screen is getting shorter and shorter,” said Devlin. “So, we’re focused on building a better relationship with organizations that have similar goals of helping users develop better production and distribution methods.”

Speaking of standards, SMPTE recently published its VC-6 encoding standard and its sibling standard, MPEG-5 LCEVC. Both of these encoding methods, which together span everything from top-end, lossless, very-high-quality production down to enhanced distribution, will be discussed during the online show.

Machine Learning (ML) will be part of a keynote presentation given by Anima Anandkumar on Tuesday, Nov. 10 at 5:00 PM, which will look at the role of ML in the media industry.

Attendee collaboration is a big part of what SMPTE is trying to accomplish with its online conference.

In fact, SMPTE has formed a new task force on ML and artificial intelligence (AI) in the media space and is working with the Entertainment Technology Center at the University of Southern California’s (USC) School of Cinematic Arts to develop new workflow processes. A Standards Session entitled “SMPTE Task Force” will be presented by Yves Bergquist and Frederick Walls on Wednesday, Nov. 11 at 3:00 PM.

ML is also a topic that is near to Devlin’s heart: in his other role as owner of the MR MXF consultancy, he has been working with clients to figure out how to harness its power to streamline a wide variety of workflows.

“I think there’s a whole bunch of stuff that’s really hard for humans to do that machines can accomplish so much faster and more accurately,” he said. “So, generating metadata to identify what actor is in a scene or whatever, that’s got to be done by humans today. There’s no reason that can’t be done by machines.”
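As a purely illustrative sketch of that kind of automated actor tagging, and not anything SMPTE or Devlin has published, the idea might look like this in Python, assuming the open-source face_recognition library and a set of pre-computed reference encodings for known cast members:

```python
# Illustrative only: automated actor tagging of the kind described above.
# Assumes the open-source "face_recognition" library and a dictionary of
# reference face encodings for known cast members (both are assumptions).
import face_recognition

def tag_actors_in_frame(frame_path, known_encodings):
    """Return the names of known actors detected in a single frame."""
    image = face_recognition.load_image_file(frame_path)
    detected = face_recognition.face_encodings(image)

    names = []
    for encoding in detected:
        for name, reference in known_encodings.items():
            # compare_faces returns one boolean per reference encoding
            if face_recognition.compare_faces([reference], encoding)[0]:
                names.append(name)
    return names

# Hypothetical usage:
# known = {"Actor A": encoding_a, "Actor B": encoding_b}
# print(tag_actors_in_frame("scene_042_frame_0001.jpg", known))
```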

Some of his clients are using ML for things like video filter optimization. So, if you want to upconvert an image, you’d typically take an old SD image that’s interlaced and process it using a filter. The problem is that different countries have different frame rates, so the content distributor has to find the right filter to get the best image quality.

“Upconverting that to progressive 4K images that do not look terrible is quite hard,” said Devlin. “What you end up doing is experimenting with 40 or 50 different filters and trying to figure out which one looks best. It’s really tedious for a human being to look at hours of content and judge which ones are best. Alternatively, we can teach a machine learning algorithm what ‘best’ looks like, give it 40 different filters and it will apply them all to the images in a fraction of the time it would take a human.”
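A minimal sketch of that filter shootout, assuming OpenCV supplies the candidate upconversion filters and a hypothetical quality_model stands in for the trained “what best looks like” scorer:

```python
# Sketch of the filter-shootout idea: apply every candidate upconversion
# filter to a frame and let a trained quality model rank the results.
# The candidates here are OpenCV interpolation modes; "quality_model" is a
# hypothetical learned scorer, not a real library call.
import cv2

CANDIDATE_FILTERS = {
    "bilinear": cv2.INTER_LINEAR,
    "bicubic": cv2.INTER_CUBIC,
    "lanczos": cv2.INTER_LANCZOS4,
    # ...in practice there might be 40 or 50 candidates, including deinterlacers
}

def pick_best_filter(sd_frame, quality_model, target_size=(3840, 2160)):
    """Upconvert one SD frame with every candidate filter and return the
    name of the filter the learned quality model scores highest."""
    scores = {}
    for name, interpolation in CANDIDATE_FILTERS.items():
        uhd_frame = cv2.resize(sd_frame, target_size, interpolation=interpolation)
        scores[name] = quality_model.score(uhd_frame)  # assumed scorer interface
    return max(scores, key=scores.get)
```

The machine screens every candidate; a human only needs to spot-check the winner.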

He’s also excited about metadata and the increased role it will play in the future of the broadcast industry. Embedding metadata allows the user to deliver entertainment in new and interesting ways, and, he said, it will only become more common as we move forward. Devlin is trying to figure out how to take exotic on-set metadata and get it into the production pipeline at minimum cost. He tested the technique while recording a concert in Germany this summer.
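One purely hypothetical way to carry such on-set metadata through a pipeline cheaply is as a sidecar JSON file, keyed by timecode, that travels with each clip; the field names below are invented for the example and are not a SMPTE schema or Devlin’s actual format:

```python
# Illustrative sidecar-metadata example: on-set measurements stored per
# timecode in a JSON file that accompanies the media file downstream.
import json

clip_metadata = {
    "clip_id": "concert_cam01_take03",
    "samples": [
        {"timecode": "01:02:03:00", "lens_focal_length_mm": 35, "camera_tilt_deg": 2.5},
        {"timecode": "01:02:04:00", "lens_focal_length_mm": 35, "camera_tilt_deg": 3.1},
    ],
}

with open("concert_cam01_take03.metadata.json", "w") as sidecar:
    json.dump(clip_metadata, sidecar, indent=2)
```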

“I hope that metadata gains some respect and is not overlooked like a third-class citizen, as it has been,” he said. “Metadata has always been the poor stepchild. But metadata can be used to deliver better content, to select better filters, to correlate different bits of a production. Moving metadata to the front of the line will have a massive impact on how we create content and how we consume content.”

“In general this show will be talking about the core concept of what it means to be a broadcaster these days,” said Devlin. “Part of what we’re trying to do is make certain that those that haven’t been paying enough attention to what is going on can easily find the information they need.”
