Broadcast Graphics in the Eye of Brainstorm

Brainstorm’s eStudio technology allowed Italian broadcaster RAI to incorporate live graphics for “The Voice.”
For the television viewer, no production technology has advanced as significantly, or improved the viewing experience as profoundly, as live graphics. Helpful information that was once displayed as simple text and titles has become a whirlwind of motion effects, 3D elements and augmented and virtual reality (AR/VR) images that bring new life to a news or entertainment show, increase audience retention and boost ratings. We’re seeing new and better graphics with virtually every major broadcast, and that’s raising audience expectations for everything that follows.
Brainstorm, a graphics systems provider headquartered in Madrid, Spain, has been at the forefront of this visual evolution, providing ever more complex graphics systems for a wide range of telecasts, from live sports and entertainment to real-time election coverage in the U.S. and abroad. In fact, for his final college engineering project in the late 1980s, company founder and CEO Ricardo Montesa developed specialized software (called “IPF”) to support the on-air display of real-time 2D and 3D titles and graphics. By 1993, he had founded his own company, called “Montesa Grafics,” and begun refining his product to meet the demands of major broadcast TV.
The following year, after countless trials and conversations with broadcasters in Spain, Montesa renamed his company Brainstorm and introduced its first product, virtual set software called eStudio, which many consider a benchmark in 3D graphics rendering. It was also in 1994 that Brainstorm supported what it says was the first real-time 3D virtual set ever used in a live production in the history of television. The event featured a live interview on Antena 3 TV in Spain with Mike Oldfield to promote his album “The Songs of Distant Earth.” The well-known musician sat in a virtual set studio at Antena 3 TV in Madrid, and viewers saw a simulated spaceship surrounded by planets while journalists situated in the Madrid Planetarium, miles away, interviewed him.
Two days after this milestone live show, on November 12, 1994, Brainstorm started work on a daily live weather show in Spain, the first complex daily show of its kind to combine a 3D virtual set, live talent, 3D graphics, live camera movements and live weather data feeds in the same production.
“This technology represented a step beyond traditional chroma keying,” Montesa said recently, adding that the show received a European Award for best weather presentation. “The introduction of 3D real-time graphics and complex camera movements significantly enhanced the show’s presentation and impact, and required an SGI Onyx to provide the computing power needed to render all the graphics and camera movements while integrating the talent on the set.”
In 2009, Brainstorm bought a character generation company called Aston, which was already using eStudio as the compositing system for a new 3D CG product it was developing. The acquisition proved to be a shrewd move, as Brainstorm not only gained new technology but also added a team with extensive knowledge of character generation and live production applications. That expertise proved invaluable in furthering the development of the eStudio engine, enhancing its combination of reliability and simplicity of use while adding live production-oriented features.
The acquisition helped herald the introduction of the Aston 3D system in 2013, which built on the Aston heritage of reliability and simplicity and the graphics power of eStudio to create a new product suited to both design and production environments. Brainstorm has since created an entirely new Aston product line aimed at 2D/3D motion graphics, and the eStudio render engine has become the core processing system for all of Brainstorm’s products.
Fast forward to last season’s “The Voice” in Italy, where broadcaster RAI incorporated live graphics into the program in ways never before seen on live television. RAI used eStudio for “The Voice,” along with other Brainstorm augmented reality applications for its other shows and major event coverage, such as the 2016 Summer Games in Rio.

Brainstorm has incorporated Aston technology into many of its new products, such as its InfinitySet 3 virtual set technology.
For “The Voice,” RAI used specifically designed augmented reality (AR) graphics for each song and singer, giving the audience an immersive experience of flying words, striking scenes of modern architecture and animated emojis, all perfectly integrated and synchronized with the live video. While artists performed on the theater’s large stage, the audience at home saw them surrounded by synthetic elements that did not affect the singing performance but significantly enhanced the viewing experience.
To achieve these effects, RAI used Brainstorm’s eStudio with tracked cameras and a crane fitted with motion sensors that provided a series of sweeping flyovers across the real stage. The eStudio systems received the live feeds and tracking data from the cameras and rendered the final composited output, including all of the AR objects and animations.
“The experience would not be realistic if the perspective was not correctly matched with the camera view,” Montesa said. “At home, the audience received complex shots with camera flyovers where the AR objects integrated seamlessly with the real background because of their correct perspective and finely tuned surface and material management.”
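The principle behind that perspective matching can be sketched in a few lines of code. The example below is a generic illustration under assumed camera values, not Brainstorm’s pipeline: a virtual 3D point is projected into the image using the tracked camera pose and lens data, which is why an AR object lands at the correct pixel for whatever the camera is doing.

import numpy as np

def project_point(point_world, R, t, fx, fy, cx, cy):
    # Project a 3D world point into pixel coordinates for a tracked camera.
    # R (3x3 rotation) and t (3-vector translation) come from the tracking system;
    # fx, fy, cx, cy are lens intrinsics (focal lengths and principal point, in pixels).
    p_cam = R @ point_world + t            # world -> camera coordinates
    x, y, z = p_cam
    u = fx * x / z + cx                    # perspective divide plus intrinsics
    v = fy * y / z + cy
    return u, v

# Example: a virtual object 5 m in front of the camera, slightly left and above center.
R = np.eye(3)                              # camera looking straight down +Z
t = np.zeros(3)
u, v = project_point(np.array([-0.5, 0.3, 5.0]), R, t,
                     fx=1800.0, fy=1800.0, cx=960.0, cy=540.0)
print(f"draw the AR object at pixel ({u:.1f}, {v:.1f})")

If the tracked pose or lens data is wrong, u and v drift away from where the real background says the object should be, which is exactly the mismatch Montesa describes.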
Brainstorm also now offers InfinitySet, a standalone solution for graphics and virtual set applications. It combines the company’s patented TrackFree technology with a series of advanced features, such as TeleTransporter, 3D Presenter, HandsTracking and VirtualGate, that help blur the line between the real and virtual worlds.
With the TeleTransporter feature, InfinitySet can use real, live or pre-recorded footage as the background set for the chroma-keyed talent. This lets remote talent become a virtual traveler, “teletransported” to any location at any time, while seamlessly interacting with real and virtual elements.
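Conceptually, this is chroma keying composited over a different background. The snippet below is a minimal numpy sketch of that idea under assumed keying math and threshold values; it is not Brainstorm’s keyer.

import numpy as np

def chroma_key_composite(foreground, background, softness=40.0):
    # foreground, background: HxWx3 uint8 RGB frames of the same size.
    fg = foreground.astype(np.float32)
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    green_excess = g - np.maximum(r, b)    # large where the green screen dominates
    alpha = np.clip(1.0 - green_excess / softness, 0.0, 1.0)[..., None]
    out = alpha * fg + (1.0 - alpha) * background.astype(np.float32)
    return out.astype(np.uint8)

# Usage: key each studio frame of the talent over the remote-location footage.
# frame_out = chroma_key_composite(studio_frame, location_frame)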
“This interaction is made possible by the 3D Presenter feature, which transforms the talent into a full 3D object, a real-time 3D volume that is continuously regenerated, repositioned and remapped based on the camera parameters,” Montesa said. “What Brainstorm has developed with the 3D Presenter feature is a technology that allows the real-time extrusion of the presenter’s video layer to create a 3D object with real volume and correct shape in the virtual set. The chroma-keyed character is therefore no longer a video layer but an extruded 3D object, which can interact properly with the virtual elements in the scene (casting real shadows, intersecting correctly with the augmented reality objects…) because it is an object in the scene.”
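As a rough sketch of that extrusion idea (an assumed approach for illustration, not Brainstorm’s implementation), the keyed talent’s alpha matte can be traced and pushed back into the set as a thin prism, giving the layer real depth for shadows and intersections:

import cv2
import numpy as np

def extrude_matte(alpha_mask, depth=0.3, scale=0.001):
    # alpha_mask: HxW uint8 matte, 255 where the talent is.
    # Returns vertices (at an assumed pixel-to-metre scale) and side-wall quads.
    contours, _ = cv2.findContours(alpha_mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    outline = max(contours, key=cv2.contourArea).reshape(-1, 2)   # largest silhouette
    n = len(outline)
    front = [(x * scale, y * scale, 0.0) for x, y in outline]     # camera-facing face
    back = [(x * scale, y * scale, depth) for x, y in outline]    # pushed into the set
    vertices = np.array(front + back, dtype=np.float32)
    faces = [(i, (i + 1) % n, n + (i + 1) % n, n + i) for i in range(n)]
    return vertices, faces

The resulting mesh can then be lit, shadowed and depth-tested by the renderer like any other object in the virtual set, which is what lets the presenter cast shadows and be occluded correctly.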

InfinitySet can use external render engines such as Unreal Engine integrated with the Brainstorm eStudio render engine to allow virtual and AR graphics elements to be integrated into a live scene.
Other new features, like VirtualGate, allow the presenter in the virtual set to walk into a virtual screen or into the featured news (which can be shown on a virtual video wall or separate displays mounted on a production set) and become part of the video itself with full realism. The talent enters and exits the video with accurately matched perspective.
“And once inside the clip, it behaves correctly in terms of spatial reference and with the inclusion of realistic shadows, defocus etc.,” he said.
On top of that, InfinitySet can use external render engines such as Unreal Engine integrated with the Brainstorm eStudio render engine. This allows InfinitySet not only to show fully rendered, realistic background scenes, but also to integrate graphics elements into the final scene, such as 3D motion graphics, lower-thirds, tickers, CG and many other elements.
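A minimal way to picture that layering, using a generic “over” composite (the order and variable names are illustrative assumptions, not a Brainstorm API): the externally rendered background goes first, then the keyed talent, then the CG overlays.

import numpy as np

def over(base, layer_rgb, layer_alpha):
    # Standard 'over' composite; layer_alpha is HxW in [0, 1].
    a = layer_alpha[..., None]
    return (a * layer_rgb + (1.0 - a) * base).astype(base.dtype)

# frame = background_render                      # scene from the external render engine
# frame = over(frame, talent_rgb, talent_alpha)  # keyed presenter
# frame = over(frame, ticker_rgb, ticker_alpha)  # lower-thirds, tickers and other CG last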
“Aston is our 2D/3D motion graphics creation solution, and features many highly advanced technologies that allow designers to create any kind of graphic, static or animated, or even driven by external data feeds,” Montesa said. “InfinitySet has always been 100 percent compatible with Aston projects, meaning these can be seamlessly placed into the virtual set, relocated in space, resized, etc. This provides additional flexibility when creating virtual set programs, as it allows adding graphics, augmented reality and CG (tickers, lower thirds, etc.) directly from InfinitySet, and treating those elements as 3D objects in the set.”
At the 2017 NAB Show, Brainstorm introduced InfinitySet 3, which allows users to directly edit not only projects but also their individual elements, animations, StormLogic and data feeds as if they were in Aston, or even to create new elements within the composition, change their attributes and much more. To achieve this, InfinitySet now includes Aston’s graphics toolset, so it can manipulate any object in the project, along with its properties, attributes and animations. The toolset lets InfinitySet users create any motion graphic element from scratch if required, which means that during setup for intensive broadcast operations, or while on-air, operators can adjust any object in the scene, even those with added properties such as animations or external data feeds.
In the end, Montesa said, his team has spent “thousands of hours” ensuring that its products are reliable and easy to use. In fact, others in the company say Montesa has mandated that every system command be executable in three keystrokes or fewer.

InfinitySet now includes Aston’s graphics toolset so it can manipulate any object in a project, its properties, attributes and animations.
“We have spent a lot of time with operators trying to understand the best way to control graphics in different scenarios like live production, news, quiz shows, etc., all of them with specific operation requirements,” Montesa said. “Sometimes you need a rundown, maybe fast random access to pages or, why not, a bespoke panel with buttons for a quiz show. But at the end of the day, control devices need to be quick and reliable, and to achieve that we established a simple rule: any command has to be carried out in less than three keystrokes, and this is valid for both mouse and keyboard.”
So, what graphics technology is Brainstorm looking to develop for the future? Montesa said that the goal for his team is to push the limits of current GPU technology and provide new software features to create the most realistic output possible.
“Viewers at home must receive good-looking, good-quality content, and when mixing real and virtual worlds, the continuity between them, in terms of image quality and the seamless integration of all the elements involved, is paramount,” he said. “In terms of image quality, we are already supporting 4K real-time 3D graphics and augmented reality, and are at the forefront of the 8K developments that some broadcasters are undertaking in different parts of the world.
“On top of that,” he said, “we are always looking for new possibilities to deliver content, and that is why at NAB we announced support for ‘Mixed Reality’ [VR] hardware such as Oculus Rift. By supporting this new approach to immersive technologies, Brainstorm wants to be instrumental in changing the game on how content is consumed at home. By using Mixed Reality, broadcasters aim to revolutionize content consumption at home by allowing the audience to immerse themselves in the broadcast content, whether it is news, sports or entertainment. If this is so, we want to be part of it.”