StarTracker VFX from Mo-Sys

Famed for camera tracking and robotic technology, the Mo-Sys StarTracker VFX system reports the position and orientation of a studio camera in real time to a 3D graphics rendering engine.

“Camera tracking is really the marriage of imaging technology with an artificial, computer-generated world,” Michael Geissler, founder of Mo-Sys, began our interview direct from London. “But after we made our mark first with broadcasters and then with most of the major Hollywood studios, we started to think that there must be an easier way.”

What they came up with, the StarTracker VFX system, takes a bit of explaining, but if you’ve seen the video at the head of this article you will see that it’s worth the effort.

You lift a handheld monitor, like a laptop screen, in front of you on a green screen stage, and on it you see the c.g. world in the background with the actors, as their 3D-modeled characters, composited into the scene in real time. Then you can move into, around and through the scene, holding that screen like a camera viewfinder, and record what you are seeing.

Camera mounted with the StarTracker VFX sensors.

Once Geissler started describing the StarTracker VFX system to me, I realized I had seen this before.

In March of 2012, on a press tour of Industrial Light & Magic, George Lucas’s VFX factory, the journalist gaggle had been brought onto a special stage to see something unique. It was the “Virtual Camera” that James Cameron and his Director of Photography, Vince Pace, had invented for the filming of “Avatar”.

Having interviewed both Cameron and Pace before the film’s release, and having researched this remarkable technology of theirs, once I saw the actual thing at ILM I had to count to 10 before jumping at the chance to get my hands on what may have been one of the very Virtual Cameras that had soared over the Hallelujah Mountains of Pandora.

Jay holding a Virtual Camera at ILM in 2012. Film buffs will recognize the c.g. background from "Rango".

The Cameron/Pace invention cost many millions of dollars to realize, but as Geissler told me, “We know what we have done is not new. However, we’ve made this kind of advanced motion capture technology affordable for even smaller production and post production houses to use.”

While shooting “Avatar”, Cameron’s crew had to surround their green screen sets with very expensive camera sensors and link them all to massive central computers.

The Mo-Sys approach is brilliantly simpler.

First, you paste small, identical retroreflective star-shaped stickers all over the ceiling of the studio. The positions of these stars don’t even have to conform to any set pattern, and they are so small that they’re pretty hard for the eye to see.

A green screen set with the StarTracker VFX camera.

The stars are then illuminated by LEDs on a small sensor mounted on the studio camera, the key to the StarTracker system, which tracks that star map and reports the camera's position and orientation to the rendering engine in real time.
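Mo-Sys hasn't published its tracking math, but the underlying principle, recovering a camera's pose from a fixed constellation of known markers, can be sketched in miniature. The toy function below (the names and the 2D simplification are mine, not Mo-Sys's) takes the known star map and the positions at which the sensor currently sees those stars, and recovers the rotation and translation with a least-squares rigid alignment:

```python
import math

def estimate_pose_2d(star_map, observed):
    """Recover rotation (radians) and translation from matched star
    positions, via a least-squares rigid alignment (2D Kabsch fit)."""
    n = len(star_map)
    # Centroids of the reference map and the observed points.
    pcx = sum(x for x, _ in star_map) / n
    pcy = sum(y for _, y in star_map) / n
    qcx = sum(x for x, _ in observed) / n
    qcy = sum(y for _, y in observed) / n
    # Accumulate cross (sin) and dot (cos) terms of centered points.
    num = den = 0.0
    for (px, py), (qx, qy) in zip(star_map, observed):
        dpx, dpy = px - pcx, py - pcy
        dqx, dqy = qx - qcx, qy - qcy
        num += dpx * dqy - dpy * dqx
        den += dpx * dqx + dpy * dqy
    theta = math.atan2(num, den)
    # Translation maps the rotated map centroid onto the observed one.
    c, s = math.cos(theta), math.sin(theta)
    tx = qcx - (c * pcx - s * pcy)
    ty = qcy - (s * pcx + c * pcy)
    return theta, (tx, ty)
```

A real system solves the 3D version of this, matches each sighted star to the map automatically, and re-runs the fit every frame, but the core idea, turning matched marker positions into a camera pose, is the same.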

This is one area where Geissler’s wizards are taking advantage of advances since the days of “Avatar” by using the vastly improved rendering power of the Unreal Engine developed by Epic Games.

In the past, little reflective balls were also placed on the actors' bodies so that the computer could track them too and composite them into the scene.

One great advantage is that since the tracking sensor is pointed toward the ceiling, it does not get confused by moving objects, set changes, lighting configurations or reflections.

“At IBC in the Future Zone we showcased one step we took that goes further than what Cameron did,” Geissler continued. “Inside the actors’ suits we now position miniature MEMS (Microelectromechanical Systems) inertial sensors made by Xsens that send positioning information back to the computer, sort of like tiny gyroscopes. This gives the rendering engine data that no longer has to be cleaned up, a step that cost so much time in the Virtual Camera approach.”
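Xsens's actual fusion algorithms are proprietary, but the general idea of combining fast, drift-prone inertial readings with a slower absolute reference (here, the optical star fix) can be illustrated with a textbook complementary filter. This is a common generic technique, not Mo-Sys's or Xsens's published method:

```python
def complementary_filter(gyro_rates, optical_angles, dt=0.01, alpha=0.98):
    """Fuse gyro angular rates (fast but drifting) with absolute optical
    angles (slow but drift-free). alpha near 1 trusts the gyro in the
    short term; the optical term continuously bleeds off accumulated drift."""
    angle = optical_angles[0]
    fused = []
    for rate, optical in zip(gyro_rates, optical_angles):
        # Integrate the gyro, then nudge toward the absolute optical fix.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * optical
        fused.append(angle)
    return fused
```

With a biased gyro, integrating the rates alone drifts without bound, while the fused estimate settles at a small bounded error, which is exactly why pairing inertial sensors with an absolute optical reference yields data that needs little cleanup.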

The ultra small Xsens inertial sensor.

One reason that the StarTracker VFX system is catching on so widely is that once the constellation of star stickers has been established, you never need to re-calibrate anything. It just works when you need it and can be ignored when you don’t.

If you want to use the system outdoors, the stars can be positioned on a carpet that gets rolled out on location, and then the camera’s sensors simply look down.

Inside the StarTracker VFX toolbox are three units: the main camera, the Xsens suits with their mini sensors, and the viewfinder screen.

“The beauty is that all of these pieces reference the same star constellation,” Geissler said, “and since everything relates to each other that takes a lot of the ambiguity out of really advanced special effects.”

He also told me that at IBC for the first time attendees saw a StarTracker VFX system built right inside of a Grass Valley camera.

“At Mo-Sys we are working to move Virtual Reality production out of a niche industry into mainstream filmmaking,” Geissler finished up.

How the StarTracker VFX system works
