StarTracker VFX from Mo-Sys

Famed for camera tracking and robotic technology, the Mo-Sys StarTracker VFX system reports the position and orientation of a studio camera in real time to a 3D graphics rendering engine.

“Camera tracking is really the marriage of imaging technology with an artificial, computer-generated world,” Michael Geissler, founder of Mo-Sys, began our interview, speaking from London. “But after we made our mark first with broadcasters and then with most of the major Hollywood studios, we started to think that there must be an easier way.”

What they came up with, the StarTracker VFX system, takes a bit of explaining, but if you’ve watched the video at the head of this article you’ll know it’s worth the effort.

You lift a handheld monitor, about the size of a laptop screen, in front of you on a green-screen stage, and on it you see the c.g.-created world in the background with the actors, as their 3D-modeled characters, composited into the scene in real time. Then you can move into, around and through the scene, holding that screen like a camera viewfinder, and record what you are seeing.
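The real-time compositing described above rests on chroma keying: pixels that read as green screen are swapped for the rendered c.g. background. A minimal sketch of the idea, in Python with NumPy, is below. This is an illustration only, not Mo-Sys’s implementation; the threshold rule and function name are my own assumptions.

```python
import numpy as np

def composite_green_screen(frame, cg_background, green_threshold=100):
    """Replace strongly green pixels in `frame` with `cg_background`.

    Toy chroma key: a pixel counts as green screen when its green
    channel dominates red and blue by more than `green_threshold`.
    Production keyers are far more sophisticated (spill suppression,
    soft edges), but the core substitution looks like this.
    """
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    mask = (g - np.maximum(r, b)) > green_threshold  # True where background shows
    out = frame.copy()
    out[mask] = cg_background[mask]
    return out
```

A real system runs this (on the GPU) for every frame, with the c.g. background rendered from the tracked camera’s point of view.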

Camera mounted with the StarTracker VFX sensors.

Once Geissler started describing the StarTracker VFX system to me, I realized I had seen this before.

In March of 2012, on a press tour of Industrial Light & Magic, George Lucas’s VFX factory, the journalist gaggle had been brought onto a special stage to see something unique. It was the “Virtual Camera” that James Cameron and his Director of Photography, Vince Pace, had invented for the filming of “Avatar”.

Having interviewed both Cameron and Pace before the film’s release, and researched the remarkable technology they came up with, I had to count to 10 when I saw the actual thing at ILM before jumping at the chance to get my hands on what may have been one of the very Virtual Cameras that had soared over the Hallelujah Mountains of Pandora.

Jay holding a Virtual Camera at ILM in 2012. Film buffs will recognize the c.g. background from "Rango".

The Cameron/Pace invention cost many millions of dollars to realize. “We know what we have done is not new,” Geissler told me. “However, we’ve made this kind of advanced motion capture technology affordable for even smaller production and post-production houses.”

While shooting “Avatar”, Cameron’s crew had to surround their green screen sets with very expensive camera sensors and link them all to massive central computers.

The Mo-Sys approach is brilliantly simpler.

First, you paste small, identical retroreflective star-shaped stickers all over the ceiling of the studio. The positions of these stars don’t even have to conform to any set pattern, and they are so small that they are pretty hard for the eye to see.
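The reason a random scattering works is that any sufficiently distinctive constellation, once surveyed, pins down where the sensor is. As a deliberately simplified sketch (my own toy, not the Mo-Sys algorithm), suppose the star map is 2D, correspondences between mapped and observed stars are already known, and the sensor has only translated. The least-squares offset is then just the mean displacement:

```python
import numpy as np

def estimate_offset(map_stars, seen_stars):
    """Recover a 2-D translation of the sensor relative to the star map.

    Toy version: star correspondences are assumed known, so the offset
    is the mean displacement between matched pairs (the least-squares
    solution for a pure translation). The real problem also solves for
    rotation and does the matching itself, but the principle is the
    same: known fixed stars anchor the moving sensor.
    """
    return (np.asarray(seen_stars, float) - np.asarray(map_stars, float)).mean(axis=0)
```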

A green screen set with the StarTracker VFX camera.

The stars are then illuminated by a small LED-lit sensor unit mounted on the studio camera, the key to the StarTracker system, which reads that star map and reports the camera’s position and orientation to the rendering engine in real time.
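What gets sent to the rendering engine each frame is essentially a small pose record: position, orientation, and a frame count for synchronization. The sketch below is a hypothetical layout of my own (real tracking systems typically use compact binary protocols such as FreeD over UDP or serial, not JSON):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class CameraPose:
    """One per-frame tracking report: where the camera is and where it points."""
    x: float
    y: float
    z: float       # position in studio space (metres)
    pan: float
    tilt: float
    roll: float    # orientation (degrees)
    frame: int     # frame counter, so graphics stay in sync with video

    def to_packet(self) -> bytes:
        # JSON for readability in this sketch; a production protocol
        # would pack these fields into a fixed-size binary message.
        return json.dumps(asdict(self)).encode()
```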

This is one area where Geissler’s wizards are taking advantage of advances since the days of “Avatar” by using the vastly improved rendering power of the Unreal Engine developed by Epic Games.

In the past, little reflective balls were also put on the actors’ bodies, so the computer could track them too and composite them into the scene.

One great advantage is that since the tracking sensor is pointed toward the ceiling, it does not get confused by moving objects, set changes, lighting configurations or reflections.

“At IBC in the Future Zone we showcased one step we took that goes further than what Cameron did,” Geissler continued. “Inside the actors’ suits we now position miniature MEMS (Microelectromechanical Systems) inertial sensors made by Xsens that send positioning information back to the computer sort of like tiny gyroscopes. This gives the rendering engine data that no longer has to be cleaned up, which cost so much time back in the Virtual Camera approach.”

The ultra small Xsens inertial sensor.

One reason that the StarTracker VFX system is catching on so widely is that once the constellation of star stickers has been established, you never need to re-calibrate anything. It just works when you need it and can be ignored when you don’t.

If you want to use the system outdoors, the stars can be positioned on a carpet that gets rolled out on location, and then the camera’s sensors simply look down.

Inside the StarTracker VFX toolbox are three units: the main camera, the Xsens suits with their mini sensors, and the viewfinder screen.

“The beauty is that all of these pieces reference the same star constellation,” Geissler said, “and since everything relates to each other that takes a lot of the ambiguity out of really advanced special effects.”
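Geissler’s point about a shared reference is that every device reports its pose against the same star-map origin, so all measurements land in one consistent space. A minimal 2D sketch of that idea (my own illustration, under the assumption of a planar world with a heading angle) is a single rigid transform from a device’s local frame into the shared world frame:

```python
import math

def local_to_world(point, device_pose):
    """Map a point from a device's local frame into the shared world frame.

    `device_pose` is (x, y, heading_radians): the device's position and
    orientation relative to the star-map origin. Because the camera, the
    mocap suits and the viewfinder all express their poses against that
    same origin, a point seen by any one of them can be placed in the
    same world coordinates as the others.
    """
    px, py = point
    dx, dy, heading = device_pose
    c, s = math.cos(heading), math.sin(heading)
    return (dx + c * px - s * py, dy + s * px + c * py)
```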

He also told me that at IBC for the first time attendees saw a StarTracker VFX system built right inside of a Grass Valley camera.

“At Mo-Sys we are working to move Virtual Reality production out of a niche industry into mainstream filmmaking,” Geissler finished up.

How the StarTracker VFX system works
