Famed for camera tracking and robotic technology, the Mo-Sys StarTracker VFX system reports the position and orientation of a studio camera in real time to a 3D graphics rendering engine.
“Camera tracking is really the marriage of imaging technology with an artificial computer generated world,” Michael Geissler, founder of Mo-Sys, began our interview, speaking directly from London. “But after we made our mark first with broadcasters and then with most of the major Hollywood studios, we started to think that there must be an easier way.”
What they came up with, the StarTracker VFX system, takes a bit of explaining, but if you watch the video at the head of this article you’ll see it’s worth the effort.
You hold a hand-held monitor, about the size of a laptop screen, in front of you on a green screen stage, and on it you see the c.g. created world in the background with the actors, as their 3D-modeled characters, composited into the scene in real time. Then you can move into, around and through the scene, holding that screen like a camera viewfinder, and record what you are seeing.
Camera mounted with the StarTracker VFX sensors.
Once Geissler started describing the StarTracker VFX system to me, I realized I had seen this before.
In March of 2012, on a press tour of Industrial Light and Magic, George Lucas’s VFX factory, the journalist gaggle had been brought into a special stage to see something unique. It was the “Virtual Camera” that James Cameron and his Director of Photography, Vince Pace, had invented for the filming of “Avatar”.
Having interviewed both Cameron and Pace before the film’s release, and researched this remarkable technology they came up with, once I saw the actual thing at ILM I had to count to 10 before jumping at the chance to get my hands on what may have been one of the very Virtual Cameras that had soared over the Hallelujah Mountains of Pandora.
Jay holding a Virtual Camera at ILM in 2012. Film buffs will recognize the c.g. background from "Rango".
The Cameron/Pace invention cost many millions of dollars to realize, but as Geissler told me, “We know what we have done is not new. However, we’ve made this kind of advanced motion capture technology affordable for even smaller production and post production houses to use.”
While shooting “Avatar”, Cameron’s crew had to surround their green screen sets with very expensive camera sensors and link them all to massive central computers.
The Mo-Sys approach is brilliantly simpler.
First, you paste small, identical retroreflective star-shaped stickers all over the ceiling of the studio. The positions of these stars don’t even have to conform to any set pattern, and they are so small that they’re pretty hard for the eye to see.
A green screen set with the StarTracker VFX camera.
Then they are illuminated by a small LED-equipped sensor mounted on the studio camera, the key to the StarTracker system, which tracks that star map and reports the camera’s position within it to the rendering engine in real time.
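To make the idea concrete, here is a minimal sketch, not Mo-Sys’s actual algorithm, of how sightings of identified ceiling stars can pin down a camera’s position. It assumes a simplified case: the sensor points straight up, the ceiling height and focal length are known, and each observed star has already been matched to the map. (The real system solves a full six-degree-of-freedom pose; all names here are hypothetical.)

```python
def estimate_camera_xy(observations, star_map, ceiling_height, focal_px):
    """observations: {star_id: (u, v)} pixel offsets from the image center.
    star_map: {star_id: (x, y)} known star positions in metres.
    Returns the camera's (x, y) floor position, averaged over all stars."""
    xs, ys = [], []
    for star_id, (u, v) in observations.items():
        sx, sy = star_map[star_id]
        # Pinhole projection looking straight up: u = focal_px * (sx - cx) / h
        xs.append(sx - u * ceiling_height / focal_px)
        ys.append(sy - v * ceiling_height / focal_px)
    return sum(xs) / len(xs), sum(ys) / len(ys)

# Three stars under a 3 m ceiling, seen by an 800 px focal-length sensor.
star_map = {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (0.0, 1.0)}
# These pixel offsets correspond to a camera standing at (0.5, 0.5).
obs = {1: (-133.333, -133.333), 2: (133.333, -133.333), 3: (-133.333, 133.333)}
x, y = estimate_camera_xy(obs, star_map, ceiling_height=3.0, focal_px=800.0)
```

Because every frame is solved fresh against the fixed star map, errors don’t accumulate over time, which is why the system needs no re-calibration once the constellation is surveyed.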
This is one area where Geissler’s wizards are taking advantage of advances since the days of “Avatar” by using the vastly improved rendering power of the Unreal Engine developed by Epic Games.
In the past, little reflective balls were attached to the actors’ bodies so the computer could track them too and composite them into the scene.
One great advantage is that since the tracking sensor is pointed toward the ceiling, it does not get confused by moving objects, set changes, lighting configurations or reflections.
“At IBC in the Future Zone we showcased one step we took that goes further than what Cameron did,” Geissler continued. “Inside the actors’ suits we now position miniature MEMS (Microelectromechanical Systems) inertial sensors made by Xsens that send positioning information back to the computer sort of like tiny gyroscopes. This gives the rendering engine data that no longer has to be cleaned up, which cost so much time back in the Virtual Camera approach.”
The ultra small Xsens inertial sensor.
One reason that the StarTracker VFX system is catching on so widely is that once the constellation of star stickers has been established, you never need to re-calibrate anything. It just works when you need it and can be ignored when you don’t.
If you want to use the system outdoors, the stars can be positioned on a carpet that gets rolled out on location, and then the camera’s sensors simply look down.
Inside the StarTracker VFX toolbox are three units: the main camera, the Xsens suits with their mini sensors, and the viewfinder screen.
“The beauty is that all of these pieces reference the same star constellation,” Geissler said, “and since everything relates to each other that takes a lot of the ambiguity out of really advanced special effects.”
He also told me that at IBC for the first time attendees saw a StarTracker VFX system built right inside of a Grass Valley camera.
“At Mo-Sys we are working to move Virtual Reality production out of a niche industry into mainstream filmmaking,” Geissler finished up.
How the StarTracker VFX system works