Mo-Sys Partner With VividQ For Next-Generation AR Displays

Mo-Sys has teamed up with VividQ, a pioneer in computer-generated holography, to develop next-generation augmented reality (AR) displays.

The combination allows 3D holographic projections to be placed precisely in real space, enabling users of future AR devices, such as smart glasses, to explore virtual content in context with their natural environment.

Mo-Sys StarTracker is a proven and powerful camera tracking technology, widely used in television production and other creative environments for applications from virtual studios to realtime set extensions. It provides the camera's precise position in XYZ space together with its full rotation.

VividQ software for computer-generated holography is used in innovative display applications, from AR wearables to head-up displays. Holography - the holy grail of display technologies - relies on high-performance computation of complex light patterns to project realistic objects and scenes, for example in AR devices. VividQ generates holographic projections which, thanks to the precision location data from Mo-Sys, can be displayed to the user at the correct place in the real environment. This is a major advance on today's AR devices, where flat (stereoscopic) objects are mismatched with the real world. By presenting holographic projections with depth, the user's eyes can focus naturally as they scan the scene.
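To make the idea concrete: anchoring a virtual object in real space amounts to transforming a fixed world-space point into the tracked camera's coordinate frame each frame. The sketch below is a hypothetical, simplified illustration of that geometry (it is not Mo-Sys or VividQ code; the function name and 6-DoF pose representation are assumptions for illustration).

```python
import numpy as np

def world_to_camera(point_world, cam_pos, cam_rot):
    """Transform a world-space point into the camera's frame.

    cam_pos: camera position in world XYZ (as reported by tracking).
    cam_rot: 3x3 rotation matrix mapping camera axes into world axes;
             its transpose maps world coordinates into the camera frame.
    """
    return cam_rot.T @ (point_world - cam_pos)

# Hypothetical example: camera at (1, 0, 0), rotated 90 degrees
# about the vertical (Y) axis.
theta = np.pi / 2
cam_rot = np.array([
    [np.cos(theta), 0.0, np.sin(theta)],
    [0.0,           1.0, 0.0],
    [-np.sin(theta), 0.0, np.cos(theta)],
])
cam_pos = np.array([1.0, 0.0, 0.0])

# A hologram anchored at a fixed point in the room.
hologram_anchor = np.array([1.0, 0.0, 2.0])

print(world_to_camera(hologram_anchor, cam_pos, cam_rot))  # → [-2. 0. 0.]
```

Because the anchor point is fixed in world space, the hologram appears to stay put in the room as the tracked camera (or headset) moves, which is what allows the user's view of virtual content to stay consistent with the real environment.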

“The possibilities and applications of augmented reality in realtime devices are only just being explored,” said Michael Geissler, CEO of Mo-Sys Engineering. “We are at the cutting edge of camera tracking; VividQ is at the cutting edge of computer-generated holography, and we are excited to work together to bring some of these concepts to reality.”

Darran Milne, CEO of VividQ, added: "Our partnership with Mo-Sys is key to understanding the potential of computer-generated holography in future AR applications, developing experiences where virtual objects can blend seamlessly into the real world."
