Sony Introduces New OCELLUS Camera Tracking System

Sony Electronics is launching its first camera tracking system, OCELLUS (ASR-CT1), designed to simplify augmented reality (AR) and virtual production applications in broadcast and cinema by providing marker-free camera tracking through multiple sensors. OCELLUS is camera-agnostic and can be used with both cinema and broadcast cameras.

Well suited to virtual production applications such as In-Camera VFX and AR, OCELLUS sends camera position and orientation data while the camera is shooting. The system comprises a sensor unit, a processing box, and three lens encoders, and can be used with Sony Cinema Line cameras, system cameras, and non-Sony cameras.

Using its five image sensors and Sony's Visual SLAM (Simultaneous Localization and Mapping) technology, the system creates a reference map[1], enabling stable marker-free tracking both indoors and outdoors.

When using Sony cameras, metadata about focus, iris and zoom values from the camera and lens[2] can be obtained via the camera's SDI[3] output and transmitted in real time to external devices via an Ethernet cable[4]. If the lens does not support metadata acquisition through the camera, lens encoders can be attached to the lens to obtain this metadata. The acquired metadata can then be used for virtual production and AR.

The system also supports recording tracking data, camera/lens metadata, timecode and file name, which can be used in post-production workflows.

Sony continues to support creators in the Virtual Production and AR space, with tools ranging from acquisition devices to display screens such as the Crystal LED VERONA, as well as software-based solutions such as the Virtual Production Toolset.

Key Features

Compact and lightweight sensor unit with five image sensors: 

  • Four of the five image sensors on the sensor unit are selected for use, providing stable marker-free tracking and the high resistance to occlusion that is critical for operation.
  • If at least one image sensor in use captures valid feature points, tracking data can be extracted.
  • IR LEDs on both sides of each image sensor help tracking in low-light environments.
  • Visible Light Cut Unit included for stable tracking in environments with frequent lighting changes.
  • Sensor unit dimensions: approx. 86 mm × 60 mm × 43 mm (W × H × D) (3.39" × 2.36" × 1.69"), weight: approx. 250 g[5].
  • Easy installation and position adjustment using the NATO rail mounting parts (included).
  • Connects to the processing box via a single USB Type-C® cable with a lock mechanism and is powered by the processing box over the same USB Type-C® cable.

Processing Box:

  • Real-time transmission of tracking data and camera[2]/lens[3] metadata to CG rendering software such as Unreal Engine via Ethernet cable[6] in the free-d format (a hedged parsing sketch follows this list).
  • Equipped with Genlock input, Timecode input, SDI input/output terminals, and lens encoder connection terminals.
  • Supports recording tracking data and camera/lens metadata as FBX files on SDXC memory cards (UHS-II/UHS-I), synchronized with the main camera's video files.
  • OLED display for checking IP address, tracking information, lens data, and more.
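
The announcement names the free-d (FreeD) format for this Ethernet output but does not describe the packet layout. As a rough illustration only, the Python sketch below parses the commonly published FreeD "Type D1" pose message (pan/tilt/roll, position, zoom and focus); the UDP port and field scaling used here are assumptions and should be checked against Sony's documentation for OCELLUS.

```python
import socket

# Minimal sketch of receiving FreeD "Type D1" camera pose packets over UDP.
# Port number and scaling factors follow the commonly published FreeD
# conventions and are assumptions, not OCELLUS specifications.

FREED_D1_LENGTH = 29  # header, camera ID, pose, zoom/focus, spare, checksum

def _signed24(data: bytes) -> int:
    """Interpret 3 bytes as a big-endian signed 24-bit integer."""
    return int.from_bytes(data, "big", signed=True)

def parse_freed_d1(packet: bytes) -> dict:
    if len(packet) != FREED_D1_LENGTH or packet[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # Checksum: 0x40 minus the sum of all preceding bytes, modulo 256.
    if (0x40 - sum(packet[:28])) % 256 != packet[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": packet[1],
        "pan_deg":  _signed24(packet[2:5])  / 32768.0,
        "tilt_deg": _signed24(packet[5:8])  / 32768.0,
        "roll_deg": _signed24(packet[8:11]) / 32768.0,
        "x_mm": _signed24(packet[11:14]) / 64.0,
        "y_mm": _signed24(packet[14:17]) / 64.0,
        "z_mm": _signed24(packet[17:20]) / 64.0,
        "zoom_raw":  int.from_bytes(packet[20:23], "big"),
        "focus_raw": int.from_bytes(packet[23:26], "big"),
    }

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", 40000))  # assumed listening port
    while True:
        data, _ = sock.recvfrom(64)
        try:
            print(parse_freed_d1(data))
        except ValueError:
            continue
```

In practice a rendering engine such as Unreal Engine would consume this stream directly through its own FreeD/Live Link support; the sketch only shows what a D1 packet carries.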

Lens Encoder:

  • Detects precise rotation angles and positions of the lens focus, zoom, and iris rings.
  • Transmits detected data to the processing box via LEMO 7-pin cable.
  • Enables metadata acquisition for lenses and cameras that do not support lens data embedding on the SDI output (an illustrative count-to-value mapping sketch follows this list).
  • Includes five different types of gears for various lenses.
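
An encoder by itself only reports ring rotation; turning those raw readings into usable focus or zoom values normally requires a per-lens calibration table. The short Python sketch below is purely illustrative of that general approach: the table values, port of entry for the raw counts, and the linear interpolation are assumptions, not part of the OCELLUS specification.

```python
import bisect

# Hypothetical calibration table mapping raw encoder counts to focus
# distance in metres; in practice it would be measured once per lens.
FOCUS_CALIBRATION = [
    (0,     0.45),
    (1200,  1.0),
    (2600,  2.0),
    (4100,  5.0),
    (6000,  10.0),
]

def encoder_to_focus(raw: int) -> float:
    """Linearly interpolate a raw encoder count into a focus distance."""
    counts = [c for c, _ in FOCUS_CALIBRATION]
    raw = max(counts[0], min(raw, counts[-1]))      # clamp to calibrated range
    i = bisect.bisect_right(counts, raw) - 1
    if i >= len(FOCUS_CALIBRATION) - 1:
        return FOCUS_CALIBRATION[-1][1]
    (c0, f0), (c1, f1) = FOCUS_CALIBRATION[i], FOCUS_CALIBRATION[i + 1]
    t = (raw - c0) / (c1 - c0)
    return f0 + t * (f1 - f0)

print(encoder_to_focus(1900))  # roughly 1.5 m with the sample table above
```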

[1] There are limitations to the size of the map.
[2] Supports Cooke /i lenses, B4 lenses, and E-mount lenses.
[3] The camera must support metadata embedding on SDI output.
[4] Compatible with 1000BASE-T, 100BASE-TX, 10BASE-T.
[5] Please note that the final specifications may differ.
[6] Compatible with 1000BASE-T, 100BASE-TX, 10BASE-T.
