Data drives on-air sports analysis

Advances in processing power and high-performance graphics cards are affording broadcasters more possibilities than ever to devise stunning virtual sets and on-air graphics, but it is the power of data that is driving most innovation. Nowhere is this more evident than in live sports. Ratings skyrocket during major league finals or international sporting extravaganzas, increasing viewer loyalty. The graphic look of the production has become a major factor in a station’s identity and brand, and sports graphics a vital component of the televised coverage.

Going beyond merely displaying the score and the game’s clock, today's live sports graphics are all about delivering relevant and accurate data, fast.

“In broadcast the two most valuable properties are sports and news, which by their nature are live, and live content is king,” says ChyronHego president and CEO Johan Apel. “Live sports attracts very large audiences, driving its commercial value, and that in turn compels broadcasters to innovate technology. Whether that's HD, 3D or 4K and virtual graphics, new technology has always been important to reaching audiences with the best possible televised experience.”

Realtime graphics linked with statistics collection systems help commentators and on-air talent to better analyse the game, and similar data sets can be used to help viewers visualise the action or by the teams themselves in training and tactics.

“The gathering of professional sports data is increasing all the time, in the same way that all around us there are more and more sensors coming on stream collecting information about how we run our lives,” says Apel. “Having large amounts of data on its own is not valuable necessarily. You need to enhance the data with information. Data needs to be processed, curated or analysed in some form to become relevant to the delivery channel – whether that is a website, to coaches or broadcast.”

A player tracking camera system

ChyronHego's optical tracking solution is deployed at German and English football stadia. The next step is to incorporate not just positional tracking but also tracking of the exact facial position, and then the arm and leg movements, of individual players.

To shortcut the R&D, in April this year the company acquired ZXY, a Norwegian developer of cutting-edge transponder technology. ZXY's tech detects the exact position of all objects within a stadium or training facility in realtime. Players wear a data chip that transmits positions and other essential data such as heart rates, speeds and the impact from collisions. The system has been used in soccer, American football and ice hockey, and is now being adapted for other sports and integrated into ChyronHego's own sports tracking product, TRACAB.

This system uses image processing technology to identify the position and speed of moving objects within arena-based sports. The X, Y and Z coordinates are supplied 25 times every second for each viewable object (e.g. players, referees or the ball).
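For illustration, here is a minimal Python sketch of how such a 25 Hz positional feed could be consumed to derive a speed reading. The frame format, coordinate convention and object identifiers are assumptions made for this example, not TRACAB's actual interface.

```python
from dataclasses import dataclass
from math import sqrt

FRAME_RATE_HZ = 25          # positions are supplied 25 times every second
DT = 1.0 / FRAME_RATE_HZ    # time between consecutive samples, in seconds


@dataclass
class TrackedObject:
    """One viewable object (player, referee or ball) in a single frame."""
    object_id: str
    x: float  # hypothetical pitch coordinates, in metres
    y: float
    z: float


def speed_m_per_s(prev: TrackedObject, curr: TrackedObject) -> float:
    """Estimate instantaneous speed from two consecutive 25 Hz samples."""
    dx, dy, dz = curr.x - prev.x, curr.y - prev.y, curr.z - prev.z
    return sqrt(dx * dx + dy * dy + dz * dz) / DT


# Example: a player moving 0.3 m between frames is running at 7.5 m/s
prev = TrackedObject("player_10", 12.0, 30.0, 0.0)
curr = TrackedObject("player_10", 12.3, 30.0, 0.0)
print(f"{speed_m_per_s(prev, curr):.1f} m/s")
```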

ChyronHego also teamed with TrackMan, a provider of radar technology, to deliver baseball player and ball tracking technology. This non-invasive offering provides data for player evaluation, coaching and fan experience analysis.

“Big Data trends driven by cloud and mobility are creating a new style of sports analysis that is transforming what coaches and fans expect and need from sports technology,” says Apel.

Data is also key to innovation at Norwegian-headquartered Vizrt. Its image-based camera tracking solution, Viz Arena, makes it possible to apply graphics on the field without mechanical tracking heads. Data-driven graphics, sponsor branding and analysis content can be added live during the match. The software can be integrated with sports data feeds provided by deltatre, STATS or Opta to visualise live player tracking, starting grids, team badges, statistical data, heat maps and so on. For superimposing graphics onto unknown terrain, as in golf, Viz Arena also integrates with laser measurement devices.
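As a rough illustration of one of those visualisations, the sketch below bins tracked (x, y) positions into a simple heat map grid. The pitch dimensions, grid resolution and input format are assumptions for this example, not how Viz Arena or the named data providers actually expose their feeds.

```python
import numpy as np

# Hypothetical pitch dimensions in metres and grid resolution
PITCH_LENGTH, PITCH_WIDTH = 105.0, 68.0
GRID_X, GRID_Y = 30, 20


def heat_map(positions: np.ndarray) -> np.ndarray:
    """Bin a player's (x, y) samples into a GRID_X x GRID_Y occupancy grid.

    `positions` is an (N, 2) array of pitch coordinates, e.g. accumulated
    from a 25 Hz tracking feed over the course of a match.
    """
    counts, _, _ = np.histogram2d(
        positions[:, 0], positions[:, 1],
        bins=[GRID_X, GRID_Y],
        range=[[0, PITCH_LENGTH], [0, PITCH_WIDTH]],
    )
    return counts / counts.sum()  # normalise so the cells sum to 1


# Example: 10,000 synthetic samples biased towards one wing of the pitch
rng = np.random.default_rng(0)
samples = np.column_stack([
    rng.normal(80, 10, 10_000).clip(0, PITCH_LENGTH),
    rng.normal(15, 8, 10_000).clip(0, PITCH_WIDTH),
])
print(heat_map(samples).round(3))
```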

Contextual awareness is next. This is the idea that any audio or video element of the captured scene can be treated as an individual asset or object, to be reconstituted according to the screen or environment with which the viewer is interacting. It would enable new forms of automation and of final presentation of the media.

BBC R&D is exploring this in a series of experiments it dubs Perceptive Media, based on end-to-end IP networks.

“We can make content more responsive to the reception device,” says Phil Tudor, BBC R&D principal technician. “If you’ve a large screen you might have graphics, tickers or captions in different locations, but if you’re watching on a very small screen you may want those overlays to be rearranged or even not to appear at all. You could broadcast just the foreground or background objects if desired. The aim is to adapt the media without the viewer explicitly having to interact with it.”
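A minimal sketch of the idea Tudor describes might treat each overlay as an independent object selected and positioned per reception device. The overlay names, screen-width breakpoints and layout hints below are hypothetical, not BBC R&D's implementation.

```python
from dataclasses import dataclass


@dataclass
class Overlay:
    """A graphics object composited independently of the underlying video."""
    name: str
    min_screen_width_px: int   # hypothetical rule: hide below this width
    position_large: str        # layout hint for big screens
    position_small: str        # layout hint for small screens


OVERLAYS = [
    Overlay("score_bug", 0, "top-left", "top-centre"),
    Overlay("news_ticker", 720, "bottom", "bottom"),
    Overlay("stats_panel", 1280, "right", "bottom"),
]


def compose(screen_width_px: int) -> list[tuple[str, str]]:
    """Pick and place overlays for the device, with no viewer interaction."""
    layout = []
    for o in OVERLAYS:
        if screen_width_px < o.min_screen_width_px:
            continue  # e.g. drop the stats panel entirely on a phone
        pos = o.position_large if screen_width_px >= 1280 else o.position_small
        layout.append((o.name, pos))
    return layout


print(compose(375))    # phone: only the score bug, moved to top-centre
print(compose(1920))   # TV: score bug, ticker and stats panel in full layout
```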
