AI was a big topic at the recent NAB Show.
AI is much more than just a passing buzzword; it will be a crucial driver of media technology spending in 2018 and beyond as companies seek to further automate their operations and build direct relationships with consumers – as the recent string of acquisitions demonstrates. According to IABM data, most technology users plan to deploy AI in content management, distribution and delivery. They will continue to invest in AI during 2018 to become more efficient and better understand their customers, driving loyalty and revenues.
In part two of this article, first published in the Journal of the IABM, a number of IABM members tell us how they are currently deploying AI in their product and service offerings and the benefits this is delivering to their customers. They also look forward to how AI will play an increasing role in the broadcast and media industry over the coming years. From the responses we received, AI is being brought to bear on practically every aspect of the media workflow already, and it’s set to go wider and deeper with every passing day.
Continued from part 1.
Across the board
Ooyala is looking to leverage AI across the entire media chain. Belsasar Lepe, CTO at Ooyala, says, “Media companies today are under tremendous pressure to meet the demands of a 24/7 always-on global audience. Production teams in particular are having to create more content, faster, with the same budget and resources. By introducing automation and AI into media workflows these companies can reduce manual tasks, eliminate bottlenecks and maximize the speed and efficiency with which media assets are brought to market.
“The Flex Media Platform automates every aspect of media workflows across the production lifecycle, from asset ingestion to review & approval, to monitoring and distribution. We see AI as an important evolution to each of these workflows moving forward as represented by our exciting integration with Microsoft Cognitive Services for enhanced metadata capture. With mature AI technologies like speech, face, object and text recognition, key metadata information can be captured in real-time and automatically enriched to maximize asset management, search, personalization and monetization,” Lepe continues.
“Specific to our integration of AI for metadata capture and processing, having access to enhanced metadata faster can lead to a number of invaluable benefits. Not only will internal workflow processes run much faster and more smoothly, but the user experience can be greatly enhanced as well. Personalization is often a good example where the more data you have about an asset, the more you can cater to a user’s preferences and provide a truly unique experience. Another exciting example presents itself around live events where an AI program can immediately identify a key datapoint and react instantly. For example, if a movie star is shown in the crowd at a baseball game, AI can pull up that star’s most popular movie clip, check it for licensing and prep it for air… all within seconds,” Lepe concludes.
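The enrichment step Lepe describes can be sketched in a few lines: recognition services return labels with confidence scores, and only the trustworthy ones are merged into the asset's metadata. This is a minimal illustration, not Ooyala's or Microsoft's actual API; the data structures and threshold are assumptions.

```python
# Minimal sketch of metadata enrichment from recognition results.
# The asset structure, label format and threshold are illustrative
# assumptions, not any vendor's real API.

CONFIDENCE_THRESHOLD = 0.8  # keep only high-confidence labels

def enrich_metadata(asset, labels):
    """Merge high-confidence recognition labels into an asset's tags.

    asset  -- dict with a 'tags' set
    labels -- list of (tag, confidence) pairs from speech/face/object/text
              recognition services
    """
    for tag, confidence in labels:
        if confidence >= CONFIDENCE_THRESHOLD:
            asset["tags"].add(tag)
    return asset

asset = {"id": "game-0427", "tags": {"baseball"}}
labels = [("crowd shot", 0.92), ("movie star", 0.95), ("hot dog", 0.41)]
enrich_metadata(asset, labels)
```

Richer tags then feed directly into the search, personalization and licensing checks described above.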
Today, AI is being mostly exploited in discrete operational processes, but, according to many of the correspondents in this article, it will grow to encompass the entire media supply chain, from camera to consumer. “The potential is huge,” says Interra Systems’ Anupama Anantharaman. “AI techniques at each stage can improve ‘how’ and ‘what’ gets done in the next or the previous stage. For example, using end-user usage data sets, an operator can determine that viewers are dropping off from watching a particular piece of content. If the reason is due to a transcode issue, then the operator can automatically fine-tune the transcoder, using the AI-generated feedback to remove such artifacts and boost viewership.” Tedial’s Jay Batista agrees: “AI will provide automated cameras to follow actors, manage basic editing decision making, and enable multiple “media” feeds for linear, non-linear and social media platforms. The end result will be a percentage of labour saved. The IABM puts that estimate at 30% reduction or more, by 2020.”
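The feedback loop Anantharaman describes — viewer drop-off traced to a transcode issue, triggering an automatic adjustment — can be sketched as below. The thresholds and the 20% bitrate bump are illustrative assumptions, not a real transcoder's controls.

```python
# Hedged sketch of an AI-driven transcoder feedback loop: when viewer
# drop-off coincides with a high artifact score from QC, raise the
# bitrate. All limits and the 1.2x adjustment are made-up illustrations.

DROP_OFF_LIMIT = 0.25   # fraction of viewers abandoning the stream
ARTIFACT_LIMIT = 0.5    # QC score above which artifacts are likely

def tune_transcoder(drop_off_rate, artifact_score, settings):
    """Return adjusted settings when drop-off is traced to artifacts."""
    if drop_off_rate > DROP_OFF_LIMIT and artifact_score > ARTIFACT_LIMIT:
        # Drop-off correlates with poor picture quality: raise bitrate.
        return dict(settings, bitrate_kbps=int(settings["bitrate_kbps"] * 1.2))
    return settings  # leave settings alone otherwise

before = {"bitrate_kbps": 4000, "preset": "fast"}
after = tune_transcoder(drop_off_rate=0.4, artifact_score=0.7, settings=before)
```

In practice such a loop would sit between an analytics pipeline and a QC system; the point is simply that end-user data closes the loop back onto an upstream stage.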
“We are only beginning to scratch the surface of what is possible with ML/AI,” says Brightcove’s Matt Smith. “As implementations of the technology continue to roll out, we will see multiple datasets being intermingled and leveraged to harness the power of ML to drive delivery and placement. For example, ML’s ability to analyze a video frame-by-frame and determine a particular element (automobile, branded coffee cup or soda can) in frame will then drive what ads are dynamically inserted into the stream, or inform a display ad platform to place a static ad on the page hosting the player to play an ad from that same company.”
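The object-to-ad matching Smith describes reduces to a lookup from detected in-frame elements to ad campaigns. The campaign table and labels below are made-up illustrations, not Brightcove's platform.

```python
# Hedged sketch of object-driven ad selection: a frame-level detection
# (e.g. "soda can") picks the campaign to insert. The campaign table
# and the fallback house ad are illustrative assumptions.

AD_CAMPAIGNS = {
    "soda can": "acme-cola-30s",
    "automobile": "zoomcar-suv-15s",
    "coffee cup": "beanco-preroll",
}

def pick_ad(detected_objects):
    """Return the campaign for the first recognized object, else a house ad."""
    for obj in detected_objects:
        if obj in AD_CAMPAIGNS:
            return AD_CAMPAIGNS[obj]
    return "house-ad"
```

The same mapping could equally inform a display ad platform serving a static ad on the page hosting the player.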
IPV’s Nigel Booth shares this enthusiasm, but sounds a note of caution – based on real-life experience – that we have a way to go yet. “AI technologies will start to be used across multiple areas of the supply chain, from helping inform and steer content production right the way through to distribution. While this is exciting, it should be understood that these technologies are currently at a stage where they can give false results. At IPV we have run a number of projects which involve aggregating data from multiple datasets to improve the accuracy of AI-led results. For example, when an AI-led object recognition engine identifies an item in shot, it may or may not be correct. By combining this with data from the audio track, however, the engine can tell whether the audio also contains a mention of the item, helping it to improve accuracy.”
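Booth's cross-dataset approach — trusting a visual detection more when the audio also mentions the item — can be sketched as a simple confidence adjustment. The boost and penalty values are illustrative assumptions, not IPV's actual method.

```python
# Hedged sketch of audio/visual fusion: an object-recognition confidence
# is boosted when the audio transcript mentions the item, and reduced
# when it does not. The 0.15/0.10 adjustments are made-up illustrations.

def fused_confidence(visual_conf, transcript, item, boost=0.15, penalty=0.10):
    """Adjust a visual detection confidence using the audio transcript."""
    if item.lower() in transcript.lower():
        return min(1.0, visual_conf + boost)   # corroborated by audio
    return max(0.0, visual_conf - penalty)     # visual signal stands alone
```

A downstream threshold on the fused score would then decide whether the tag is kept, reducing the false results Booth warns about.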
To put AI into a broader perspective, we asked our correspondents to look ahead five years and speculate on the role AI will be playing in broadcast and media – disruptor or just another passing phase?
Wazee Digital’s Andy Hurt sees metadata underpinning the industry and AI as the means to exploit it: “Much like the cloud has become a major disruptor in M&E, I believe that AI is the next big wave of technology that can further optimize the legacy broadcast workflow. At the end of the day, the curation of metadata has become almost as important as the content itself — some have argued even more important. Without proper metadata schemas and standards, the entire “lens to lens” workflow becomes cumbersome and can lead to utter failure. I see AI as a driving force in the evolution of metadata creation.”
Imagine Communications’ Glodina Lostanlen also sees data and metadata as key going forward: “As AI capabilities increase and more and more visual information is turned into data points, i.e., metadata, the impact of AI on the broadcast industry will only expand – permeating nearly every workflow.”
For Conviva, it’s all about data too. “As the business of TV migrates from traditional broadcast and pay TV models to pure internet OTT delivery, the measurement of TV will also migrate from a panel-based approach to a true census-based approach,” says Lindsey O’Shea. “The massive scale of data produced from this census requires artificial intelligence to both process and make sense of the data in real-time; AI will be a fundamental technology underpinning this. The AI models will evolve from today’s mostly detection and diagnoses-based approaches to more predictive models that will help avoid problems and additional expense associated with delivering video on the internet before they happen.”
Brightcove’s Matt Smith sees AI/ML as “a considerably large and important element in the video experiences of the future. We will look back and wonder what we did before the advent of machine learning, because it will have added scale, enabled the processing of massive amounts of data and created massive shifts in how video is processed and delivered. In short, ML/AI will make video processing, delivery and monetization ‘smarter’ than it ever has been.”
EVS sees AI as fundamental to its customers’ future. “EVS isn’t engaging in developing AI for the sake of it. It has a long-term vision that aims to support our customers in making more, higher-quality personalized content. We also make sure there’s a business need that won’t go away. It’s with that in mind that EVS has identified AI as a major building block to realize this long-term vision,” says EVS’ Johan Vounckx.
Ooyala also envisions AI permeating every aspect of our lives. “Similar to the Industrial Revolution, which represented a massive overhaul of manufacturing processes over decades and across centuries, the AI Revolution will usher in new technologies over an extended period of time until it becomes commonplace as the backbone of our global society. The M&E industry is, so far, on the forefront of the proliferation of AI technology, no doubt due to the amount of ad revenue and production spends that stand to be optimized, which is why Ooyala is excited to count ourselves as early innovators in this bold new world!” Belsasar Lepe, Ooyala CTO, concludes.