Left to right: Kuniaki Takahashi (translator), Masao Eriguchi, Mauricio Aracena, Jay Turcot, and Marcus Kuehne.
The Entertainment Enablement session at SMPTE 2018 presented a wide spectrum of technologies that could enhance time spent in the driverless cars of the future.
The stage of the Entertainment Enablement session, held at 4:14 during the Symposium on the Monday before the SMPTE 2018 Annual Technical Conference & Exhibition, was filled with a panel of visionaries led by session chair Michael DeValue. They had all peered into the future of driverless cars and found a world of promise, prosperity, and maybe a bit of whimsy.
Masao Eriguchi, deputy general manager of the product planning department at Sony's AI Robotics Business Group, led off by telling the audience that the car of the future could be thought of as an iPhone on wheels with passenger seats.
He went on to describe the SC-1, a concept car built by Sony that, believe it or not, actually exists.
It’s basically a driverless golf cart with five 4K screens in place of windows, facing out as well as in. The driver watches the road ahead via high-resolution video, and if there is danger ahead, the car can warn the driver by putting images of explosions in their path.
In addition, if a young girl walks by the side of the cart, the side screen will show her advertising that the autonomous circuitry thinks may be of interest to her.
The cart can drive itself, park itself, and basically operate without human interference.
The audience greeted the SC-1 with a bit of laughter.
Then Mauricio Aracena, standardization manager of media at Ericsson, discussed how the impending 5G networks are going to expand the possibilities of entertainment inside driverless cars. This could be combined with satellite media delivery or other digital inputs, since the 5G signal is robust enough to be sliced into multiple purposes.
“We may have different parts of the spectrum for different uses,” Aracena said. “5G is not just about high quality broadcasting, it is actually a whole new platform.”
Jay Turcot, director of applied AI at Affectiva, got emotional over the subject. That is, he speculated that the car of the future would be able to sense the emotions of its passengers with biometric sensors.
"Emotions really affect every aspect of our lives,” he said. “Yet when we think of AI, we think of it as sterile. But what if technology could identify your emotional state? That is what Affectiva has been dealing with. We call it ‘automotive AI’”.
Turcot called it the first multi-modal in-cabin sensing AI that identifies, from face and voice, complex and nuanced emotional and cognitive states of drivers and passengers.
If passengers are angry, the car can sense it. If they are sleepy, the car can ask if it should take over the wheel. Turcot painted a not-so-brave new world for the emotionally untethered driverless car.
Yet Affectiva’s emotion-sensing technology has been on the market since 2011 and is already being used by 1,400 brands to evaluate customer responses to their products.
Finally, Marcus Kuehne, strategy lead for immersive technologies at Audi AG, took the stage to talk about the entertainment possibilities we could enjoy in our autonomous cars. “In the western hemisphere, every driver spends approximately one hour in the car,” he said. “We need to consider how this can be turned into entertainment. And not just movies or music. Consider how large the gaming industry has become and you can see a whole new way we can spend our time while something else drives our car.”
We’ve come a long way, Kuehne reminded us. Not long ago, we were content to have a mono radio in our cars. Then came tape, and maybe a sophisticated sound system.
“Now some cars have connectivity that can rival a home theater,” he said. “There is voice recognition, heads up displays, and a lot more to come.”
He referred to this as the first steps toward AI.
“In 10 to 20 years we will have autonomous cars with a high potential for immersive entertainment experiences,” Kuehne said. “Panasonic and VW are already moving toward it. What will come next is only limited by our imagination. After all, we have at least an hour a day to invest in it.”
After the panel, I asked Kuehne if he could expand on how long this is all going to take for the readers of the SMPTE newsletter and The Broadcast Bridge.
“Today drivers are focused on driving,” he said, “but imagine what else we could do with that time. It all depends on big players like Google and Facebook coming into the field with immersive technologies. But they will have to be able to compete with the revenue chain that is today occupied by smart devices like cell phones or iPads.”