Be honest with yourself: when Steve Jobs first introduced the iPhone in 2007, did you truly understand what a seminal moment that was? No, me neither, and I’m a tech analyst.
How about when Steve announced the launch of the App Store the following year? Yep, I’m zero for two as well.
Together, the iPhone and App Store revolutionized pretty much everything in the tech industry, and I’m not convinced even Steve knew how much his vision would alter the nature of business and commerce. What we witnessed was software developers rapidly coming to understand the possibilities of the technology and then letting their imagination, creativity and innovation run wild.
We can broadly summarize the development of apps into three phases:
Now ask yourself this: Amidst the mania of self-driving, did you dismiss the role of eye-gaze tracking and driver monitoring systems (DMS) in cars? Did you overlook DMS tech, just as you might at first have overlooked the iPhone and App Store? With Level 4 AVs on the home stretch and all those CES keynotes telling us about TOPS (tera operations per second), AI and neural nets, did you think “why even bother with DMS?”
Then read on and I will take you through some developments pointing towards a trend that’s so easy to see, you’ll wonder why you never saw it coming: Eye-gaze controlled automotive infotainment.
Automotive immersive experience
You may not know the name Nuance, but you will almost certainly have heard of its Dragon speech recognition software. You are even less likely to know the name Cerence, which was spun off from Nuance in October 2019 and is working on some very interesting automotive technology:
Cerence delivers immersive experiences that make people feel happier, safer, more informed, and more entertained in their cars. Bringing together voice, touch, gesture, emotion, and gaze innovations, it creates deeper connections between drivers, their cars and the digital world around them. Cerence currently powers A.I. in more than 280 million cars on the road globally across more than 70 languages and for nearly every major automaker in the world.
Let’s watch this video to see what that immersive experience looks like.
Cerence describes “multimodal interaction combining gaze detection and speech”, which aligns with my research suggesting that natural voice, gesture recognition and eye-gaze control will come together in the 2021 BMW i4.
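To make the “gaze plus speech” idea concrete, here is a minimal toy sketch, entirely my own illustration and not Cerence’s or BMW’s actual implementation, of how a spoken deictic query (“What is that building?”) could be resolved against the driver’s current gaze direction. All names, angles and scene objects below are hypothetical:

```python
# Toy sketch of multimodal gaze-plus-speech fusion (illustrative only;
# not a real Cerence/BMW API). The driver looks at a landmark and asks
# "What is that building?" -- gaze supplies the referent for "that".

from dataclasses import dataclass

@dataclass
class GazeSample:
    yaw_deg: float    # horizontal gaze angle relative to vehicle heading
    pitch_deg: float  # vertical gaze angle

# Hypothetical scene objects with their (yaw, pitch) from the driver's view
SCENE = {
    "opera house": (-20.0, 5.0),
    "petrol station": (15.0, -2.0),
}

def resolve_gaze_target(gaze: GazeSample, tolerance_deg: float = 10.0):
    """Return the scene object closest to the gaze ray, if within tolerance."""
    best, best_err = None, tolerance_deg
    for name, (yaw, pitch) in SCENE.items():
        err = max(abs(gaze.yaw_deg - yaw), abs(gaze.pitch_deg - pitch))
        if err < best_err:
            best, best_err = name, err
    return best

def handle_utterance(utterance: str, gaze: GazeSample) -> str:
    """Fuse a spoken query with the current gaze target."""
    if "that" in utterance:  # deictic reference -> consult gaze
        target = resolve_gaze_target(gaze)
        if target:
            return f"You are looking at the {target}."
    return "Sorry, I am not sure what you mean."

print(handle_utterance("What is that building?", GazeSample(-18.0, 4.0)))
# prints: You are looking at the opera house.
```

In a real system, the gaze ray would come from the DMS camera and the scene objects from a map or perception stack; the point is simply that gaze supplies the missing referent for words like “that”, which speech alone cannot resolve.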
From its time as part of Nuance, Cerence has a history of working with BMW, while Seeing Machines exhibited with BMW at CES in January on the i Interaction EASE demo, so we can see who the most likely technology providers are for the speech recognition and gaze detection in the i4.
This video provides more details of the CES demo, and the technology is evidently much closer to series-production than BMW was admitting to at the start of the year. (We have this set to view the pertinent segment starting about 6:20 in.)
Eye-gaze controlled Android Automotive OS?
With almost everyone focused on the development of AI for self-driving, few noticed its suitability for enhancing the driving experience in human-driven vehicles. Not so Qualcomm, which, under the noses of Nvidia and Intel/Mobileye, extended its collaboration with Google into the automotive sector for the development of next-generation in-vehicle infotainment systems.
The first vehicle featuring Google’s Android Automotive OS, the Polestar 2, only rolled off the production line earlier this year. However, Google has already announced partnerships with other automakers for Android Automotive OS, including FCA, GM, Groupe PSA, Renault-Nissan-Mitsubishi Alliance, and Volvo. Tier 1 suppliers already signed up include Aptiv, Harman (a Samsung company) and Panasonic.
In August, Qualcomm announced a partnership with Veoneer for ADAS that encompasses eye tracking technology provided by Seeing Machines. Qualcomm and Seeing Machines have also recently announced further details of their partnership in the form of a development kit for infotainment systems.
Putting that information together with the demonstrations from BMW and Cerence, I conclude that Qualcomm and Google are developing a future generation of Android Automotive OS that integrates eye-gaze control using technology provided by Seeing Machines, possibly for series-production starting in 2023. We can see more of how the basic functionality might work in this video.
Cerence lists its automotive customers as “Audi, BMW, Daimler, Ford, Geely, GM, SAIC, Toyota, and many more”, a list covering many of the major automakers that have not already announced plans to adopt Android Automotive OS. Therefore, in the years ahead it looks highly likely that every major automaker will introduce eye-gaze controlled user interfaces to some or all of their vehicles.
Just like the App Store, the benefits of eye-gaze tracking in automotive may not have been immediately apparent to many. While Intel and Nvidia were focused on developing high-performance processors and AI for a self-driving future, Cerence, Google, Qualcomm and Seeing Machines appear to have been busy developing technology that will revolutionize the in-cabin experience and substantially reduce the distraction from smartphones.
I’m left wondering if Alphabet — parent company of Waymo and Google — isn’t just about the smartest company out there. It appears to have hedged the outcome of the self-driving revolution across Waymo and Android Automotive OS and in so doing could have outmaneuvered its fiercest rivals at Apple, Amazon and Microsoft with a strategy no one saw coming.
But Qualcomm, which has for so long lived in the shadow of Intel and Nvidia in automotive processors, is perhaps the company that will laugh loudest and longest. With one partnership with Veoneer in ADAS and another with Google in infotainment, Qualcomm shows once more that the company making the least noise is often the one making the most progress.