Let's name Apple's upcoming automotive-grade applications processor the “C1.” What follows is a suggested design template for the C1, a tier-one pick for the iCar's digital cockpit, and the clues CES 2021 has left us with…
In this article I shall join the dots to suggest why Apple will develop an automotive-grade applications processor, which I have named the “C1.” I am going to suggest a design template for the C1, propose a tier one to develop the iCar’s digital cockpit and explain why CES 2021 just showed us that eye-gaze tracking will be a must-have feature in cars by the end of the decade.
To be clear, this is an opinion piece and I have no evidence that Apple is actively developing an iCar or the C1, or is working with some or any of the companies mentioned in this article. Speculation can only take us so far, but let us start with what we know from elsewhere in the industry.
Google partnered with Qualcomm five years ago to bring the Android operating system to the car. Qualcomm has also partnered with Seeing Machines to integrate automotive-grade eye-gaze tracking into its digital cockpit solutions running on the Snapdragon automotive platform.
I’m reasonably certain that Qualcomm has also partnered with Amazon to develop Alexa Auto, while Cerence is set to launch “Look,” combining natural language processing, gesture recognition and eye-gaze tracking, in the BMW iX and Mercedes-Benz EQS later this year.
That’s Amazon, Cerence, Google and Qualcomm. We have just learned that Nvidia worked with Mercedes-Benz on MBUX Hyperscreen and last year it was announced that Nvidia partnered with Hyundai for infotainment. To this I would add speculation that Microsoft either has plans to release Cortana Automotive, or has plans to buy Cerence.
Why the surge in interest from the tech industry? Because as CES 2021 just revealed, in-cabin AI is suddenly the hottest trend in automotive, with the interface between driver and vehicle set to undergo a revolutionary change by mid-decade. Out go navigation systems impossible to navigate, with their click-wheels, haptic controls and multi-layer menus. In comes the immersive user experience powered by eye-gaze tracking, voice assistants and 5G cloud connectivity.
This trend isn’t hard to see, you just have to open your eyes to eye tracking. With so much innovation going on, do you seriously believe Apple will just meekly stand back and watch its biggest technology rivals dominate the automotive experience? C’mon, man.
Why re-invent the wheel?
Apple is joining the automotive party because it has to, and if Apple is going to develop an automotive-grade C1 processor, it will require foundry partners with automotive-qualified processes. According to this report from AnandTech dated October 2019, both Samsung and TSMC have achieved automotive qualification. Specifically, for TSMC:
Looking towards the near future, TSMC has been developing an automotive-grade version of its N7 (1st Generation 7nm) technology for quite a while, and expects it to be qualified by 2020.
Samsung has developed the Exynos Auto V9 SoC on its 8nm process, so I will speculate that the Apple C1 will be fabricated using TSMC’s 7nm automotive-grade process. I don’t think we have to try very hard to find a suitable design template, since I’m going to propose that the C1 might look a lot like the Apple A12 SoC, already fabricated on a 7nm process.
If Tesla can design an entire Full Self-Driving (FSD) chip, I think Apple can automotive-qualify an existing applications processor without any major challenges. According to WikiChip, the Tesla FSD chip has 6 billion transistors, a power consumption of 36W, and is manufactured by Samsung on a 14nm process. In comparison, the Apple A12 has 6.9 billion transistors and a power consumption of about 3.5W. I’ll leave the debate over comparisons and specifications to the experts on Twitter.
I’m certain Apple would make some tweaks, changes and additions to keep us all guessing, but if the iCar is to enter production in 2024, then a lightly modified variant of the A12 looks like a great starting point for the C1. After all, why re-invent the wheel?
Although I don’t think Samsung will be involved in the design or manufacture of the C1 processor, I do think it will be fundamental to the development of the iCar’s digital interior. My top pick for tier one is Harman, which since 2016 has been a Samsung company. We can see more of its vision for the digital cockpit, and the emergence of in-cabin AI, here:
Harman is not exactly my top pick for tier one. It is my only pick. Recall the buzz and excitement the instant you first saw the iPhone? If Tim Cook wants to recreate that “Wow” moment at the iCar reveal, his first, last, and only call is to Harman.
Eye-gaze tracking and human factors
As reported by Junko Yoshida, CES 2021 has clarified an emerging theme among automakers: first comes in-cabin AI, followed by software-defined cars. It is the coming together of these two trends that will not only revolutionize the interface between car and driver, but also change the focus of the entire automotive and tech industries from self-driving to immersive user experiences.
This video is the Mercedes-Benz CES 2021 keynote introducing MBUX Hyperscreen. At about 5:20 in, we also see the first demonstration of Cerence Look, currently called “Mercedes Travel Knowledge.” Hey, Mercedes, you need a better name than that.
As Mercedes-Benz chief Ola Källenius claimed during the press conference, “This is a user interface that does not distract the driver.” C’mon, man, the new 141-centimeter screen (about 4.6 feet) extends pillar-to-pillar across the entire cabin. How could it not distract the driver?
It seems like it’s only a matter of time before such technologies enter mainstream production, at which point we’ll just have to hope that drivers won’t experience any sort of information overload from a visual standpoint.
We don’t have to “hope” that in-cabin AI, pillar-to-pillar screens and AR-HUDs won’t overload drivers with information, because understanding and managing that risk is exactly the role of combining precision eye-gaze tracking with human factors science and behavioral research.
Human factors science is already well understood in the aviation industry, where for decades the design concept has been to ensure that flight automation neither overwhelms pilots with information, nor compromises their judgment. This is especially true when pilots are confused by their instruments or have lost situational awareness, as evidenced by the tragic loss of Air France Flight 447.
Responsible automakers will therefore combine in-cabin AI and large, complex, displays with precision eye-gaze tracking featuring AI algorithms that measure driver workload based on real-time observations of eye movement and glance patterns. My research suggests that Seeing Machines has a considerable lead over all other driver monitoring suppliers in this area.
Why re-invent the wheel all over again?
We know Apple’s chip development philosophy: It designs what it needs and licenses the rest. Its history of licensing Arm CPU cores shows the pattern: first, Arm-based processors from Samsung in the original iPhone and iPhone 3G; then standard Arm cores, such as the Cortex-A8 in the A4 applications processor; and later, custom Arm cores developed as an Arm architectural licensee. The Apple Ax SoC evolution is summarized neatly in this slide from WikiChip.
My assumption is that Apple will integrate automotive-grade eye-gaze tracking into the C1 processor for iCar by licensing the Occula NPU core from Seeing Machines. Occula makes Seeing Machines into an IP play for driver monitoring, adopting essentially the same business model that Arm used for CPU licensing more than twenty years ago and that was so successful in establishing the partnership with Apple.
Extensively trained and validated AI and computer vision algorithms; human factors expertise built on a data set of almost 6 billion kilometers of naturalistic driving; IR optical path expertise at 940nm, yielding stable, robust, reliable signals; and a 3-pillar embedded processing strategy: all of this comes together to enable Apple (or any tech company) to support AI-driven empathy for an “immersive user experience.” Tim Cook’s first, last, and only call for driver monitoring in the iCar will therefore be to Seeing Machines.
We probably won’t know any details of the Apple C1 processor and what the iCar’s digital cockpit will look like for another couple of years. But as for details of Amazon’s, Google’s and Qualcomm’s plans for in-cabin AI, I suspect we only have to wait another couple of weeks. Qualcomm’s virtual technology showcase, called “Automotive Redefined,” is coming on January 26-27.
I’m expecting to get a glimpse of the future as battle lines are drawn between Apple, Nvidia and Qualcomm over a roadmap to AI-driven empathy and in-cabin AI. These trends are so fresh and exciting that very soon, I’m not sure anyone will much care about more promises of “consumer AVs” – but that is a whole other article.