Mobileye Unveils EyeQ Ultra SoC for Autonomous Driving

Article By : Stefani Munoz

Mobileye's latest SoC for autonomous driving, EyeQ Ultra, touts maximum performance and efficiency at 176 TOPS.

Mobileye announced its EyeQ Ultra system-on-chip at CES, its most advanced EyeQ version to date, according to the company. Described as a single-package AV-on-chip supercomputer, Mobileye's EyeQ Ultra touts maximum performance and efficiency at 176 TOPS.

Built on its seventh-generation EyeQ architecture and fabbed in a 5 nm process technology, Mobileye claims its EyeQ Ultra SoC can offer performance equivalent to 10 of its EyeQ5 SoCs in a single package. EyeQ Ultra will also be able to handle fully autonomous Level 4 driving as defined by the Society of Automotive Engineers (SAE), meaning vehicles equipped with EyeQ Ultra will require no human intervention while automated driving features are engaged, though the option for manual input remains available.

SAE J3016 levels of driving automation (Source: SAE)

To fully realize its autonomous driving vision, Mobileye's EyeQ Ultra relies on four classes of proprietary accelerators: XNN, PMA, VMP and MPC. XNN is a dedicated AI engine for deep learning neural networks, PMA is a programmable coarse-grained reconfigurable array (CGRA), VMP is a SIMD VLIW machine and MPC is a cluster of barrel-threaded CPU cores.
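To unpack the last of those terms: a barrel-threaded core switches among hardware thread contexts on a fixed round-robin schedule, typically every cycle, so one thread's memory stall is hidden behind the others' work. The short sketch below simulates that issue pattern; it is purely illustrative and the function name and parameters are hypothetical, not drawn from Mobileye's design.

```python
# Illustrative sketch of barrel (interleaved) multithreading, the style
# of CPU core attributed here to the MPC accelerator class. The core
# issues from a different hardware thread each cycle in round-robin
# order. This is a toy model, not Mobileye internals.

def barrel_schedule(num_threads: int, cycles: int) -> list[int]:
    """Return which thread issues on each cycle under round-robin switching."""
    return [cycle % num_threads for cycle in range(cycles)]

# With 4 hardware threads over 8 cycles, each thread issues every 4th cycle,
# so a thread stalled on memory simply misses its slot while others proceed.
print(barrel_schedule(num_threads=4, cycles=8))  # [0, 1, 2, 3, 0, 1, 2, 3]
```

The design trade-off is throughput over single-thread latency: no thread runs faster than one cycle in `num_threads`, but the pipeline rarely sits idle.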

By combining these four accelerators with additional CPU cores, as well as ISPs and GPUs for visualization, the EyeQ Ultra is capable of “processing input from two sensing subsystems — one camera-only system and the other radar and lidar combined — as well as the vehicle's central computing system, the high-definition map and driving policy software, to perform highly specific, individual tasks,” according to Mobileye.

It’s interesting to note Mobileye’s decision to support a camera-only system alongside a separate combined radar-and-lidar system. Tesla is notorious for relying solely on cameras. Other automotive companies, however, are debating whether to opt for two sensor types, such as camera and radar or camera and lidar, or in some cases three sensor types: camera, radar and lidar.

A spokesman from Mobileye told EE Times that the decision to support a camera-only system stems from the need for redundant systems.

“Mobileye’s approach is to build redundant systems. A camera-only system can stand on its own and a lidar [and] radar system can stand on its own. Each subsystem can support a full operation design. By combining the two separate perception subsystems, we obtain a more robust overall system,” Mobileye’s spokesman said. “The imaging radars that we are developing would take most of the field of view occupied by today’s lidars and leave only the front-facing sector for a lidar to work in a redundant manner with cameras and radar. This will provide a three-way redundancy in the front sector and an overall major cost reduction for the entire sensor configuration, as imaging radars are one-fifth to one-tenth the cost of a lidar.”
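The redundancy argument above can be made concrete with a small sketch: two perception subsystems, each sufficient on its own, fused so that the vehicle keeps operating if either one survives and obstacles reported by either are treated as real. This is an illustrative toy, not Mobileye code; the function, the obstacle-ID representation, and the conservative-union policy are all assumptions for the example.

```python
# Toy model of redundant dual-subsystem perception (hypothetical, not
# Mobileye's implementation). Each subsystem output is a set of detected
# obstacle IDs, or None if that subsystem has failed.

def fuse_redundant(camera_obstacles, radar_lidar_obstacles):
    """Fuse two standalone perception outputs.

    Stays operational as long as at least one subsystem is alive, and
    takes the union of detections when both are: an obstacle reported
    by either subsystem is treated as real (the conservative choice).
    """
    live = [s for s in (camera_obstacles, radar_lidar_obstacles) if s is not None]
    if not live:
        raise RuntimeError("both perception subsystems failed")
    fused = set()
    for subsystem_output in live:
        fused |= subsystem_output
    return fused

# Both subsystems alive: union of detections from each.
print(fuse_redundant({"ped_1", "car_7"}, {"car_7", "cyclist_3"}))
# Camera subsystem down: the radar/lidar subsystem alone still suffices.
print(fuse_redundant(None, {"car_7"}))
```

The point of the sketch is that robustness comes from each input being independently sufficient, so fusion only ever adds information rather than papering over a subsystem that cannot stand on its own.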

In tandem with its EyeQ Ultra announcement, Mobileye also unveiled two new EyeQ SoCs specifically designed for ADAS: EyeQ6L and EyeQ6H. The EyeQ6L is the successor to the EyeQ4 chip at 55 percent of its size, offering advanced deep learning TOPS with ultra-low power to support L2 ADAS, according to Mobileye. The company began sampling the EyeQ6L last year and plans to reach production by mid-2023.

Mobileye’s EyeQ6L SoC for ADAS (Source: Mobileye)

The EyeQ6H touts computing power equivalent to two of Mobileye’s EyeQ5 SoCs to better support virtualization and AI-intensive workloads. Specifically designed for premium ADAS and partial AV capabilities with full-surround coverage in mind, the EyeQ6H will fully support L2 ADAS functionality, offer multi-camera processing and host third-party applications such as parking visualization and driver monitoring.

Given the joint announcement, some may wonder whether the EyeQ6L and EyeQ6H create an upgrade path to EyeQ Ultra. A Mobileye spokesman said this isn’t necessarily the case.

“We see EyeQ Ultra as a deviation from our normal EyeQ evolutions, as this is the first EyeQ to be designed especially for AV, hence the name ‘Ultra’ and not EyeQ7.”

The company says we can expect full automotive-grade production of EyeQ Ultra in 2025. For the time being, EyeQ Ultra’s first silicon is slated for late 2023. But how will that first silicon compare to the full automotive-grade parts in 2025? Though the first silicon isn’t automotive grade, customers won’t be able to tell much of a difference in terms of functionality.

“The first silicon released in 2023 will be fully capable of all self-driving functions, ready to begin hitting the road in smaller volumes in 2024. This cut is not automotive grade but will not differ from the production version in functionality,” Mobileye’s spokesman said. “The 2025 silicon will have been put through the automotive qualification process ensuring it can be deployed in volume. This timeline follows the typical automotive SoC production cycle, which begins design validation with engineering samples and proceeds with production parts for product validation.”

Depending on the vehicle production schedules of Mobileye’s customers, we could see adoption of EyeQ Ultra in commercial vehicles as early as 2025, according to Mobileye.

This article was originally published on EE Times.

Stefani Munoz is associate editor of EE Times. Prior to joining EE Times, Stefani was an editor for TechTarget and covered a host of topics around IT virtualization trends and VMware technologies.

