LAS VEGAS — After a barrage of showbiz glitter — lights, music, drones, and AI demos — flooded the theater, Intel Corp. CEO Brian Krzanich took to the stage on Monday night to introduce his company’s automated vehicle (AV) platform, delivering the most anticipated speech at this year’s Consumer Electronics Show (CES).
Sharing the stage with Krzanich was the first of a 100-car fleet of Level 4 vehicles that Mobileye promised last summer.
Observers anticipating Intel’s usual bravado — boasts that its AV processor’s TOPS and watts are bigger and better than everyone else’s — departed disappointed.
The CPU giant modestly made the case for a “camera-first” autonomous vehicle strategy, a plan that exploits the proliferation of camera-based ADAS systems to build a “Roadbook” that helps robocars drive safely. Significantly, Intel did not assert that AV development should rest on a complex combination of sensor fusion and brute-force processing.
Intel’s strategy painted a picture of an AV industry — although still embryonic — already split into two factions: Intel/Mobileye has embarked on a longer, evolutionary path, while the rest of the industry — Waymo, Uber, and others — is impatient to jump into the ride-hailing/fleet business with its own robocars as soon as possible.
Phil Magney, founder and principal at VSI Labs, called Intel’s approach “very pragmatic,” when considering the incremental evolution of automated features.
Intel’s AV reference design, unveiled during the keynote and scheduled for rollout this year, is strikingly simple, reflecting the vision-first philosophy at the core of the company’s AV strategy. The Intel/Mobileye platform consists of two Mobileye EyeQ5 chips (one for vision processing, the other for fusion and planning) and an eight-core Intel Atom processor, previously code-named Denverton. No FPGAs, no Intel Nervana Neural Network Processors, no other surprise AI chips are included in the platform.
Unlike competitors, Intel regards other sensors such as lidar and radar as redundant, backup elements for object detection. It insists that the camera is the only real-time sensor that robocars can reliably depend on for driving-path geometry and other static scene semantics. Cameras feed robocars the highest-performance data at the lowest cost, says Intel.
This “vision-first” approach pits Intel/Mobileye against Alphabet/Waymo, whose robocars rely on high-level, AI-based sensor fusion of HD cameras, radar, and lidar for localization.