Teaching a car to anticipate a pedestrian suddenly stepping onto the roadway remains one of the hardest problems in autonomous driving.
DeepScale argues that AI chips and sensor systems should ship with DNNs optimized for the target application, a shift that would fundamentally change how companies buy AI technology.
When a panel of experts at SAE China's AV conference was asked, "Which data or lessons are you willing to share with other automakers?" the question induced a long, uncomfortable pause.
Assume that AV companies have already collected petabytes, or even exabytes, of data on real roads. How much of that data has been labeled? How accurate are the labels?
At this year's Frankfurt Motor Show, a new trend emerged: a transition from C.A.S.E. (Connected, Autonomous, Shared, Electric) to C.A.P.E. (Connected, Assisted, Personalized, Electric).
Sensors make up a significant part of the technology that enables an autonomous vehicle to navigate. Beyond sensors, HD maps will play a substantial role, too.
The automotive industry is still hunting for "robust perception" that works under all conditions, including night, fog, rain, snow, and black ice. The view from AutoSens 2019.
NXP launched a deep learning toolkit called eIQ Auto, seeking to set itself apart from competitors by making the tool "automotive-quality." The goal is to make it easier for AV designers to implement deep learning in vehicles.
Human drivers are expected to be mature enough to anticipate what might happen on the road next. What expectations should we have of drivers who are not human?
Bosch, Continental, Denso, GM, Nvidia, NXP, and Toyota declared their support for the new consortium.
Are autonomy features like Autopilot and Smart Summon "wolves in sheep's clothing"? Tesla is pushing Level 4 autonomy into the body of Level 2 vehicles while disclaiming responsibility.