‘Self-driving is Hard’?! No Duh, Elon

By Colin Barnden

Everyone else had already noticed that automotive traffic is the very definition of a complex system, and that self-driving is hard to achieve.

An admission that self-driving is hard has now been made by none other than Tesla CEO Elon Musk. “Didn’t expect it to be so hard, but the difficulty is obvious in retrospect,” Musk wrote recently on Twitter, adding “Nothing has more degrees of freedom than reality.”

In light of Musk’s mea culpa, in this piece I return to the subject of complexity theory, illustrate the wonders of complexity science in nature, and highlight some of the challenges complexity poses to the development of self-driving technology using machine learning.

So, let’s forget about tech and turn to a video of something much more interesting: starlings. The video shows a flock of starlings in flight, known as a murmuration. I invite you to take a whole five minutes out of your schedule to watch it, if for no other reason than that it reminds us of the breathtaking beauty of the world around us.

Look carefully and you will observe continuous changes in the path of the murmuration, as well as in both its shape and density. To paraphrase Musk, “Nothing has more degrees of freedom than starlings in flight.” There is no central brain “controlling” the murmuration, and the science of what is shown is explained in this blog post:

Focusing on the birds’ ability to manage uncertainty while also maintaining consensus, they [*Young et al., 2013] discovered that birds accomplish this (with the least effort) when each bird attends to seven neighbors.

Broadly speaking, each starling is following the same “rule of seven” but, as the video demonstrates, that does not result in certainty. Neither does it result in chaos; it results in complexity. Writing in Simply Complexity, author Neil Johnson observes:

Complexity Science can be seen as the study of phenomena which emerge from a collection of interacting objects … which are competing for some kind of limited resource.

That is a precise description of public roads, especially dense urban environments at rush hour, which are exactly the target market for self-driving developers such as Cruise, Tesla and Waymo. “What do complexity science and starlings have to do with machine learning and self-driving?” you ask. Read on.
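Before moving on, a toy simulation makes the point concrete. Below is a minimal sketch, written in Python with NumPy, of a boids-style flock in which each agent steers only from its seven nearest neighbours. To be clear, this is my own illustrative model with made-up weights, not the model from Young et al. Run it and two things stand out: orderly group motion emerges from purely local rules, and nudging a single bird by a millionth of a unit sends two otherwise identical flocks down diverging paths.

import numpy as np

N, K, STEPS, DT = 200, 7, 400, 0.1

def step(pos, vel):
    # One time step: align with, cohere toward, and separate from the
    # seven nearest neighbours. The 0.05/0.01/0.05 weights are guesses.
    diff = pos[:, None, :] - pos[None, :, :]       # pairwise offsets
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)                 # ignore self
    idx = np.argsort(dist, axis=1)[:, :K]          # 7 nearest neighbours

    align = vel[idx].mean(axis=1) - vel            # match local heading
    cohere = pos[idx].mean(axis=1) - pos           # drift to local centre
    sep = np.where((dist < 1.0)[..., None], diff, 0.0).sum(axis=1)

    vel = vel + 0.05 * align + 0.01 * cohere + 0.05 * sep
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel *= np.clip(speed, 0.5, 2.0) / np.clip(speed, 1e-9, None)
    return pos + vel * DT, vel

rng = np.random.default_rng(7)
pos_a = rng.uniform(-10, 10, (N, 2))
vel_a = rng.uniform(-1, 1, (N, 2))
pos_b, vel_b = pos_a.copy(), vel_a.copy()
pos_b[0, 0] += 1e-6                                # nudge one bird a hair

for t in range(1, STEPS + 1):
    pos_a, vel_a = step(pos_a, vel_a)
    pos_b, vel_b = step(pos_b, vel_b)
    if t % 100 == 0:
        gap = np.linalg.norm(pos_a - pos_b, axis=1).mean()
        print(f"step {t}: flock spread {np.ptp(pos_a, axis=0)}, "
              f"mean divergence {gap:.4f}")

That divergence is the whole point: no catalogue of past runs pins down the next one, which is exactly the property that makes dense traffic so resistant to prediction.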

Judgment under uncertainty

Common sense tells us we could watch every video of starlings in existence, and that information would still not enable us to predict the path of a new murmuration accurately. Our lived experience teaches us that the past does not provide certainty about the future.

Complexity science and starlings therefore inform us of the futility of using brute-force road testing to train probabilistic AI for self-driving. Billions of miles of training data, vast arrays of processors and sophisticated sensor suites will not produce prediction and path-planning algorithms that can reliably navigate the uncertainty inherent in complex road systems, let alone exceed the judgment shown by an alert and engaged human driver.

While Elon Musk is only just starting to publicly accept the difficulties of self-driving, no one has been more vocal about the challenges to be overcome than Missy Cummings, professor of Electrical and Computer Engineering at Duke University.

In a paper entitled “Rethinking the maturity of artificial intelligence in safety-critical settings,” Cummings explains how human drivers navigate uncertainty using a combination of “bottom-up” reasoning (fast, perception-driven skills) and “top-down” reasoning (slower, knowledge-based judgment), as illustrated in a figure in that paper.

Tesla’s Autopilot and “Full Self-Driving” software demonstrate moderate skill at controlling the longitudinal and lateral position of a vehicle on a roadway and can follow basic rules of the road, such as stopping at stop signs and red lights. However, multiple fatal crashes point to a fundamental lack of the knowledge and expert judgment necessary to navigate the random events inherent in complex systems such as public roads.

However competent it may appear, machine-learned AI is still many years, and possibly many decades, away from delivering self-driving technology that is demonstrably safer than an alert and engaged human driver. You can hear more about this in the podcast “Why Self-Driving Cars Aren’t Coming Any Time Soon with Dr. Missy Cummings.”

Cummings has been making the same points for years. Musk didn’t need a mea culpa; he needed to listen.

* Young GF, Scardovi L, Cavagna A, Giardina I, Leonard NE (2013), “Starling Flock Networks Manage Uncertainty in Consensus at Low Cost,” PLoS Computational Biology.

This article was originally published on EE Times.

Colin Barnden is principal analyst at Semicast Research and has over 25 years of experience as an industry analyst. He is considered a world expert on market trends for automotive vision-based driver monitoring systems (DMS). He holds a B.Eng. (Hons) in Electrical & Electronic Engineering from Aston University in England and has covered the automotive electronics market since 1999.
