“The investigation was prompted by at least 11 accidents in which Teslas using Autopilot, an assisted-driving system that can steer, accelerate and brake on its own, drove into parked fire trucks, police cars and other emergency vehicles…”
— N.Y. Times, 16 August
My high-school driver’s education teacher, Dave Maas, used to harp endlessly on the importance of “defensive driving.” Unfortunately, he was talking to fifteen-year-olds, which meant most of this life-saving pedagogy went right in one typical teenage ear and out the other. I was something of an exception, because I had been a close observer of both my parents’ driving styles.
Dad was a sort of death-wish James Dean manqué. By the time he was 25, he’d broken numerous bones in headlong high-speed plunges into forests, fenceposts, and farm buildings. Mom, on the other hand, trembled in terror every time she took the wheel. She halted twice at every stop sign and veered reflexively to the right at the approach of an oncoming car.
As a faithful pupil of Mr. Maas, I determined, when I got my license, to find a safe medium between Mom and Dad. Defensive driving was my eucharist.
The hitch, of course, is that you can’t effectively learn defensive driving in class. It’s on-the-road education that requires, among other qualities, a few smarts, constant concentration, keen alertness and an intuitive ability to read the “body language” of moving cars, making it possible to anticipate other drivers’ intentions, sudden movements and mistakes. This is an education that requires years of trial-and-error paranoia.
Defensive driving, simply, is hard for a human intelligence to master. Most people never learn how to do it. More important is that it lies beyond the capacity of any existing algorithm. It defies “artificial intelligence.” For Tesla’s well-meaning but still pretty dumb “Autopilot” to truly learn defensive driving — much less distinguish the side of an eighteen-wheeler from a patch of sky — would require it to memorize, compute, collate, compare and strategize billions of disparate, anomalous and downright weird scenarios and “corner cases.”
Trying to create a Dave Maas algorithm — teaching a car to drive defensively — was Tesla’s fatal error in the development of its so-called “Autopilot” feature. Elon Musk was even dumber when he decided to call it Autopilot, a misnomer that enabled lazy and careless drivers to abandon the wheel and trust a twitchy and astigmatic machine to keep them alive at 75 miles an hour on I-90.
Tesla should have started its Autopilot project by driving, essentially, on the other side of the road, teaching cars “offensive driving” — which is what most people do when they sit down behind the wheel and hit the ignition. Offensive driving is what keeps defensive drivers on permanent red alert. The beauty of offensive driving is that you don’t have to learn anything. The algorithm could be programmed from a third-grade arithmetic text.
Off the top of my head (I was really good at third-grade arithmetic!), I’ve devised three basic offensive-driving programs for the geniuses at Tesla to feed into their next-gen EVs.
In “Geezerpilot” mode, your re-programmed Tesla would behave like an 85-year-old inching out of the driveway — without looking either way — in a 1982 four-door Cadillac DeVille. It would proceed down the road at about 60 percent of the posted speed limit, sometimes slowing to 10-15 mph, weave unpredictably from side to side, threatening the side-view mirrors of parked cars, stop unexpectedly in mid-lane or mid-block and seem, now and then, to sink into frozen dormancy at stop signs and traffic lights. It would have no capacity to deal with four-way stops and rotaries and, when parking, would do so in the middle of the street.
The “Teenpilot” function would be, in many respects, as erratic and surprising as Geezerpilot, but with a number of noteworthy variations. By design, its road behavior would precisely mimic that of a sixteen-year-old boy or girl, operating on a learner’s permit while either talking, texting or playing Candy Crush or Call of Duty. It would speed up suddenly, slow to a crawl just as unexpectedly, swerve into the oncoming lane before correcting violently, ignore stop signs or slam on the brakes suddenly, scrape parked cars and then drive away in a state of panic, and perform a panic stop in the middle of the street at the sight of a) a cute boy, b) a hot girl or c) a friend approaching in another “Teenpilot” robocar in the opposite lane.
The most exciting option, in my estimation, would be an algorithm that might be called “Bournepilot,” a function during which a Tesla would behave exactly like Jason Bourne in a car chase. Unless prevented from doing so by defensive drivers sharing the road, it would careen down city streets at 80 miles an hour, take every turn on two wheels, leave a pound of rubber on the pavement with acceleration, bounce off parked cars, T-bone cross-traffic, leaving behind a trail of steaming wreckage and car fires, drive in the wrong direction — pell-mell and weaving spectacularly — on every available one-way street, eventually aiming at and plowing into, at top speed, either a bridge abutment or a large body of water.
Of course, a Tesla using Bournepilot would have a pretty short lifespan. But it would be the ultimate thrill ride for its (back seat) driver, and it would become the new standard for the automotive tradition of planned obsolescence. You’d need to buy a new one every week.
Unlike defensive driving, which is uniform among its practitioners, offensive driving — like Tolstoy’s unhappy families — has as many variations as there are bad drivers on the road. So, the range of possible offensive driving algorithms for Tesla’s engineers to devise is literally infinite.
The virtue of all these modes is that they liberate Tesla and all developers of assisted and autonomous vehicles from the pretense of road safety. Programming “autopilot” cars that are indifferently safe — behaving as though they’re being steered by a distracted mother yelling at the kids in the back seat (“Mompilot”) or a corporate hotshot closing a deal on the phone while lighting a cigarette and fondling a secretary in the shotgun seat (“Bosspilot”) — restores a familiar balance to life and death on the highway.
Most drivers and driverless cars will go on behaving stupidly and recklessly, weaving, lurching, slamming the brakes, cutting off other drivers and taking sudden left turns in the face of oncoming traffic. A handful of others — defensive drivers — will assume the responsibility of watching for these idiots and the auto industry’s new wave of idiot algorithms (“Idiotpilot”).
We’ll anticipate their mistakes, outrages and incipient crimes against humanity. We’ll react accordingly, prevent a million seemingly inevitable accidents and — as always — save these careless, distracted morons, and their myopic machines, from themselves.
With no thanks from Elon Musk.
This article was originally published on EE Times.
David Benjamin is an author, essayist and veteran journalist who has been writing regularly for EE Times for more than 20 years. His novels and non-fiction books, mostly published by Last Kid Books, have won more than a dozen literary awards. His latest novel is Fat Vinny’s Forbidden Love.