Awareness of the importance of robust driver monitoring systems (DMS) has risen significantly in recent weeks as a result of the high-profile deaths near Houston, Texas, of two occupants traveling in a Tesla and the reintroduction of the Stay Aware for Everyone (SAFE) Act in Congress. In this article I explore some of the most common misconceptions I have seen written about this vital safety technology.
Is obsolete
Rightly or wrongly, many people still believe we are in the home stretch to Level 4 autonomy and that a “full self-driving” reality is but one over-the-air update away, thus rendering DMS obsolete. Let’s review the evidence.
As has been reported, Waymo has made slow progress expanding its driverless service in the last two years, and EE Times recently exposed the inconvenient truth that Waymo’s driverless cars rely on DMS.
Perhaps most revealing is the observation that “Tesla privately admits Elon Musk has been exaggerating about ‘full self-driving’” while Tesla has been working to repurpose the interior camera in the Model 3 and Model Y to provide elements of driver and occupant monitoring. Elon Musk says many entertaining things, but Tesla’s actions confirm the importance of DMS in the years ahead.
Blinded by sunlight
The most common criticism of the DMS in GM’s Super Cruise is that it can be blinded by direct sunlight and critics often use this fact to dismiss the role of DMS altogether.
Super Cruise was so far ahead of its time that it had to be developed using 850nm IR optical components, since automotive-qualified 940nm components were not available at the design stage. As the graph below shows, there is substantially less energy in sunlight at 940nm than at 850nm, so changing the operating wavelength of the optical path to 940nm largely eradicates the blinding issue.
Super Cruise debuted in the 2018 Cadillac CT6 (launched in 2017) but wasn’t rolled out to further models until this year, when it was added to the CT4, CT5 and Escalade. Why the delay? My assumption is that GM elected to make a number of modifications to the design of Super Cruise, including upgrading the optical path to 940nm to resolve the sunlight issue.
Doesn’t work with sunglasses
IR light passes unimpeded through the lenses of most sunglasses, including polarized sunglasses. To the DMS image sensor the lenses appear transparent, meaning the eyes, eyelids, blinks and eye-gaze vector can be tracked as normal. It isn’t the lens material itself that scatters IR light, but the pigment used to provide the tint. Sunglass manufacturers are likely to consider this issue more carefully in the future, perhaps with a “works with DMS” or similar label.
Easily spoofed
A myth I have seen discussed frequently on social media is that DMS can be defeated by positioning a photograph of an alert and attentive driver in front of the image sensor, a technique known as “spoofing.”
Photographs are static images: the head pose, eyelid opening and eye-gaze vectors don’t change, meaning spoofing is detected almost instantly. If you instead seek to defeat a DMS using a video feed, that won’t work either, since the reflection of IR light from an electronic display differs from the reflection of IR light from the back of the human eye.
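The static-image principle can be sketched in a few lines. This is a toy illustration only, not any supplier’s actual algorithm: the signal names and thresholds are invented for the example, and a real DMS fuses many more cues.

```python
# Toy liveness check: a photograph held in front of the camera produces
# gaze and eyelid signals that barely change frame to frame, so a simple
# variance test over a rolling window flags a "frozen" signal.
# All values and thresholds below are invented for illustration.
from statistics import pvariance

def is_live(gaze_angles_deg, eyelid_openings, min_variance=0.05):
    """Return True if the tracked signals vary enough to look like a
    live face rather than a static photograph."""
    return (pvariance(gaze_angles_deg) > min_variance
            or pvariance(eyelid_openings) > min_variance)

# A live driver: gaze wanders, eyelids blink.
live_gaze = [0.0, 2.5, -1.0, 4.0, 0.5, -3.0]
live_lids = [0.9, 0.85, 0.1, 0.9, 0.88, 0.92]   # includes one blink

# A photo: both signals are essentially constant (sensor noise only).
photo_gaze = [1.00, 1.01, 0.99, 1.00, 1.01, 1.00]
photo_lids = [0.90, 0.90, 0.91, 0.90, 0.90, 0.90]

print(is_live(live_gaze, live_lids))    # True
print(is_live(photo_gaze, photo_lids))  # False
```

A video feed would defeat this particular check, which is why, as noted above, production systems also exploit the physical difference in IR reflection between an electronic display and a human eye.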
I have spent some time specifically trying to defeat a sophisticated, robust DMS. The best I could do was to momentarily stop it tracking my eyes, but that involved obscuring my face with both hands, squinting and holding my neck in an uncomfortable position. That was back in 2019, and developments have since moved on. There are over twenty years of R&D in this technology, and from experience I conclude that robust, safety-grade driver monitoring can’t be easily defeated.
Can’t detect a phone in driver’s eyeline
Another common myth on social media is that DMS can’t detect a driver attaching their phone to the windshield directly above the steering wheel to watch a movie while seemingly looking at the road ahead.
This is essentially the set-up of a head-up display (HUD), so let’s take a brief detour to look at how state-of-the-art eye tracking is already used in full flight simulators (FFS) for pilot training, for insight into how driver monitoring is set to advance much further in the automotive sector.
This slide visualizes the real-time gaze point of the pilot (pink dot) along with two multi-colored bars showing, at the top, a dwell-time indication of where the pilot’s attention has been over a fixed period and, below, a real-time measure of where the pilot’s attention is on a rolling indication from right (current) to left (recent). We can see the pilot’s attention focused on the flight path indicator (yellow) and aircraft speed (green), which is as expected for a take-off maneuver.
Returning now to the automotive sector, the key issue is the reliability and fidelity of the gaze tracking. Combined with human factors science and behavioral research, high-fidelity gaze tracking can readily differentiate the head movements and eye-gaze scan patterns of an attentive, engaged and situationally aware driver from those of a person sitting in the driver’s seat watching a movie on a strategically positioned phone.
Experience teaches humans subtle social skills, such as to recognize interactions with a person who is distracted or disinterested based on their glance patterns and a “glazed over” look. The combination of human factors science and behavioral research, machine learning and naturalistic driving data results in similarly capable AI-based DMS which interprets patterns of head pose, blinks and eye-gaze to classify the driver’s attention state and engagement level.
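To make the classification idea concrete, here is a deliberately simplified sketch. It is not a production DMS: the labels, frame rate and thresholds are invented, loosely echoing the common human-factors heuristic that a single off-road glance of more than about two seconds is dangerous.

```python
# Toy attention classifier over a stream of per-frame gaze labels.
# Two invented heuristics: (1) any single off-road glance longer than
# ~2 s means distraction; (2) a low overall share of on-road glances
# over the window means disengagement. Real systems use richer features
# (head pose, blink dynamics, scan-pattern statistics) and learned models.

def classify_attention(gaze_labels, fps=10,
                       max_offroad_glance_s=2.0, min_onroad_share=0.6):
    """gaze_labels: sequence of 'road' / 'offroad', one per video frame.
    Thresholds are illustrative, not regulatory values."""
    longest_offroad = run = 0
    for label in gaze_labels:
        run = run + 1 if label == 'offroad' else 0
        longest_offroad = max(longest_offroad, run)
    onroad_share = gaze_labels.count('road') / len(gaze_labels)
    if longest_offroad / fps > max_offroad_glance_s:
        return 'distracted'
    if onroad_share < min_onroad_share:
        return 'disengaged'
    return 'attentive'

attentive_drive = ['road'] * 45 + ['offroad'] * 5   # brief mirror check
movie_watcher   = ['road'] * 10 + ['offroad'] * 40  # long stare off-road

print(classify_attention(attentive_drive))  # attentive
print(classify_attention(movie_watcher))    # distracted
```

The point of the sketch is the structure, not the numbers: glance duration and glance distribution carry far more information than a single “eyes open” check.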
State-of-the-art DMS can already provide a real-time analysis not just of whether the driver has their eyes on the road, but also if their mind is on the task of driving. Current research is investigating how alcohol impairment affects an individual’s oculomotor control (by decreasing the velocity and accuracy of glance behaviors and increasing the number and duration of fixations), suggesting that DMS could soon measure alcohol-induced impairment.
Invades privacy
Drivers must have confidence that a camera installed in their vehicle will not invade their privacy, for example by capturing raw video footage for upload to the cloud, by providing evidence of erratic driving to their insurer, or by ratting them out to the cops if they drive drunk. Spying and surveillance are not how you build trust in the life-saving potential of DMS technology.
Balancing privacy with advances in road safety is a delicate act which necessitates the involvement of lawmakers. In Europe the mandate for the adoption of DMS stipulates that only closed-loop systems can be installed, which must perform all video analytics and processing inside the vehicle. Specifically, Regulation (EU) 2019/2144 for the type-approval requirements for motor vehicles reads:
Any such safety system should function without use of any kind of biometric information of drivers or passengers, including facial recognition.
In the U.S., the wording of the SAFE Act reads:
(3) PRIVACY.—The rule issued under paragraph (1) shall incorporate appropriate privacy and data security safeguards, as determined by the Secretary of Transportation.
Is a nag
Combining signals for visual distraction, cognitive distraction, drowsiness and impairment provides the vehicle with an assessment of the driver’s engagement level and attention state. With this information the sensitivity and responsiveness of the other safety systems can be modified in real-time.
The result? The all-too-common nags from lane departure warning and automatic emergency braking alerts are reduced. Thus, if you pay more attention to the road and less attention to your phone, the addition of robust driver monitoring technology that is fully integrated with the driver-assist systems results in fewer nags.
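One way to picture this integration is an alert threshold that scales with the DMS attention estimate. This is an assumed design sketch, not GM’s or any supplier’s actual implementation; the function shape and every value in it are invented for illustration.

```python
# Sketch: the DMS attention estimate (0 = fully distracted, 1 = fully
# attentive) scales how early a forward-collision warning fires. An
# attentive driver gets a later, quieter warning; a distracted driver
# gets an earlier one. All numbers are illustrative only.

def warning_threshold_s(attention_score, base_ttc_s=2.0, max_extra_s=1.5):
    """Return the time-to-collision (seconds) at which to warn,
    given an attention score clamped to [0, 1]."""
    score = min(max(attention_score, 0.0), 1.0)
    return base_ttc_s + (1.0 - score) * max_extra_s

print(warning_threshold_s(1.0))  # attentive driver: warn at 2.0 s TTC
print(warning_threshold_s(0.2))  # distracted: warn earlier, about 3.2 s TTC
```

The same scaling logic applies in reverse to lane departure warning: the more engaged the driver, the later and less intrusive the nag.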
Is low tech
At the beginning of May, SAE International once again updated its levels of driving automation, as described in standard J3016. I see two distinct roles for technology to improve safety on public roads: to make human drivers into safer drivers, and to replace humans as drivers.
J3016 has been misappropriated by numerous tech startups and disruptors into a “race to Level 5” resulting in a deeply embedded narrative that “high levels” of driving automation are high tech, and “low levels” of driving automation are low tech.
Intentionally or not, this narrative is reinforced by J3016, which assigns safety-critical technology used to provide short-duration warning, assistance and intervention (such as automatic emergency braking, blind spot warning, lane departure warning and driver monitoring) to SAE Level 0, as shown below.
Human drivers are susceptible to distraction, drowsiness and impairment irrespective of the level of automation offered by the car they are driving, which is why I am supportive of Euro NCAP’s testing program to make safety-critical technology standard on all vehicles, not just those operating a hands-free highway-assist feature at Level 2.
Qualcomm looks to be following a similar approach based on its Automotive Redefined technology showcase held in January, which showed DMS as standard in its fourth-generation cockpit applications processor and Snapdragon Ride platform, in both cases integrating driver monitoring software from Seeing Machines.
DMS is easy
With all of the attention over the last five years focused on companies chasing “full self-driving” and full autonomy, on developments in lidar and on attention-grabbing announcements of “1000-TOPS” processors, little has been said about the importance of DMS as a safety-critical system, but a lot has been said claiming it is obsolete, easy to defeat, invades privacy, nags and is low tech.
So, the final myth is that DMS is easy. The assertion is technically correct, since a basic DMS requires little more than a camera, a processor and machine learning to get started. Automotive-grade driver monitoring, by comparison, is very hard, which is why there is only a smattering of suppliers capable of meeting the needs of the global automotive industry.
Automotive-grade DMS suppliers include Cipia, Jungo, Smart Eye and Xperi, but my assessment suggests that the clear technology leader is Seeing Machines, which has a focus not only on automotive, but other transportation industries including aviation, rail and trucking and off-road sectors such as agriculture, construction and mining.
As the fifth anniversary of the first Autopilot-related death passes, it is worth reflecting on the status of driver monitoring adoption in the United States. As of the first quarter of 2021, all three of the major Detroit automakers (Ford, GM and Stellantis) have vehicles in production featuring robust driver monitoring, and it is Tesla, for so long the technology poster child with Autopilot, that is now the DMS technology laggard.
This article was originally published on EE Times.
Colin Barnden is principal analyst at Semicast Research and has over 25 years of experience as an industry analyst. He is considered a world expert on market trends for automotive vision-based driver monitoring systems (DMS). He holds a B.Eng. (Hons) in Electrical & Electronic Engineering from Aston University in England and has covered the automotive electronics market since 1999.