The lessons that Audi learned from building the world's first Level 3 autonomous vehicle, the A8, remain pertinent today. Here's what our teardown of the A8 with System Plus revealed.
A recent teardown of the Audi A8 shows exactly why, from both a technological and an economic standpoint, achieving higher levels of vehicle autonomy has proven harder than anyone originally expected.
When Audi launched its redesigned A8 sedan at the end of 2017, the company touted it as the auto industry’s first Level 3 car. The entire automotive industry is still contending with the technological issues and unfamiliar cost structures that confronted Audi back then. The teardown conducted by System Plus provides valuable insight into several questions, starting with how Audi pulled it off.
It is instructive to see how Audi achieved Level 3 functionality using chips that were already on the market and proven in other applications, especially in comparison with Tesla, which two years later (2019) launched its “Full Self-Driving Computer” board built around two home-grown self-driving chips.
System Plus teardowns go beyond simply reverse engineering and identifying hardware elements. The firm also performs “reverse costing”: estimating how much it must have cost a company to source specific components and build its product. System Plus’ reverse costing of the A8 shows that 60% of the cost of zFAS, estimated at $290, is driven by semiconductors. This is hardly startling, since 80 to 85 percent of the content in modern cars is electronics. That wasn’t the startling thing about the costs, however.
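The arithmetic behind that headline number is simple. A quick sketch, using only the figures quoted above:

```python
# Figures from the System Plus reverse-costing estimate quoted above.
ZFAS_COST_USD = 290.0   # estimated cost of the zFAS unit
SEMI_SHARE = 0.60       # share of that cost attributed to semiconductors

semiconductor_cost = ZFAS_COST_USD * SEMI_SHARE
print(f"Semiconductor content: ${semiconductor_cost:.0f}")  # $174
```

In other words, roughly $174 of the $290 zFAS bill of materials goes to chips alone.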
The real shocker to car OEMs, said Romain Fraux, CEO at System Plus Consulting, is that no automotive company was mentally prepared to pay the 50 percent margin per component charged by Nvidia, Intel and others for their flagship chip solutions. This opened the door to a whole new world for automotive OEMs, prompting them to rethink the calculus of highly automated vehicles.
The System Plus teardown/cost estimate does not include the cost of software development for automated vehicles. However, the use of an FPGA (Altera Cyclone) inside zFAS shows Audi’s attempt to preserve the software assets it had already developed.
Over the last 18 months, some leading OEMs have begun hinting their desire to design their own autonomous vehicle chips, a la Tesla. This approach enables them to control their own destiny in terms of hardware and software development. However, given the high cost of chip designs, it’s far from clear if car OEMs are better off going it alone.
Another important aspect of the A8: Audi was the first car OEM to bring to market a commercial vehicle that showed a path to autonomy.
At the time of the A8’s launch, the technology inside the vehicle was pitched as an “automated driving breakthrough,” featuring a system called Traffic Jam Pilot. When activated, Traffic Jam Pilot supposedly relieves human drivers of the ordeal of negotiating stop-and-go traffic.
But these best-laid plans collided with the “handoff problem” (how to alert and engage a distracted human when the computer falters) that has dogged the very concept of Level 3 vehicles since the beginning.
Today, A8s are on the streets, but none has its Level 3 autonomy features activated and operating in the real world.
That’s not a knock on Audi, however. The A8 made it clear to the AV industry what it’s up against. Industry leaders must sort out every regulatory, technical, safety, behavioral, legal and business-related complication before they can tout the utopian future of robocars. This partly explains the growing momentum behind safety standards-setting among car OEMs, Tier Ones, chip suppliers and technology and service companies (e.g., Waymo, Uber).
A8 under the hood
The challenge for automotive manufacturers is no longer offering the highest top speed or the best acceleration from zero to 100 km/h, but ensuring increasingly advanced autonomous driving and assistance systems. This is the goal of the Audi A8 with Level 3 self-driving, the first car to use lidar technology.
The A8’s sensor suite also includes cameras, radar, and ultrasonic sensors. The Audi A8 will autonomously manage driving on the most congested roads without a driver’s intervention. Audi specifies that the driver can keep his or her hands off the steering wheel at all times and, depending on local laws and regulations, can engage in other activities such as watching TV onboard. The vehicle can perform most driving tasks, but human override is still required (figure 1).
Fraux ran down the list of innovative technology inside the Audi A8: “The Audi A8 is the first car with Level 3 autonomy. The Traffic Jam Pilot system installed on the Audi A8 takes charge of driving in slow-moving traffic at up to 60 km/h on freeways and highways, using sensor fusion and the world’s first laser scanner.” (Note: this Level 3 feature, however, has never been activated to date.)
Level 3 autonomy and computing platform
Digital technology can take on tasks the driver would otherwise perform while providing greater safety and comfort. The long-term goal is fully networked roads: an automotive smart grid. Traffic jams and environmental pollution would be reduced, with a remarkable improvement in safety.
Autonomous driving is an increasingly central topic in the automotive world, and news of progress in the sector arrives almost daily. The Level 3 capability used in the Audi A8 is defined as highly automated driving: the system can relieve the driver of continuous control over the longitudinal and lateral movement of the vehicle.
Fraux said: “The Audi A8 consists of a variety of sensors and a zFAS controller put together by Aptiv with four processor chips.” zFAS (figure 2) is the first centralized computing platform: a computer that serves as a central hub, processing in real time the live feeds of ultrasonic sensors (front, rear, and side), 360-degree cameras (front, rear, and side mirrors), mid-range radars (at every angle), as well as a long-range radar and a laser scanner on the front of the vehicle.
A plethora of processors inside zFAS
The first of the processors that make up the platform is the Nvidia Tegra K1, used for traffic-signal recognition, pedestrian detection, collision warning, light detection, and lane recognition. Mounted on an eight-layer PCB, the Tegra K1 integrates 192 CUDA cores, the same number Nvidia integrates into a single SMX module inside its current Kepler GPUs, with DirectX 11 and OpenGL 4.4 support (figure 3).
Having a very powerful processor in a car matters when you consider the number of sensors integrated into it. Intel/Mobileye’s EyeQ3 is responsible for image processing. To meet power-consumption and performance targets, each EyeQ generation moves to a finer geometry: the EyeQ3 uses a 40nm CMOS process, while the 5th-generation EyeQ5 will use 7nm FinFET. Each EyeQ chip features heterogeneous, fully programmable accelerators, with each accelerator type optimized for its own family of algorithms.
Curiously, the Nvidia Tegra K1 and Mobileye EyeQ3 aren’t enough to handle all the ADAS tasks expected of a Level 3 vehicle. Also inside zFAS are an Altera Cyclone for data pre-processing and an Infineon Aurix TriCore to oversee safety operations. The Altera Cyclone family of FPGA devices is based on a 1.5V, 0.13µm, all-layer-copper SRAM process, with densities up to 20,060 logic elements (LEs) and up to 288 kbits of RAM.
The Infineon Aurix architecture is optimized for powertrain and safety applications in the automotive industry. TriCore is the first unified, single-core, 32-bit microcontroller-DSP architecture optimized for real-time embedded systems.
Sensors in Audi A8
In the automotive world, advanced driver-assistance systems have become a must for all new cars seeking higher Euro NCAP ratings. Figure 1 shows the detailed list of devices System Plus found in the Audi A8. “Manufacturers are developing increasingly efficient radar sensors; in the market we can distinguish companies such as Aptiv, Veoneer, ZF, Valeo, Bosch, Mando, Denso, and Ainstein,” said Fraux.
In particular, in the Audi A8 we find Autoliv’s 3rd-generation automotive night vision camera, Aptiv’s lane-assist front camera, Valeo’s Scala laser scanner, Bosch’s LRR4 77GHz long-range radar sensor, and Aptiv’s R3TR 76GHz mid-range radars at front right and left and rear right and left.
The Autoliv night vision camera consists of two modules: the camera and a remote processing unit (figure 4). The infrared camera is built around FLIR’s ISC0901, a 17µm-pixel, high-definition vanadium oxide microbolometer. The design pairs a complex optical system with sophisticated numerical processing based on an array of FPGAs and a custom algorithm.
Aptiv’s lane-assist front camera is mounted on the rearview mirror and offers a range of 80 meters at a frame rate of 36 images/sec. The camera uses a 1.2-megapixel CMOS image sensor from On Semiconductor and an 8-bit Microchip PIC microcontroller. The zFAS control unit runs the image-mapping and recognition software on the Mobileye EyeQ3 processing chip (figure 5).
Bosch’s LRR4 is a multi-mode radar with six fixed radar antennas. The four centrally arranged antennas offer high-speed recording of the environment, creating a focused beam with an aperture angle of ±6 degrees and minimal interference from traffic in adjacent lanes. In the near field, the two outer antennas of the LRR4 expand the field of view to ±20 degrees with a range of 5 meters, quickly detecting vehicles entering or leaving the lane (figure 6).
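A rough way to picture those aperture angles: assuming simple flat geometry, the beam’s lateral coverage at a given range is 2·r·tan(θ). The numbers below are the LRR4 figures quoted above; the geometry is a simplification, not a radar model.

```python
import math

def beam_halfwidth(range_m: float, half_angle_deg: float) -> float:
    """Lateral half-width of a radar beam at a given range (flat geometry)."""
    return range_m * math.tan(math.radians(half_angle_deg))

# Near field: the outer antennas' +/-20-degree view at the quoted 5 m range
width_m = 2 * beam_halfwidth(5.0, 20.0)
print(round(width_m, 2))  # ~3.64 m, roughly one adjacent lane on each side
```

That coverage width is what lets the sensor catch a car cutting into the lane early.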
Aptiv’s short-range radar sensor consists of two transmitter channels and four receiver channels and operates in the 76-77 GHz frequency band, which is standard for automotive radar applications. The PCB uses a monolithic microwave integrated circuit (MMIC) and cavity waveguides. The radio frequency (RF) printed circuit board (PCB) substrate uses a glass-reinforced, hydrocarbon-based ceramic laminate and is completely PTFE-free (figures 7 and 8).
Lidar systems are based on time of flight (ToF), which measures precise timing events (figure 12). The latest developments include multi-beam lidar systems, which generate an accurate, three-dimensional image of the environment around the vehicle. This information is used to choose the most appropriate driving maneuvers.
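The time-of-flight principle reduces to d = c·t/2, since the pulse travels out to the target and back. A minimal sketch:

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to a target from a lidar pulse's round-trip time."""
    # Divide by two: the measured time covers the outbound and return legs.
    return C * round_trip_s / 2.0

# A return after 200 ns corresponds to a target roughly 30 m away:
print(round(tof_distance(200e-9), 2))  # 29.98
```

The hard part in practice is not the arithmetic but timing the return pulse with sub-nanosecond precision, which is where the receive chain discussed below comes in.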
Edge-emitting lasers are the original and still most widely used form of semiconductor laser. Their long resonator allows high gain. Within the structure, the laser beam is guided in a waveguide, typically a double heterostructure. Depending on the physical properties of the waveguide, it is possible to achieve either an output with high beam quality but limited output power, or high output power but low beam quality (figure 13).
The laser used in the lidar solution comes in a 3-pin TO-type package, 5.6 mm in diameter, with a die area of 0.27 mm² and an output power of 75 W (figure 13). “Probably made by Sheaumann for Laser Components on a 100 mm wafer,” said Fraux. The conditioning unit uses an avalanche photodiode (APD) to capture the returning laser beam after it passes through two lenses, one transmitting and one receiving. “The APD is probably made by First Sensor on a 150 mm wafer, with an 8-pin FR4 LLC package and a die area of 5.2 mm² (figure 14),” Fraux said.
An APD is a high-speed photodiode that uses avalanche multiplication to obtain a low-noise signal. The APD achieves a higher S/N ratio than a PIN photodiode and can be used in a wide range of applications such as high-precision rangefinders and low-light-level detection. Electrically, the APD requires a higher reverse voltage and careful consideration of its temperature-dependent gain characteristics.
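The gain/noise tradeoff behind that S/N claim can be sketched with McIntyre’s excess noise factor, F(M) = kM + (1 − k)(2 − 1/M), which grows with avalanche gain M. The k value below is a typical silicon-APD figure assumed for illustration, not a measured parameter of this part:

```python
def excess_noise_factor(gain: float, k: float) -> float:
    """McIntyre excess noise factor F(M) for an avalanche photodiode.
    k is the hole/electron ionization-coefficient ratio (~0.02 for Si)."""
    return k * gain + (1.0 - k) * (2.0 - 1.0 / gain)

# Moderate gain keeps excess noise low; very high gain erodes the SNR benefit.
for M in (10, 50, 100):
    print(M, round(excess_noise_factor(M, 0.02), 2))
```

This is why APD receivers are typically biased for moderate gain rather than run as hard as the reverse voltage allows.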
In addition to the two units for laser and motion control, the control hardware includes a main board built around a Xilinx XA7Z010 SoC with a dual-core Arm Cortex-A9, a 32-bit STMicroelectronics SPC56EL60L3 microcontroller, and a power-management system comprising a synchronous step-down regulator from ADI, a dual-channel smart high-side power switch from Infineon, a triple monolithic step-down IC with LDO from ADI, and a three-phase sensorless fan-driver IC from Allegro. Data communication runs over FlexRay; a FlexRay system consists of several electronic control units, each with a controller that manages access to one or two communication channels.
At volumes above 100K units/year, the estimated cost of such lidar technology could reach $150, with a good share of that attributable to the main unit board and the laser (figure 15).
In a lidar design, the transimpedance amplifier is the most critical part of the electronic layout. Low noise, high gain, and fast recovery make the new devices ideal for automotive applications. To achieve maximum performance, designers must pay special attention to interface and integration circuits, wavelengths, and opto-mechanical alignment. These integrated circuits meet the automotive industry’s most stringent safety requirements, with AEC-Q100 qualification.
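For intuition on why the TIA is the critical block: it converts the APD’s tiny photocurrent into a voltage (ideally V = I·R_f), and the feedback resistor that sets the gain also, together with the input capacitance and the op-amp’s gain-bandwidth product, caps the achievable bandwidth. The component values below are illustrative assumptions, not figures from the teardown:

```python
import math

def tia_output(i_photo_a: float, r_f_ohm: float) -> float:
    """Ideal transimpedance amplifier: V_out = I_photodiode * R_feedback."""
    return i_photo_a * r_f_ohm

def tia_bandwidth(gbw_hz: float, r_f_ohm: float, c_in_f: float) -> float:
    """Common first-order estimate of a TIA's -3 dB bandwidth."""
    return math.sqrt(gbw_hz / (2.0 * math.pi * r_f_ohm * c_in_f))

# Hypothetical values: 1 uA pulse, 100 kOhm feedback, 1 GHz-GBW op-amp, 2 pF input C
print(tia_output(1e-6, 1e5))                           # 0.1 (volts)
print(round(tia_bandwidth(1e9, 1e5, 2e-12) / 1e6, 1))  # bandwidth in MHz
```

Raising R_f for more gain shrinks the bandwidth, which is exactly the tension between “high gain” and “fast recovery” noted above.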