Debriefing the Audi A8

image

The lessons Audi learned from creating the A8, the world's first Level 3 autonomous car, remain relevant today. Here's what we learned after System Plus disassembled the Audi A8.

A recent analysis of the Audi A8 makes it clear why, from both a technological and an economic point of view, achieving higher levels of vehicle autonomy is more difficult than anyone originally expected.

When Audi released the updated A8 sedan at the end of 2017, the company introduced it as the first Level 3 car in the history of the automotive industry. The entire auto industry is still struggling with the same technological problems and the same unclear value chain that Audi faced at the time. The System Plus study provides valuable information on several questions:

  • What does it take to bring a Level 3 car to market?
  • What is included in the A8's sensor suite?
  • How much computing power does a Level 3 car need?
  • Is Audi's central driver assistance controller, called zFAS, based on a GPU, SoC, CPU, or FPGA?
  • How much does zFAS cost?

Audi's experience in achieving Level 3 functionality using chips already proven in other applications and available on the market can be instructive, especially compared with Tesla, which released its Full Self-Driving Computer board two years later (2019) and relies heavily on two in-house chips for its automated driving system.

The System Plus teardown procedure includes analysis that goes beyond simple reverse engineering and hardware identification. The company also performs a “reverse cost calculation”, an estimate of what it costs to source the components and build the product. System Plus's reverse costing of the A8 shows that 60% of the cost of zFAS (the total is estimated at $290) comes from semiconductors. This is hardly surprising, since 80-85% of the components in modern cars are electronic.
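To put the reverse-costing figure in perspective, here is the arithmetic as a minimal sketch that uses only the two numbers quoted above (the $290 total and the 60% semiconductor share); the label for the remainder is ours, not System Plus's.

```python
# Back-of-the-envelope split of the zFAS cost, using only the figures quoted above.
ZFAS_TOTAL_COST_USD = 290      # System Plus estimate for the complete zFAS unit
SEMICONDUCTOR_SHARE = 0.60     # share of cost attributed to semiconductors

semiconductor_cost = ZFAS_TOTAL_COST_USD * SEMICONDUCTOR_SHARE   # ~$174
everything_else = ZFAS_TOTAL_COST_USD - semiconductor_cost       # ~$116 (board, housing, assembly, ...)

print(f"Semiconductors: ~${semiconductor_cost:.0f}, everything else: ~${everything_else:.0f}")
```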

Price


The real shock to OEMs, said Romain Fraux, CEO of System Plus Consulting, is that no car company was prepared to pay the roughly 50% margin on each component that Nvidia, Intel, and other companies charge for their flagship chip-based solutions. This opened the door to a whole new world for car OEMs, prompting them to rethink the cost of highly automated cars.

System Plus's pricing does not include the cost of developing software for automated vehicles. However, the use of an FPGA (Altera Cyclone) inside zFAS shows Audi's intent to preserve the software assets it had already developed.

Over the past 18 months, some leading OEMs have begun to hint at a desire to develop their own automotive chips, like the ones made by Tesla. This approach lets them control their own fate in both hardware and software development. However, given the high cost of chip development, it is unclear whether car OEMs should go it alone.

Another important aspect of the A8 is that Audi was the first among all automotive OEMs to launch a commercial vehicle on the road to autonomy.

At the time of the A8's launch, the technology inside the car was presented as a “breakthrough in automated driving,” embodied in the Traffic Jam Pilot system. When Traffic Jam Pilot is activated, the driver is supposed to be relieved of the constant stop-and-go of congested traffic.

However, these best-laid plans ran into the handover problem (how to warn and re-engage a distracted driver in a situation the computer cannot handle), a problem built into the very concept of Level 3 cars from the start.

Today, A8s are driving the streets, but not one of them has Level 3 autonomy activated and functioning.

However, this is not a knock on Audi. The A8 made the auto industry understand what it is up against. Industry leaders must work through all the regulatory, technical, behavioral, legal, and business complications before they can talk about a utopian future of driverless vehicles. This partly explains the growing interest in safety standards among car OEMs, leading market players, chip suppliers, and technology and service companies (such as Waymo and Uber).

A8 under the hood


The challenge for automakers will no longer be to offer the highest top speed or the best zero-to-100 km/h acceleration, but to provide increasingly sophisticated driver assistance and autonomous driving systems. This is the goal of the Audi A8 with its Level 3 autonomous driving system, the first to use lidar.

The A8's sensors also include cameras, radar, and ultrasonic sensors. The Audi A8 can cope with driving on the busiest roads without driver intervention. Audi indicates that the driver does not always have to keep his hands on the steering wheel and, depending on local laws and regulations, can engage in other activities, such as watching onboard TV. The vehicle can handle most situations that arise on the road, but human intervention remains necessary (Fig. 1).

image

Figure 1: Key elements of the Audi A8

Fraux commented on the Audi A8's list of innovative technologies: “The Audi A8 is the first car with Level 3 autonomy.” The Traffic Jam Pilot system installed on the Audi A8 takes over driving in slow traffic at speeds up to 60 km/h on motorways and highways, using a combination of sensors and the first laser scanner fitted to a production car. (Note: this Level 3 function is still not activated.)

Level 3 Autonomy and Computing Platform


Digital technology can perform the same tasks as the driver while providing greater safety and ride comfort. The long-term goal is fully networked roads: an intelligent automotive network that reduces traffic congestion and pollution and significantly increases safety.

Autonomous driving is becoming an increasingly prominent topic in the automotive world; news of progress and innovation in the field is constantly on the agenda. Level 3, as used in the Audi A8, is characterized as highly automated driving: the system relieves the driver of constant control over the car's longitudinal and lateral movement.

Fraux said, “The Audi A8 combines a variety of sensors with a zFAS controller containing four processors, assembled by Aptiv.” zFAS (Fig. 2) is the industry's first centralized driver assistance computing platform. As the central unit, it processes in real time the signals from the ultrasonic sensors (front, rear, and sides), the 360-degree cameras (front, rear, and in the side mirrors), the mid-range radars (at each corner), as well as data from the long-range radar and the laser scanner at the front of the vehicle.

image

Figure 2: Aptiv zFAS Controller
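The architectural point above (every sensor stream flowing into one central computer that builds a single picture of the environment) can be sketched in a few lines. This is a purely illustrative Python skeleton: the sensor list mirrors Figure 1, but the class and method names are invented for the example and say nothing about Audi's actual software.

```python
from dataclasses import dataclass, field

@dataclass
class EnvironmentModel:
    """One fused picture of the car's surroundings (illustrative only)."""
    objects: list = field(default_factory=list)

class CentralDriverAssistController:
    """Toy model of a zFAS-style central unit: every sensor feeds one box."""

    def __init__(self, sensors):
        # e.g. ultrasonic sensors, 360-degree cameras, mid/long-range radar, laser scanner
        self.sensors = sensors

    def cycle(self) -> EnvironmentModel:
        model = EnvironmentModel()
        for sensor in self.sensors:
            # each raw reading is turned into object hypotheses and merged centrally
            model.objects.extend(sensor.read())
        return model  # handed on to the driving function, e.g. Traffic Jam Pilot
```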

The processors inside zFAS


One of the processors that make up the platform is the Nvidia Tegra K1, used for traffic light recognition, pedestrian detection, collision warning, light detection, and lane recognition. The Tegra K1, mounted on an 8-layer PCB, contains 192 CUDA cores, as many as Nvidia integrates into one SMX module inside its Kepler GPUs (Figure 3), with support for DirectX 11 and OpenGL 4.4.

image

Figure 3: Nvidia Tegra K1

The presence of a very powerful processor matters greatly given the number of sensors built into the car. The Intel/Mobileye EyeQ3 processor is responsible for image processing. To meet power and performance requirements, the EyeQ chips are designed on ever finer process nodes: the EyeQ3 uses 40 nm CMOS, while the fifth-generation EyeQ5 will be built on 7 nm FinFET. Each EyeQ chip is equipped with heterogeneous, fully programmable accelerators, with each type of accelerator optimized for its own family of algorithms.

Curiously, the Nvidia Tegra K1 and Mobileye EyeQ3 cannot handle all of the ADAS tasks expected of Level 3 cars. Inside zFAS there is also an Altera Cyclone FPGA for data preprocessing and an Infineon Aurix TriCore for monitoring safety operations. The Altera Cyclone FPGA family is built on a 1.5 V, 0.13-micron process with multilayer copper interconnect and SRAM-based configuration, offering densities of up to 20,060 logic elements and up to 288 Kbits of RAM.

The Infineon Aurix architecture is designed to optimize performance in powertrain and automotive safety systems. TriCore was the first architecture to unify a 32-bit microcontroller and a DSP in a single core, optimized for real-time embedded systems.
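The division of labor described above (powerful but complex chips doing perception, a simpler safety microcontroller watching over them) follows a common pattern: an independent monitor applies cheap plausibility checks to the main computer's output. The sketch below only illustrates that pattern in Python; it is not Aurix code, and the thresholds are invented placeholders.

```python
from dataclasses import dataclass

@dataclass
class ControlRequest:
    requested_decel_mps2: float   # deceleration demanded by the perception/planning chip
    data_age_ms: float            # age of the sensor data behind the request

def safety_monitor(req: ControlRequest, max_decel=10.0, max_age_ms=100.0) -> bool:
    """Toy monitor: accept the request only if it passes simple,
    independently coded plausibility checks (placeholder limits)."""
    return req.requested_decel_mps2 <= max_decel and req.data_age_ms <= max_age_ms

# A stale frame is rejected and would trigger a fallback / driver takeover request.
print(safety_monitor(ControlRequest(requested_decel_mps2=3.0, data_age_ms=250.0)))  # False
```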

Sensors in the Audi A8


In the automotive world, advanced driver assistance systems have become a must for any new car that wants a high Euro NCAP rating. Figure 1 above gives a detailed list of the Audi A8 devices identified by System Plus. “Manufacturers are developing more and more capable radars; a number of companies stand out in the market: Aptiv, Veoneer, ZF, Valeo, Bosch, Mando, Denso, and Ainstein,” said Fraux.

In particular, on the Audi A8 we can see the 3rd-generation Autoliv night vision camera, the Aptiv Lane Assist front camera, the Valeo Scala laser scanner, the Bosch LRR4 77 GHz long-range radar, and the Aptiv R3TR 76 GHz mid-range radar, mounted left and right at the front and rear of the car.

The Autoliv night vision camera consists of two modules: a camera and a remote processor (Fig. 4). The infrared camera is built around a high-resolution, 17-micron-pixel FLIR ISC0901 microbolometer based on vanadium oxide. The design combines a complex optical system with modern digital processing built on an FPGA and a specialized algorithm.

image

Figure 4: Autoliv 3rd Generation Night Vision Car Camera

The Aptiv Lane Assist front camera is mounted at the rear-view mirror and has a range of 80 meters at 36 frames per second. The camera uses a 1.2-megapixel CMOS image sensor from ON Semiconductor and an 8-bit PIC microcontroller from Microchip. The zFAS control unit handles image processing and recognition in software using the Mobileye EyeQ3 processing chip (Fig. 5).

image

Figure 5: Aptiv Lane Assist front camera circuit board

The Bosch LRR4 is a multi-mode radar with six fixed antennas. The four central antennas provide fast recording of the environment, forming a focused beam with an aperture angle of ±6 degrees and minimal interference from traffic in adjacent lanes. In the near field, two outer antennas expand the field of view to ±20 degrees, with coverage out to 5 meters, enabling quick detection of vehicles entering or leaving the lane (Fig. 6).

image

Figure 6: Long-range radar sensor (Image: System Plus)

The Aptiv R3TR radar sensor has two transmit and four receive channels and operates in the 76-77 GHz band, which is standard for automotive radar. The circuit board uses a monolithic microwave integrated circuit (MMIC) and a waveguide resonator. The radio frequency (RF) printed circuit board (PCB) substrate is a glass-reinforced, ceramic-filled hydrocarbon laminate that contains no PTFE (Figs. 7 and 8).

image

Figure 7: Aptiv R3TR 76 GHz Short-Range Radar Overview

image

Figure 8: Aptiv R3TR 76 GHz Short-Range Radar Electronics Board

Lidar technology


A key element of the Audi A8 is the lidar; this is the first time a car manufacturer has used a laser scanner. The lidar is a mechanical design with a rotating mirror, operates at a wavelength of 905 nm, and uses an edge-emitting laser. The device has a range of 150 meters with a field of view of 145° horizontally and 3.2° vertically. The motor control assembly consists of a stator, a rotor with a control drive, and an MPS40S Hall sensor for motion detection. The Hall sensor changes its output voltage in response to a magnetic field; it is a durable solution, since there are no contacting mechanical parts to wear out over time. An integrated circuit reduces the size of the system and the complexity of the implementation (Fig. 9, 10, 11).

Lidar systems are based on time of flight (ToF), which allows accurate measurement of time-related events (Fig. 12). Recent developments have made it possible to build multi-channel lidar systems that form an accurate three-dimensional image of the environment around the vehicle. This information is used to select the most suitable driving maneuvers.
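The ToF principle itself fits in one formula: distance is half the round-trip time multiplied by the speed of light. A minimal sketch (the pulse timing value below is just an example):

```python
C_M_PER_S = 299_792_458  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_time_s: float) -> float:
    """Distance to the target: the pulse travels out and back, hence the division by 2."""
    return C_M_PER_S * round_trip_time_s / 2

# A 1-microsecond round trip corresponds to roughly 150 m,
# which happens to match the quoted range of the A8's laser scanner.
print(f"{tof_distance_m(1e-6):.1f} m")  # ~149.9 m
```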

image

Figure 9: Laser scanner (Image: System Plus)

image

Figure 10: Inside the laser scanner

image

Figure 11: Block diagram of the laser scanner

image

Figure 12: Functional diagram of time-of-flight measurement (Image: Maxim Integrated)

Edge-emitting lasers are the original and still widely used form of semiconductor laser. Their long resonator allows high gain to be achieved. The beam inside the structure is usually guided in a double heterostructure waveguide. Depending on the physical properties of the waveguide, it is possible to achieve either high beam quality with limited output power, or high output power with lower beam quality (Fig. 13).

image

Figure 13: Edge-Emitting Laser Diode

The laser used in the lidar has a 3-pin TO-type package with a die area of 0.27 mm², as shown in Fig. 13. The laser delivers 75 W, and the package is 5.6 mm in diameter. “The laser component was probably made by Sheaumann on a 100 mm wafer,” said Fraux. The matching receiver uses an avalanche photodiode (APD) to detect the laser beam after it passes through two lenses, one transmitting and one receiving. “The APD is probably made by First Sensor on a 150 mm wafer, in an 8-pin FR4 LCC package with a 5.2 mm² connection area (Figure 14),” Fraux said.

An APD is a high-speed photodiode that uses internal avalanche multiplication, much like a photomultiplier, to produce a low-noise amplified signal. An APD achieves a better signal-to-noise ratio than a PIN photodiode and can be used in a wide range of applications (such as high-precision rangefinders and low-light detectors). From an electronics point of view, an APD requires a higher reverse voltage and more careful handling of its temperature-dependent gain characteristics.

image

Figure 14: Avalanche Photodiode (APD)
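Why the avalanche gain helps can be seen from the standard noise budget: the photocurrent is multiplied inside the diode before the fixed thermal noise of the receiver is added, so the signal-to-noise ratio improves as long as thermal noise dominates. The sketch below uses the textbook shot-noise and thermal-noise formulas with purely illustrative component values, not measurements from this lidar.

```python
import math

q, k = 1.602e-19, 1.381e-23   # electron charge [C], Boltzmann constant [J/K]

def receiver_snr_db(photocurrent_a, gain=1.0, excess_noise=1.0,
                    bandwidth_hz=100e6, load_ohm=10e3, temp_k=300.0):
    """Textbook SNR of a photodiode front end: multiplied shot noise vs. thermal noise."""
    signal = (gain * photocurrent_a) ** 2
    shot_noise = 2 * q * photocurrent_a * gain**2 * excess_noise * bandwidth_hz
    thermal_noise = 4 * k * temp_k * bandwidth_hz / load_ohm
    return 10 * math.log10(signal / (shot_noise + thermal_noise))

i_ph = 50e-9  # 50 nA of primary photocurrent: an illustrative weak lidar return
print(f"PIN photodiode: {receiver_snr_db(i_ph):.1f} dB")
print(f"APD (gain 50):  {receiver_snr_db(i_ph, gain=50, excess_noise=5):.1f} dB")
```

With these example numbers the APD front end comes out roughly 13 dB ahead; the advantage shrinks once the multiplied shot noise, rather than thermal noise, dominates.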

In addition to the two laser and motor-control units, the control electronics also include a main board built around a dual-core ARM Cortex-A9 Xilinx XA7Z010 SoC and a 32-bit STMicroelectronics SPC56EL60L3 microcontroller, plus a power management section consisting of a synchronous step-down regulator from ADI, a two-channel smart power switch from Infineon, a triple monolithic step-down converter with LDO from ADI, and a three-phase sensorless fan driver chip from Allegro. Data exchange runs over the FlexRay protocol. A FlexRay system consists of several electronic control units, each equipped with a controller that manages access to one or two communication channels.
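FlexRay's deterministic behavior comes from a time-triggered schedule: each communication cycle contains a static segment whose slots are pre-assigned to specific control units (plus a dynamic segment for event-driven frames, omitted here). The snippet below is a generic illustration of that slot idea in Python; the node names are invented, and this is not the actual A8 or lidar schedule.

```python
# Illustrative static-segment schedule: slot number -> transmitting node.
# Real FlexRay configurations also define a dynamic segment, cycle timing,
# and per-channel assignments; the names here are made up for the example.
STATIC_SLOTS = {
    1: "zFAS",
    2: "laser_scanner",
    3: "long_range_radar",
    4: "front_camera",
}

def transmitter_for(slot: int) -> str:
    """In the static segment, bus access is decided purely by the slot counter."""
    return STATIC_SLOTS.get(slot, "idle (unused slot)")

for slot in range(1, 6):
    print(f"slot {slot}: {transmitter_for(slot)}")
```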

System Plus calculates that the cost of one such lidar system, at volumes above 100,000 units per year, can reach $150, a significant part of which is tied to the main board and the laser (Figure 15).

image

Figure 15: Disassembled Laser Scanner Equipment

In a lidar design, the transimpedance amplifier is the most critical part of the electronic receive chain. Low noise, high gain, and fast overload recovery make the latest devices well suited to automotive applications. To achieve maximum performance, designers must pay particular attention to circuit matching and integration, wavelength, and opto-mechanical alignment. Such integrated circuits meet the most stringent automotive requirements under AEC-Q100 qualification.
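The transimpedance amplifier's role is easy to state in numbers: it converts the APD's photocurrent into a voltage, with the feedback resistor setting the gain and, together with the input capacitance and the op-amp's gain-bandwidth product, limiting the achievable bandwidth. A rough sketch with illustrative values (none taken from the A8 design):

```python
import math

def tia_output_v(photocurrent_a: float, r_feedback_ohm: float) -> float:
    """Ideal transimpedance stage: V_out = I_photo * R_f."""
    return photocurrent_a * r_feedback_ohm

def tia_bandwidth_hz(r_feedback_ohm: float, c_in_f: float, gbw_hz: float) -> float:
    """Common first-order estimate: f_-3dB ~= sqrt(GBW / (2*pi*R_f*C_in))."""
    return math.sqrt(gbw_hz / (2 * math.pi * r_feedback_ohm * c_in_f))

# Illustrative numbers: 1 uA return pulse, 10 kOhm feedback, 2 pF input C, 1 GHz GBW.
print(f"V_out     ~ {tia_output_v(1e-6, 10e3) * 1e3:.0f} mV")             # ~10 mV
print(f"Bandwidth ~ {tia_bandwidth_hz(10e3, 2e-12, 1e9) / 1e6:.0f} MHz")  # ~89 MHz
```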


