Definition

The vast majority of vehicle accidents are caused by human error, and Advanced Driver Assistance Systems (ADAS) can help avoid many of them. The role of ADAS is to prevent deaths and injuries by reducing the number of car accidents and lessening the severity of those that cannot be avoided.

Essential safety-critical ADAS applications include:

  • Pedestrian detection/avoidance
  • Lane departure warning/correction
  • Traffic sign recognition
  • Automatic emergency braking
  • Blind spot detection

These lifesaving systems incorporate the latest interface standards and run multiple vision-based algorithms to support real-time multimedia, vision coprocessing, and sensor fusion subsystems.

The modernization of ADAS applications is a first step toward realizing autonomous vehicles.


How does ADAS work?

Automobiles are the foundation of the next generation of mobile-connected devices, with rapid advances being made in autonomous vehicles. Autonomous application functionality is partitioned across various chips, called systems on a chip (SoCs), which connect sensors to actuators through interfaces and high-performance electronic control units (ECUs).

Self-driving cars use a variety of these applications and technologies to gain 360-degree vision, both near (in the vehicle’s immediate vicinity) and far. That means hardware designs are using more advanced process nodes to meet ever-higher performance targets while simultaneously reducing demands on power and footprint. 

[Figure: Advanced Driver Assistance System (ADAS) infographic]

What are some ADAS applications?

Significant automotive safety improvements in the past (e.g., shatter-resistant glass, three-point seatbelts, airbags) were passive safety measures designed to minimize injury during an accident. Today, ADAS systems actively improve safety with the help of embedded vision by reducing the occurrence of accidents and injury to occupants.

Cameras in the vehicle feed a new class of AI functions that use sensor fusion to identify and process objects. Sensor fusion, similar to the way the human brain processes information, combines large amounts of data from image recognition software, ultrasonic sensors, lidar, and radar. The resulting system can physically respond faster than a human driver ever could: it analyzes streaming video in real time, recognizes what the video shows, and determines how to react to it.
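Sensor fusion takes many forms; one common building block is combining independent range estimates for the same object, weighting each sensor by its confidence. The sketch below is a minimal illustration of inverse-variance weighting — the `fuse_ranges` helper, the sensor list, and the variance values are hypothetical, not drawn from any particular ADAS stack:

```python
def fuse_ranges(estimates):
    """Fuse independent range estimates (meters) by inverse-variance weighting.

    estimates: list of (range_m, variance) pairs, one per sensor.
    Returns (fused_range, fused_variance).
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(r * w for (r, _), w in zip(estimates, weights)) / total
    return fused, 1.0 / total

# Hypothetical readings of the distance to the same pedestrian:
readings = [
    (24.8, 4.0),   # camera: noisier range estimate
    (25.3, 0.25),  # radar: precise range measurement
    (25.1, 0.09),  # lidar: most precise in this scenario
]
fused_range, fused_var = fuse_ranges(readings)
```

The fused estimate lands near the most confident sensors, and its variance is lower than any single sensor's — the statistical reason fusion outperforms any one modality.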

These are some of the most common ADAS applications:

 

Adaptive Cruise Control

Adaptive cruise control is particularly helpful on the highway, where drivers can find it difficult to monitor their speed and surrounding traffic over a long period of time. Adaptive cruise control can automatically accelerate, slow down, and at times stop the vehicle, depending on the actions of other objects in the immediate area.
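The gap-keeping behavior described above can be sketched as a simple controller. The sketch below uses a constant-time-gap policy, a common textbook formulation; the function name, gains, and acceleration limits are illustrative assumptions, not values from any production system:

```python
def acc_command(ego_speed, lead_speed, gap, set_speed,
                time_gap=1.8, min_gap=5.0, k_gap=0.3, k_speed=0.5):
    """Return a desired acceleration (m/s^2) for a toy adaptive cruise
    controller using a constant-time-gap policy.

    ego_speed, lead_speed: speeds in m/s; gap: distance to lead vehicle in m.
    Gains and thresholds are illustrative, not tuned values.
    """
    desired_gap = min_gap + time_gap * ego_speed
    # Two error terms: close the gap error and match the lead vehicle's speed.
    accel = k_gap * (gap - desired_gap) + k_speed * (lead_speed - ego_speed)
    # Never accelerate beyond the driver's set speed.
    if ego_speed >= set_speed and accel > 0:
        accel = 0.0
    return max(-3.5, min(1.5, accel))  # clamp to comfortable accel limits
```

With a slow lead car closing in (`acc_command(30, 25, 40, 33)`) the command saturates at maximum braking; with open road below the set speed it saturates at maximum acceleration.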

 

Glare-Free High Beam and Pixel Light

Glare-free high beam and pixel light uses sensors to adjust to darkness and the vehicle’s surroundings without disturbing oncoming traffic. This new headlight application detects the lights of other vehicles and redirects the vehicle’s lights away to prevent other road users from being temporarily blinded.

 

Adaptive Light Control

Adaptive light control adapts the vehicle’s headlights to external lighting conditions. It changes the strength, direction, and rotation of the headlights depending on the vehicle’s environment and darkness.

 

Automatic Parking

Automatic parking helps inform drivers of blind zones so they know when to turn the steering wheel and stop. Vehicles equipped with rearview cameras give drivers a better view of their surroundings than traditional side mirrors do. Some systems can even complete parking automatically, without the driver's help, by combining the input of multiple sensors.

 

Autonomous Valet Parking

Autonomous valet parking is a new technology that works via vehicle sensor meshing, 5G network communication, and cloud services that manage autonomous vehicles in parking areas. Sensors provide the vehicle with information about where it is, where it needs to go, and how to get there safely. All this information is methodically evaluated and used to accelerate, brake, and steer the vehicle until it is safely parked.

 

Navigation System

Car navigation systems provide on-screen instructions and voice prompts to help drivers follow a route while concentrating on the road. Some navigation systems can display live traffic data and, if necessary, plan a new route to avoid traffic jams. Advanced systems may even offer heads-up displays to reduce driver distraction.
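Rerouting around a traffic jam is, at its core, a shortest-path search over current travel times. The sketch below uses Dijkstra's algorithm on a toy road network; the `fastest_route` helper and the example roads are hypothetical:

```python
import heapq

def fastest_route(graph, start, goal):
    """Dijkstra's shortest path over edge travel times (minutes).

    graph: {node: [(neighbor, minutes), ...]} — a toy road network.
    Returns (total_minutes, [nodes along the route]).
    """
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, minutes in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + minutes, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical network: the direct road A->B is congested (30 min), so the
# planner prefers the detour through C (10 + 12 = 22 min).
roads = {
    "A": [("B", 30), ("C", 10)],
    "B": [("D", 5)],
    "C": [("D", 12)],
}
minutes, route = fastest_route(roads, "A", "D")
```

When live traffic data raises an edge's travel time, rerunning the same search naturally produces the detour.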

[Figure: Advanced Driver Assistance System applications]

Night Vision

Night vision systems enable drivers to see things that would otherwise be difficult or impossible to see at night. There are two categories of night vision implementations: Active night vision systems project infrared light, and passive systems rely on the thermal energy that comes from cars, animals, and other objects.

 

Blind Spot Monitoring

Blind spot detection systems use sensors to provide drivers with important information that would otherwise be difficult or impossible to obtain. Some systems sound an alarm when they detect an object in the driver's blind spot, such as when the driver tries to move into an occupied lane.

 

Automatic Emergency Braking

Automatic emergency braking uses sensors to detect an imminent collision with another vehicle or other objects on the road. This application can measure the distance to nearby traffic and alert the driver to any danger. Some emergency braking systems can take preventive safety measures such as tightening seat belts, reducing speed, and engaging adaptive steering to avoid a collision.
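The distance measurement described above is commonly converted into a time-to-collision (TTC) figure to decide how to escalate. A minimal sketch, with a hypothetical `aeb_decision` function and illustrative thresholds that no specific system is claimed to use:

```python
def aeb_decision(gap_m, closing_speed_mps,
                 warn_ttc=2.6, brake_ttc=1.6, full_brake_ttc=0.8):
    """Classify an automatic-emergency-braking response from time-to-collision.

    gap_m: distance to the object ahead; closing_speed_mps: rate at which the
    gap is shrinking (<= 0 means not closing). Thresholds are illustrative.
    """
    if closing_speed_mps <= 0:
        return "none"
    ttc = gap_m / closing_speed_mps  # seconds until impact at current rates
    if ttc <= full_brake_ttc:
        return "full_brake"
    if ttc <= brake_ttc:
        return "partial_brake"   # may also pretension seat belts
    if ttc <= warn_ttc:
        return "warn_driver"
    return "none"
```

Escalating from warning to partial to full braking as TTC shrinks gives the driver a chance to react before the system intervenes.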

 

Crosswind Stabilization

This relatively new ADAS feature supports the vehicle in counteracting strong crosswinds. The sensors in this system can detect strong pressure acting on the vehicle while driving and apply brakes to the wheels affected by crosswind disturbance.

 

Driver Drowsiness Detection

Driver drowsiness detection warns drivers when signs of sleepiness or distraction appear. There are several ways to determine whether a driver's attention is waning: some systems analyze the movement of the driver's head and their heart rate for indications of drowsiness, while others issue driver alerts similar to the warning signals of lane detection.

 

Driver Monitoring System

The driver monitoring system is another way of measuring the driver's attention. Camera sensors can analyze whether the driver's eyes are on the road or drifting. Driver monitoring systems can alert drivers with noises, vibrations in the steering wheel, or flashing lights. In some cases, the system takes the extreme measure of bringing the vehicle to a complete stop.
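The escalation from chimes to steering vibration to a controlled stop can be modeled as a simple threshold ladder. The function name and thresholds below are purely illustrative; production systems also weigh vehicle speed, traffic, and driver history:

```python
def monitor_alert(eyes_off_road_s,
                  chime_after=2.0, vibrate_after=4.0, stop_after=10.0):
    """Escalating driver-monitoring response based on how long the driver's
    gaze has been off the road (seconds). Thresholds are illustrative only.
    """
    if eyes_off_road_s >= stop_after:
        return "controlled_stop"
    if eyes_off_road_s >= vibrate_after:
        return "steering_vibration"
    if eyes_off_road_s >= chime_after:
        return "audible_chime"
    return "ok"
```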

 

5G and V2X

This emerging ADAS capability uses 5G to provide communication between the vehicle and other vehicles or pedestrians, generally referred to as V2X, with increased reliability and lower latency. Today, millions of vehicles connect to cellular networks for real-time navigation. V2X will enhance these existing methods and the cellular network to improve situational awareness, control or suggest speed adjustments to account for traffic congestion, and provide real-time updates to GPS maps. V2X is also essential to support over-the-air software updates for the now-extensive range of software-driven systems in cars, from map updates to bug fixes to security updates and more.

 


Why is ADAS important?

According to the August 2016 Traffic Safety Facts Research Note by the National Highway Traffic Safety Administration (NHTSA), “The Nation lost 35,092 people in crashes on U.S. roadways during 2015.” This 7.2% increase was “the largest percentage increase in nearly 50 years.” An analysis revealed that about 94% of those accidents were caused by human error, and the rest by the environment and mechanical failures.

The opportunity to reduce car accidents is making ADAS even more critical. Automatic emergency braking, pedestrian detection, surround view, parking assist, driver drowsiness detection, and gaze detection are among the many ADAS applications that assist drivers with safety-critical functionality to reduce car accidents and save lives. 


What is the future of ADAS?

The increasing amount of automotive electronic hardware and software requires significant changes in today’s automobile design process to address the convergence of conflicting goals:

  • Increased reliability
  • Reduced costs
  • Shorter development cycles

The trend is shifting from distributed ADAS electronic control units (ECUs) toward a more integrated ADAS domain controller with centralized ECUs. Most production vehicles today are at what SAE International designates as Level 2 (Partial Driving Automation), where the vehicle can control both steering and accelerating/decelerating but falls short of self-driving because the human in the driver's seat must remain engaged and be ready to take control of the car at any time.

[Figure: SAE levels of driving automation]

Shifting toward fully autonomous cars—vehicles capable of sensing their environment and operating without human involvement—requires a substantial expansion of these vehicles' electronic architecture.

With the increase in electronic architecture comes an increase in the volume of data. To handle this data, the new integrated domain controllers require higher computing performance, lower power consumption, and smaller packaging.

The adoption of 64-bit processors, neural networks, and AI accelerators to handle the high volume of data requires the latest semiconductor features, process technologies, and interconnect technologies to support ADAS capabilities.

The reduction of electronic modules leads to centralized computing architectures, requiring critical automotive building blocks, including processors with vision processing capabilities, neural networks, and sensor fusion. And this must be achieved while addressing the need for quality, safety, and security.

Every aspect of the car is designed to be more connected, requiring subsystem and SoC designers to expand the scope of safety measures beyond the traditional steps taken to ensure physical safety. Applying the latest embedded computer vision and deep learning techniques to automotive SoCs brings greater accuracy, power efficiency, and performance to ADAS systems.


How can Synopsys help?

Synopsys offers proven methodologies and automated solutions to strengthen system security at every stage of the development life cycle and across the software supply chain. Our goal is to enable OEMs and Tier 1 and Tier 2 providers around the world to deliver secure, software-enabled automotive technologies that keep passengers—and their data—safe at every turn. 

Our solutions can help automatically detect third-party components in source code and binaries, prioritize security vulnerabilities and licenses in use, and find critical defects and weaknesses in code during development. We also support the design phases of the development life cycle by identifying the design flaws, control defects, and asset vulnerabilities that define the overall risk to a system. 

Our approach to system security in the automotive industry is grounded in the fundamentals of technology risk management. Synopsys supports the distinct needs of the automotive industry by performing critical activities for automotive organizations, including:

  • Bus analysis, fuzzing, capture, and reverse engineering 
  • Vehicle ecosystem threat modeling and architectural risk analysis 
  • Embedded code reviews, penetration testing, and reverse engineering
  • Communications interface testing (onboard, wireless, dealer, manufacturing) 
  • Telematics, infotainment, and head-unit testing 
  • Certificate, encryption, key store analysis, and testing 
  • Program design and development 
  • Software security training

Continue Reading