Types of ADAS Sensors Used Today

25/07/2024

In this article, you will learn about the different types of ADAS sensors. We will cover the topic in enough depth for you to understand the main ADAS sensor types used today, learn about the sensing technologies behind ADAS, and see how ADAS sensors are used in modern vehicles and autonomous cars.

This is PART 2/4 in our ADAS article series:

  • Part 1: What is ADAS?
  • Part 2: The types of ADAS sensors used today (this article)
  • Part 3: How are ADAS systems and autonomous vehicles tested?
  • Part 4: ADAS standards and safety protocols

To get straight to the point, here is the list of the main ADAS sensor types used today:

  • Video cameras
  • SONAR, also called ultrasonic sensors
  • RADAR
  • LiDAR
  • GPS / GNSS sensors

We will take a closer look at each of these in the sections below.

Overview of ADAS sensors

Illustration of ADAS sensors in a modern autonomous vehicle

A vehicle needs sensors to replace or enhance the driver’s senses. Our eyes are the primary sensors we use while driving, but the stereo images they provide must be processed by the brain to infer relative distance and vectors in three-dimensional space.
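To make that geometry concrete: for a calibrated stereo pair, depth follows directly from how far a feature shifts between the two images. Here is a minimal Python sketch using the standard relation Z = f·B/d; the focal length, baseline, and disparity values are illustrative assumptions, not measurements from a real camera.

```python
# Minimal sketch of stereo depth: Z = f * B / d for a rectified pair.
# Focal length, baseline, and disparity below are illustrative values.

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth in meters from focal length (pixels), baseline (meters),
    and disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: feature is at infinity or unmatched")
    return focal_px * baseline_m / disparity_px

# A feature shifted 16 px between two cameras 12 cm apart (f = 800 px)
print(depth_from_disparity(800.0, 0.12, 16.0))  # -> 6.0 m away
```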

We also use our ears to detect sirens, honking from other vehicles, railroad crossing warnings, and so on. All of this sensory data is processed by the brain and integrated with our knowledge of driving rules so that we can operate the vehicle accurately and react to unexpected situations.

ADAS systems have to do the same. Vehicles are increasingly equipped with RADAR, SONAR, and LiDAR sensors, while also receiving absolute position data from GPS sensors and inertial data from IMU sensors. The processing computers that take in all of this information and generate outputs to assist the driver or take direct action are steadily increasing in power and speed in order to handle the complex tasks involved in driving.

The sheer amount of sensor data being processed by cars and commercial vehicles today is astonishing, and it is increasing all the time. Even the impressive figures we see today will be dwarfed by the requirements of fully autonomous vehicles in the future.

As ADAS systems become more advanced, the volume of data that must be collected, synchronized, and processed in real time continues to grow. That is why ADAS is not just about sensors, but also about data processing, software, recognition algorithms, and computing capability.

The coming flood of sensor data in autonomous vehicles, as envisioned by Intel

Video cameras, also called optical imaging sensors

The first use of cameras in automobiles was the backup camera, also known as the rear-view camera. Combined with a flat video display on the dashboard, this camera allows the driver to reverse more safely into a parking space or during any maneuver that involves driving backward.

The original motivation was pedestrian safety. According to the U.S. Department of Transportation, more than 200 people are killed and at least 12,000 others are injured every year because a vehicle backs into them. These victims are mainly children and elderly people with limited mobility.

Once found only on premium vehicles, backup cameras became mandatory on all vehicles sold in the United States as of May 2018. Canada implemented a similar requirement. The European Commission also moved toward requiring a reversing camera or equivalent monitoring system on cars, trucks, vans, and buses. Japan’s Ministry of Land, Infrastructure, Transport and Tourism likewise required backup sensors such as cameras, ultrasonic sensors, or both on all cars sold in Japan.

Backup camera image shown on the dashboard of a Toyota 4Runner

Today’s ADAS-equipped vehicles may contain multiple cameras, looking in different directions. And they are no longer only for backup safety: their output is used to build a three-dimensional model of the environment around the vehicle inside the ADAS computer system.

Cameras are used to recognize traffic signs, lane markings, and other roadway indicators, detect pedestrians and obstacles, and much more. They can also be used for security purposes, rain detection, and other convenience functions.

In screen captures from the Mobileye camera system, you can see how the system identifies and labels vehicles and pedestrians while also determining that the traffic light is green.

The output of most of these cameras is not shown to the driver. Instead, it feeds the ADAS computer system. These systems are programmed to process image streams and identify stop signs, understand that another car is signaling a right turn, or determine that a traffic light has just turned yellow.

They are also heavily used for detecting lane lines, which is critical for lane-keeping assistance. This represents an enormous amount of data and processing power, and it continues to grow as the industry moves toward self-driving vehicles.
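To give a feel for what that image processing involves, here is a minimal, hedged sketch of one classical lane-line detection pipeline using OpenCV (Canny edge detection plus a Hough transform). Production ADAS camera stacks use far more sophisticated, often machine-learned, methods, and the input file name below is a placeholder.

```python
# Minimal sketch of classical lane-line detection with OpenCV.
# Real ADAS stacks use far more sophisticated pipelines; "road.jpg"
# is a placeholder input, not a real asset.
import cv2
import numpy as np

frame = cv2.imread("road.jpg")                       # placeholder input image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress sensor noise
edges = cv2.Canny(blurred, 50, 150)                  # edge map

# Keep only the lower half of the image, where lane markings appear
mask = np.zeros_like(edges)
mask[edges.shape[0] // 2:, :] = 255
edges = cv2.bitwise_and(edges, mask)

# Fit line segments to the remaining edges
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=50,
                        minLineLength=40, maxLineGap=20)
for x1, y1, x2, y2 in (lines.reshape(-1, 4) if lines is not None else []):
    cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 3)
```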

Typical CCD architecture

The two main image sensor technologies used in automotive cameras today are CMOS and CCD. CCD sensors provide superior dynamic range and resolution, while CMOS sensors require less power and can be less expensive because of their silicon architecture.

Both technologies are built around a rectangular array of pixels that generate electrical current corresponding to the intensity of light focused on each pixel.
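As a toy illustration of that readout process, the sketch below models a pixel array as a grid of light intensities digitized by an analog-to-digital converter. The array size and ADC depth are illustrative assumptions; real automotive sensors add color filter arrays, HDR readout, and noise handling.

```python
# Toy model of an image sensor's pixel array: the light intensity
# focused on each pixel is converted to a digital value by an ADC.
# All values are illustrative, not from a real sensor.
import numpy as np

rng = np.random.default_rng(0)
photons = rng.uniform(0.0, 1.0, size=(4, 6))   # relative intensity per pixel
adc_bits = 12                                  # assumed ADC depth
digital = np.round(photons * (2**adc_bits - 1)).astype(np.uint16)
print(digital)                                 # the raw "image" the ISP receives
```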

Many vehicle cameras have also been optimized to see better in darkness than humans. Some systems modify the standard color sensor arrangement to improve low-light performance. Major automotive camera sensor manufacturers include Mobileye and OmniVision.

Mobileye has stated that if a human can drive a car using vision alone, then a computer can do the same. And unlike a human driver, cameras can look in all directions at once.

Watch this video produced by Mobileye, showing 40 minutes of fully autonomous driving through Jerusalem using their 12-camera system. A safety driver sat behind the wheel but never touched the controls until the test concluded.

Tesla chose to rely on passive optical sensors, meaning video cameras, informed by radar rather than depending primarily on LiDAR. Their position is that because LiDAR works by projecting photons in the visible or near-infrared spectrum, it is easily affected by rain, dust, snow, and other atmospheric obstructions. Of course, cameras are affected as well, but Tesla uses RADAR, which can “see through” many of those obstructions better and overlay object range onto the 3D world created by the cameras.

Vision is absolutely necessary for humans to drive, but there are other ways to “see.” Bats, whales, and dolphins navigate by echolocation. Submarines use SONAR. Aircraft use RADAR to reflect radio frequency signals and detect the ground, other aircraft, and more. Modern ADAS systems borrow from many of these principles.

Mobileye video of autonomous driving through Jerusalem

SONAR sensors

SONAR, meaning Sound Navigation and Ranging, also referred to as ultrasonic sensing, generates high-frequency sound at around 48 kHz, which is much higher than the normal human hearing range. When commanded by the vehicle ECU, these sensors emit an ultrasonic burst and then “listen” for echoes returning from nearby objects.

By measuring the reflected sound, these sensors can detect objects close to the vehicle. Ultrasonic sensors are widely used for reverse object detection and parking assist systems in cars, trucks, and buses. They are typically placed at the front, rear, and corners of the vehicle.
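The underlying arithmetic is simple: the sensor measures the round-trip time of the ultrasonic pulse, and the distance is half that time multiplied by the speed of sound. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of ultrasonic echo ranging: distance is half the
# round-trip time multiplied by the speed of sound.
SPEED_OF_SOUND_M_S = 343.0   # in air at ~20 °C

def echo_distance_m(round_trip_s):
    """The pulse travels to the object and back, so divide by two."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# A 10 ms round trip corresponds to an object ~1.7 m away
print(echo_distance_m(0.010))  # -> 1.715
```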

Because they work by emitting sound through the air and detecting its reflections, they are ideal for low-speed applications where the air around the vehicle is not moving too rapidly. However, because they are acoustic in nature, ultrasonic performance can degrade in extremely noisy environments.

Using sonar to sense objects behind the vehicle

Ultrasonic sensors are the circular “discs” on the rear of this vehicle

Ultrasonic sensors have limited range compared with RADAR, which is why they are not used for longer-range functions such as adaptive cruise control or high-speed driving. But for objects within roughly 2.5 to 4.5 meters of the sensor, ultrasonic sensing is a less expensive alternative to RADAR.

Ultrasonic sensors are not used for full navigation because their range is limited and they cannot reliably detect very small objects. Even so, for close-range tasks such as parking maneuvers, they remain an essential ADAS component.

Interestingly, Tesla developed a method of projecting ultrasonic waves through metal, allowing them to hide these sensors throughout the vehicle to preserve exterior styling.

RADAR sensors

RADAR, meaning Radio Detection and Ranging, is used in ADAS-equipped vehicles to detect large objects in front of the vehicle. These systems typically operate around 76.5 GHz, although other frequencies from 24 GHz to 79 GHz are also used.

In principle, RADAR works by transmitting radio waves and measuring the propagation time of the reflected signal. This allows the system to determine object size, distance, and relative speed.
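A minimal sketch of those two core measurements, using illustrative numbers: range follows from the round-trip time of the radio wave, and relative speed follows from the Doppler shift of the reflection.

```python
# Minimal sketch of the two core RADAR measurements. The carrier
# frequency matches the typical automotive value cited above; the
# inputs in the examples are illustrative.
C = 299_792_458.0          # speed of light, m/s
CARRIER_HZ = 76.5e9        # typical automotive RADAR carrier

def radar_range_m(round_trip_s):
    return C * round_trip_s / 2.0

def relative_speed_m_s(doppler_shift_hz):
    # v = f_d * c / (2 * f_carrier) for a monostatic radar
    return doppler_shift_hz * C / (2.0 * CARRIER_HZ)

print(radar_range_m(1.0e-6))        # ~150 m for a 1 microsecond round trip
print(relative_speed_m_s(5100.0))   # ~10 m/s closing speed
```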

Because RADAR signals can extend to roughly 300 meters in front of the vehicle, they are especially important for high-speed highway driving. The high frequencies also mean that other vehicles and obstacles can be detected very quickly. In addition, RADAR can “see” through bad weather and other visibility obstructions.

Applications for automotive RADAR sensors

LiDAR sensors

LiDAR, meaning Light Detection and Ranging, is used to detect objects and map their distances in real time. Essentially, LiDAR works on the same ranging principle as RADAR but uses one or more laser beams as its energy source. The lasers used are eye-safe.

High-end LiDAR sensors can rotate and emit eye-safe laser beams in many directions. LiDAR uses a “time of flight” receiver to measure the return time of the reflections. IMU and GPS are commonly integrated with LiDAR so the system can account for the vehicle’s motion while measuring how long the beams take to return, thereby building a high-resolution 3D model of the environment around the vehicle, commonly referred to as a “point cloud.”

Millions of points can be captured every second to form this 3D model, scanning the environment up to 300 meters around the vehicle with accuracy at the level of a few centimeters. Some LiDAR sensors contain as many as 128 lasers internally; the more lasers, the higher the resolution of the 3D point cloud.
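Each point in that cloud is produced the same way for every laser return: time of flight gives the range, and the beam’s azimuth and elevation angles place the point in 3D space. A minimal sketch with illustrative numbers:

```python
# Minimal sketch of turning one LiDAR return into a 3D point:
# range from round-trip time, then spherical-to-Cartesian conversion.
import math

C = 299_792_458.0  # speed of light, m/s

def lidar_point(tof_s, azimuth_rad, elevation_rad):
    r = C * tof_s / 2.0                      # range from round-trip time
    x = r * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r * math.sin(elevation_rad)
    return (x, y, z)

# A return after ~0.33 microseconds at 30 degrees azimuth, level with the sensor
print(lidar_point(0.33e-6, math.radians(30.0), 0.0))  # a point ~49.5 m away
```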

Typical solid-state LiDAR sensor used in autonomous vehicles

In addition to traditional mechanically scanning LiDAR, there are now solid-state LiDAR products on the market. Developers such as Velodyne and Quanergy have commercialized different LiDAR approaches, including CMOS-based designs and optical phased arrays that steer each laser pulse without rotating mirrors.

LiDAR can detect objects with much higher accuracy than RADAR or ultrasonic sensors. However, its performance can be degraded by smoke, fog, rain, and other atmospheric obstructions. Even so, because LiDAR operates independently of ambient light, it is not affected by darkness, direct sunlight, or oncoming headlights.

LiDAR sensors are often more expensive than RADAR because of their relative mechanical complexity. They are increasingly used alongside cameras because LiDAR cannot detect color, such as the color of traffic lights, brake lights, or road signs, nor can it read text as well as a camera can. Cameras can do those tasks, but they require much more downstream processing power.

Frequency ranges of the most common GNSS constellations in use today

GPS / GNSS sensors

In order to make self-driving vehicles a reality, we need a highly accurate positioning system. Vehicles today use Global Navigation Satellite Systems, or GNSS, a category that includes more than just GPS, the system most people know by name.

GPS, short for Global Positioning System, is a constellation of more than 30 satellites orbiting the planet. Each satellite continuously transmits highly accurate timing and position data. When a receiver obtains usable signals from at least four of those satellites, it can determine its own position. The more usable signals it receives, the more accurate the result becomes.
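Four satellites are the minimum because the receiver must solve for four unknowns: three position coordinates plus its own clock bias. The sketch below shows the idea with a simple Gauss-Newton solver on synthetic pseudoranges; the satellite positions and clock error are illustrative, not real ephemeris data.

```python
# Minimal sketch of GNSS positioning: solve for (x, y, z, clock bias)
# from four pseudoranges. Satellite positions and the clock error are
# synthetic, for illustration only.
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_pos, pseudoranges, iters=10):
    """Gauss-Newton solve for (x, y, z, clock_bias_m)."""
    x = np.zeros(4)                      # start at Earth's center, zero bias
    for _ in range(iters):
        d = np.linalg.norm(sat_pos - x[:3], axis=1)
        residual = pseudoranges - (d + x[3])
        J = np.hstack([-(sat_pos - x[:3]) / d[:, None], np.ones((len(d), 1))])
        x += np.linalg.lstsq(J, residual, rcond=None)[0]
    return x

# Four synthetic satellites roughly 20,000 km up, receiver near the surface
sats = np.array([[15600e3,  7540e3, 20140e3],
                 [18760e3,  2750e3, 18610e3],
                 [17610e3, 14630e3, 13480e3],
                 [19170e3,   610e3, 18390e3]])
truth = np.array([-40e3, -10e3, 6370e3])     # true receiver position (m)
bias_m = 3.0e-3 * C                          # 3 ms receiver clock error
rho = np.linalg.norm(sats - truth, axis=1) + bias_m
print(solve_position(sats, rho))             # recovers position and bias
```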

But GPS is not the only global positioning system. Several GNSS constellations are in orbit today:

  • GPS - USA
  • GLONASS - Russia
  • Galileo - Europe
  • BeiDou - China

The best GNSS systems installed in vehicles today can use signals from two or three of these constellations. Using multiple frequencies and constellations helps reduce errors caused by signal delay, atmospheric interference, and obstructions such as tall buildings or terrain.

The ACEINNA INS1000 GNSS sensor

Consumer-grade GNSS provides position accuracy of roughly one meter, which is sufficient for a typical navigation system in a human-driven vehicle. But for true autonomy, centimeter-level accuracy is needed. GNSS accuracy can be improved by using regional or local augmentation systems such as SBAS and GBAS.

Some SBAS systems in use today include WAAS operated by the FAA in the United States, EGNOS developed by the European Space Agency, MSAS in Japan, and commercial services such as StarFire, OmniStar, and Atlas. On the ground side, GBAS systems such as DGPS and NDGPS are also used to improve positioning accuracy.

A good example of how GNSS, IMU, and augmentation systems are integrated into ADAS sensors today is the ACEINNA INS1000 GNSS sensor. It is a dual-frequency L1/L2 RTK GNSS with an internal MEMS gyroscope and accelerometer IMU, compatible with GPS, GLONASS, BeiDou, and Galileo, while also supporting SBAS. With RTK correction, its stated position accuracy is 2 cm.

“Urban canyon” conditions, where tall buildings exceed street width, create multipath effects that reduce GNSS reception

A few examples of how GNSS and other ADAS sensors work together

When we drive into a parking structure or a covered tunnel, GNSS signals from the sky are completely blocked by the roof. The IMU can sense changes in acceleration on all axes and perform “dead reckoning” of the vehicle position until the satellites come back into view. Dead reckoning accuracy drifts over time, but it is very useful for short periods when GNSS is effectively blind.
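A minimal sketch of the idea, reduced to one dimension with illustrative numbers: velocity and position come from integrating acceleration, which is exactly why small IMU errors accumulate into drift over time.

```python
# Minimal 1D sketch of IMU dead reckoning: integrate acceleration twice
# to keep estimating position while GNSS is blocked. Real systems also
# use gyroscope data and correct for drift; all numbers are illustrative.
def dead_reckon(pos_m, vel_m_s, accel_samples_m_s2, dt_s):
    for a in accel_samples_m_s2:
        vel_m_s += a * dt_s          # integrate acceleration -> velocity
        pos_m += vel_m_s * dt_s      # integrate velocity -> position
    return pos_m, vel_m_s

# Last GNSS fix: 100 m into the tunnel, moving at 15 m/s, gentle braking
print(dead_reckon(100.0, 15.0, [-0.5] * 100, 0.01))  # ~1 s of blind driving
```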

In cities, buildings create what is known as an “urban canyon,” where GNSS signals bounce around and produce multipath interference. The IMU can help calculate through these conditions to provide critical position data, while other sensors such as cameras, LiDAR, RADAR, and SONAR continue sensing the world around the vehicle in all directions.

In real driving conditions, cameras, LiDAR, SONAR, and RADAR can provide centimeter-level relative positioning accuracy that basic GNSS cannot achieve without augmentation like RTK. At the same time, they can detect other vehicles, pedestrians, and obstacles, which GNSS was never designed to do.

Our passive stereo optical sensors

Thoughts on ADAS sensors

A human-driven vehicle works because we have stereo vision and can infer distance and relative velocity in our brains. Even with one eye closed, we can still infer distance and size fairly well because the brain is trained by real-world experience.

Our eyes and brain also allow us to read and react to signs, follow maps, or simply remember which road to take because we know the area. We know how to check mirrors quickly so that we can look in multiple directions without turning our heads around.

Our brain knows the driving rules. Our ears can hear sirens, horns, and other sounds, and our brain knows how to react to them in context. Even a short drive to buy milk and bread involves thousands of decisions and hundreds of mechanical adjustments, large and small, using the vehicle’s hand and foot controls.

Replacing optical and audio sensing connected to a brain made up of around 86 billion neurons is not easy. It requires an advanced sensor suite and processing system that is fast, precise, reliable, and accurate.

Each sensor used in an ADAS-equipped vehicle has strengths and weaknesses:

  • LiDAR is excellent for 3D perception and works well in darkness, but it cannot see color. LiDAR can detect very small objects, but its performance degrades in smoke, dust, rain, and other atmospheric conditions. It requires less downstream processing than cameras, but is usually more expensive.
  • Cameras can determine whether a traffic light is red, green, or amber. They are very good at “reading” signs and seeing lane markings and other road indicators. But they are less effective in darkness or when the atmosphere is dense with fog, rain, or snow. They also require more processing than LiDAR.
  • RADAR can see farther down the road than other ranging sensors, which is essential for high-speed driving. It works well in darkness and when the atmosphere is obscured by rain, dust, and fog. However, it cannot create models as precise as cameras or LiDAR, nor can it detect very small objects as well as some other sensors.
  • SONAR is excellent for close-range distance measurement, such as parking maneuvers, but not suitable for long-range measurement. It can be disturbed by wind noise, so it is not effective at high vehicle speeds.
  • GNSS, combined with frequently updated map databases, is essential for navigation. But raw GNSS accuracy of one meter or more is not sufficient for fully autonomous driving, and without line of sight to the sky it cannot navigate at all. For automated driving, it must be integrated with other sensors, including IMU, and enhanced with RTK, SBAS, or GBAS.

IMU systems provide the dead reckoning that GNSS systems need when the line of sight to the sky is blocked or corrupted by multipath signals in an “urban canyon.”
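One simple way to express that complementary relationship is a complementary filter that gently pulls the drifting IMU estimate toward each absolute GNSS fix. Production systems typically use Kalman filters; the gain below is an illustrative constant, not a tuned value, and the sketch is one-dimensional for clarity.

```python
# Minimal 1D sketch of blending a smooth-but-drifting IMU estimate with
# a noisy-but-absolute GNSS fix. The gain of 0.05 is illustrative only;
# real systems use Kalman filtering with modeled noise statistics.
def fuse(imu_estimate_m, gnss_fix_m=None, gain=0.05):
    if gnss_fix_m is None:          # urban canyon / tunnel: trust the IMU
        return imu_estimate_m
    # Pull the drifting IMU estimate gently toward the absolute GNSS fix
    return imu_estimate_m + gain * (gnss_fix_m - imu_estimate_m)

print(fuse(102.3, 100.0))   # -> 102.185, slowly correcting IMU drift
print(fuse(102.3, None))    # -> 102.3, GNSS blocked, IMU carries on
```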

These sensors complement each other and allow the central processor to create a three-dimensional model of the environment around the vehicle, so it can determine where to go and how to get there, follow driving rules, and respond to both expected and unexpected events on roads and in parking areas.

In short, we need all of them, or some combination of them, in order to achieve ADAS and ultimately autonomous driving.

Summary

When you were a child, did you ever imagine that your family car would one day be equipped with RADAR and SONAR like an aircraft or submarine? Did you imagine flat-panel displays dominating the dashboard and navigation systems connected to satellites in space? It sounds like science fiction, yet today all of that and more has become reality.

ADAS is one of the most important development directions taking place today. Of course, the parallel development of hybrid and electric vehicles is also critical in reducing greenhouse gases and fossil fuel use. But ADAS goes directly to the most important aspect of travel and mobility: human safety.

Because more than 90% of road accidents, injuries, and fatalities are caused by human error, every advance in ADAS has a clear and direct effect on preventing injury and death.