Sensor technologies in the modern vehicle

The modern vehicle has many eyes on the road — especially if it’s autonomous. Sensors are a key technology for the future of the car and come in many forms to not only guide the vehicle on its path but also enable features inside an increasingly digitized cockpit.

When we talk about automotive sensors, however, the term is a catch-all for several different kinds of sensors: cameras, radar, and LiDAR. Some sensors in cars today are rudimentary, while others are more advanced. As autonomous vehicles advance and cars in general get smarter, the functions they support are getting more complex, and the supporting technologies, including memory, storage, computing power, and connectivity, must form a computing platform on wheels that can keep pace.

Different Sensor Technologies Have Different Strengths

Among the many sensors in a modern vehicle are cameras, radar, and numerous types of LiDAR, depending on the function. Early automotive sensor applications supported basic advanced driver assistance systems (ADAS), such as rear-view cameras that help drivers back up more safely. The more autonomous the vehicle, the more sensors it requires. The sensors that collect information about the vehicle’s surrounding environment, as well as the inside of the cabin, fall into the following broad categories:

  • Cameras: The sensor closest to human vision, a camera collects images to be analyzed by computer algorithms. An automotive camera gathers information about the environment surrounding the car, including other vehicles, pedestrians, cyclists, road signs, signals, and curb trajectories. Algorithmic processing enables the object detection necessary for features such as lane-departure warning and forward-collision warning.
  • Radar: Millimeter-wave (mmWave) radar can “see” in conditions that challenge a camera, making it advantageous for certain automotive applications. It offers high resolution, strong performance, and good directivity, while being less prone to interference and less affected by weather. However, higher costs have largely limited mmWave radar to ADAS functions, and it is less effective at identifying non-metallic objects.
  • LiDAR: A light-based sensor technology, LiDAR sends out rapid pulses of laser light that bounce back from obstacles, including other vehicles, cyclists, pedestrians, or a mailbox at the end of a driveway. A LiDAR-based instrument measures the time it takes for a pulse to bounce back so that the distance between the vehicle and the obstacle can be calculated accurately (see the sketch after this list). Mechanical LiDAR steers the laser beam using moving mechanical parts. While it is more accurate, offers a 360-degree field of view, and reaches longer distances, mechanical LiDAR is expensive to manufacture. Solid-state LiDAR is more affordable, relying mainly on electronic components to control the laser-emission angle, but its scanning angles are more limited and it is less accurate.
  • 3D Time of Flight (ToF) LiDAR: Another type of LiDAR that can address a growing number of short-distance automotive use cases is 3D ToF LiDAR, which is scanner-less and can achieve a higher level of detail. It is an increasingly popular type of LiDAR for many devices, including smartphones, where it enables a camera sensor to measure distance and volume. It uses high-power optical pulses lasting only nanoseconds to capture depth information from a scene of interest, typically over short distances. In an automotive environment, 3D ToF LiDAR can scan and track objects, support gesture recognition and reactive altimeters, and build a 360-degree view outside the vehicle to assist with parking.
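
The distance calculation behind both LiDAR and ToF ranging follows directly from the round-trip time of a pulse. The sketch below is a minimal illustration of that formula; the constant and function names are our own, not taken from any sensor’s API.

```python
# Minimal sketch of the time-of-flight calculation described above: the
# round-trip time of a reflected pulse gives distance = (speed of light * time) / 2.

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_time_s: float) -> float:
    """Distance to the obstacle, in meters, from a pulse's round-trip time in seconds."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2

# Example: a pulse returning after 100 nanoseconds implies an obstacle ~15 m away.
print(distance_from_round_trip(100e-9))  # ~14.99 m
```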

LiDAR has several advantages as an autonomous vehicle sensor. It has superior range, angular, and speed resolution, while also being much less prone to interference. LiDAR can also acquire a vast amount of information, including the distance, angle, speed, and reflection intensity of an object, to generate a multi-dimensional image of it. Required range determines whether a short-, medium-, or long-distance LiDAR architecture is best, and that applies to the numerous functions inside the car as well as the external sensing that enables autonomous driving.
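
As a rough illustration of how a single LiDAR return contributes to that multi-dimensional image, the sketch below converts one measurement of range, azimuth, and elevation into a 3D point; the function name and values are illustrative, not drawn from any particular sensor.

```python
import math

# Illustrative sketch: convert a single LiDAR return (range, azimuth, elevation)
# into a Cartesian point. A full scan repeats this for every return to build
# the point cloud from which objects are detected; reflection intensity can be
# attached to each point as an extra attribute.

def lidar_return_to_point(range_m: float, azimuth_deg: float, elevation_deg: float):
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)  # forward
    y = range_m * math.cos(el) * math.sin(az)  # left
    z = range_m * math.sin(el)                 # up
    return x, y, z

# A return 20 m away, 15 degrees to the left of center and 2 degrees above the sensor.
print(lidar_return_to_point(20.0, 15.0, 2.0))
```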

Sensors Inside and Out

No matter the level of autonomy, a greater ability to “see” what’s coming is a primary function of automotive sensor technology. But as the cockpit of the typical car becomes increasingly digitized, cameras, radar, and LiDAR all have a role to play inside the vehicle, too.

Short-distance LiDAR can monitor the state of the driver and passengers to support more advanced features. These include adjusting airbag deployment force, optimizing a heads-up display by detecting head position, and recognizing specific drivers and passengers through facial recognition so that pre-defined preferences can be applied. LiDAR can also enable touchless controls through gesture recognition.

There are many basic functions in the automobile that could be done with gesture technology instead of by touch or voice, including environmental controls such as heat and air conditioning, music selection and volume, GPS navigation, and handling voice calls. Depending on how advanced the vehicle cockpit is, a driver could use a gesture to transfer applications from the main display to the instrument cluster and back again.

Gesture recognition technology works by treating human movement as the input method: specific movements correspond to commands, and that input is detected by sensors and cameras that monitor occupants’ motions. Aside from allowing a driver to control music and audio, handle incoming calls, or navigate telematics systems, gesture recognition technology can detect when a driver is nodding off or is in distress due to a sudden health issue, enabling a semi-autonomous vehicle to pull over safely and call for help.

No matter the function, gesture recognition technology relies on sensors, algorithms, and artificial intelligence (AI) to detect specific gestures and act accordingly based on how the system was trained. Any sensor or camera that is used requires an unobstructed view of a 3D area within the cockpit. Computer vision and machine learning technologies, powered by algorithms and AI, analyze the gestures and translate them into commands in real time, based on a library of hand motions already on file.
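
To make that matching step concrete, here is a minimal, hypothetical sketch that classifies a captured hand-motion trajectory against a small library of stored templates using nearest-neighbor distance. Production in-cabin systems typically rely on trained machine-learning models operating on camera or ToF data; the gesture names, trajectory format, and threshold below are assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch: match a captured hand-motion trajectory against a small
# library of stored gesture templates using nearest-neighbor distance.
# The gesture names, trajectory format, and threshold are illustrative only.

GESTURE_LIBRARY = {
    "swipe_right": np.array([[0.0, 0.0], [0.5, 0.0], [1.0, 0.0]]),
    "swipe_up": np.array([[0.0, 0.0], [0.0, 0.5], [0.0, 1.0]]),
}

def classify_gesture(trajectory: np.ndarray, max_distance: float = 0.3):
    """Return the name of the closest template, or None if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, template in GESTURE_LIBRARY.items():
        dist = np.linalg.norm(trajectory - template) / len(template)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= max_distance else None

# A noisy rightward swipe should still match the "swipe_right" template.
observed = np.array([[0.02, 0.01], [0.48, -0.03], [0.97, 0.02]])
print(classify_gesture(observed))  # swipe_right
```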

LiDAR is not the only sensor technology that can enable in-cabin gesture recognition. Radar sensors using mmWave technology can provide much-improved accuracy for in-cabin applications, and they have an advantage over cameras because their fine motion-sensing capabilities can detect and distinguish multiple people inside a car, and their signals can even pass through materials such as plastic, drywall, and clothing. This allows them to be less intrusive: sensors can be hidden behind a fascia or placed inside or under other materials in the vehicle without compromising its aesthetics, while also maintaining passenger privacy for in-cabin monitoring systems.

For in-cabin applications, this means mmWave radar can detect a person’s presence even under challenging environmental conditions such as bright light or darkness. mmWave radar sensors can discern the difference between a child and an adult sitting in a seat, so airbag deployment can be adjusted accordingly. mmWave can also detect someone who shouldn’t be in or around the vehicle, such as a potential intruder outside the car. A more advanced feature of in-cabin radar sensing is monitoring the heart and breathing rates of both the driver and passengers while the car is moving.
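
As a hedged illustration of how radar-based vital-sign monitoring can work in principle, the sketch below estimates a breathing rate by finding the dominant low-frequency component in a chest-displacement signal. The synthetic signal, sample rate, and frequency band are assumptions, not a description of any specific radar product.

```python
import numpy as np

# Hypothetical sketch: estimate a breathing rate from a radar-derived
# chest-displacement signal by finding the dominant low-frequency component.
# The synthetic signal and sample rate below stand in for real radar data.

SAMPLE_RATE_HZ = 20.0
t = np.arange(0, 32, 1 / SAMPLE_RATE_HZ)                    # 32 seconds of samples
breathing = 0.5 * np.sin(2 * np.pi * 0.25 * t)              # 0.25 Hz = 15 breaths/min
displacement = breathing + 0.05 * np.random.randn(t.size)   # add measurement noise

spectrum = np.abs(np.fft.rfft(displacement - displacement.mean()))
freqs = np.fft.rfftfreq(displacement.size, d=1 / SAMPLE_RATE_HZ)

band = (freqs >= 0.1) & (freqs <= 0.7)                      # plausible breathing band
breathing_hz = freqs[band][np.argmax(spectrum[band])]
print(f"Estimated breathing rate: {breathing_hz * 60:.1f} breaths/min")  # ~15
```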

All sensing technologies used for automotive applications are advancing and evolving: they’re getting smaller, more powerful, and less intrusive. Cameras, radar, and LiDAR, combined with faster computing power, are critical for enabling smarter in-cabin functions and full autonomy. But all this advancement and evolution requires that other electronic components keep pace.

More Sensors Require More Computing Horsepower

Today’s automobile can contain more than 200 sensors depending on its level of autonomy, especially as sensor technologies such as LiDAR become less expensive, smaller, and more power efficient. Sensors are also becoming more data efficient, and vehicles are increasingly able to take the data generated by multiple sensor types and ingest it into a computing platform on wheels.

The process of combining data from multiple sensors is known as “sensor fusion,” and it enables greater accuracy. For autonomous vehicles, this fusion means safer decisions can be made even if individual sensors aren’t completely reliable on their own. The greater the number of sensors on the car, the better it can respond in real time to what it encounters on the road, including other cars, people, and animals.
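
One simple way to see why fusion improves on any single sensor is an inverse-variance weighted average, sketched below for distance estimates from a camera, radar, and LiDAR. The noise figures are illustrative, not real sensor specifications.

```python
# Hypothetical sketch of sensor fusion: combine independent distance estimates
# with an inverse-variance weighted average, so noisier sensors contribute less.

def fuse_estimates(estimates: dict[str, tuple[float, float]]) -> float:
    """estimates maps sensor name -> (distance_m, std_dev_m); returns fused distance."""
    weights = {name: 1.0 / (sigma ** 2) for name, (_, sigma) in estimates.items()}
    total = sum(weights.values())
    return sum(weights[name] * dist for name, (dist, _) in estimates.items()) / total

readings = {
    "camera": (24.0, 2.0),   # cameras estimate range less precisely
    "radar":  (22.5, 0.5),
    "lidar":  (22.8, 0.1),   # LiDAR gives the tightest range estimate
}
print(round(fuse_estimates(readings), 2))  # close to the LiDAR reading, ~22.79
```

Real systems fuse far richer data, such as object lists and point clouds, but the principle of weighting each source by its reliability carries over.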

However, all this data, along with information from surrounding edge computing infrastructure that complements what the car can gather on its own, means automotive computing capabilities must keep up. Every car is going to need ultra-fast memory, multi-core processors, graphics engines, larger-capacity and higher-speed NVMe-based SSDs, and ultra-fast connectivity to ingest, store, and process the increasingly vast amounts of data generated by multiple sensor types for virtually all autonomous driving and in-cabin functions.

