LiDAR: The Eyes of the Self-Driving Car

Arguably the most crucial technology behind autonomous vehicles (AVs) is LiDAR, which measures distance by illuminating a target with laser light and works together with GPS, radar, and other sensors to navigate the road.

On May 16, 1960, at Hughes Research Laboratories (now HRL Laboratories), Theodore “Ted” Maiman operated the first functioning laser. The following year, Malcolm Stitch, another researcher for Hughes Aircraft, introduced the first light detection and ranging (LiDAR) system, originally called “coherent light detection and ranging” (COLiDAR), for satellite tracking, combining laser-focused imaging and the ability to calculate distances.

 

LiDAR is a sensing method that uses light in the form of a pulsed laser to measure variable distances to objects. Calculations based on the pulses' return times and wavelengths can then be used to build digital 3D representations of the objects in the sensor's field of view. LiDAR has been used in many applications, including mapping Earth from space.
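
To illustrate the underlying principle (a generic time-of-flight calculation, not any particular sensor's implementation), the distance to an object follows directly from how long a pulse takes to bounce back:

```python
# Minimal sketch of LiDAR time-of-flight ranging (illustrative only,
# not tied to any specific sensor's firmware or API).

SPEED_OF_LIGHT_M_S = 299_792_458  # meters per second

def distance_from_return_time(round_trip_time_s: float) -> float:
    """Estimate target distance from a pulse's round-trip time.

    The pulse travels out and back, so the one-way distance is
    half of (speed of light x elapsed time).
    """
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0

# Example: a return after 0.5 microseconds corresponds to roughly 75 m,
# about the forward range quoted later for Sandstorm's main LiDAR.
print(round(distance_from_return_time(0.5e-6), 1))  # ~74.9 meters
```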

 

LiDAR is fast and can detect physical objects, allowing advanced driver-assistance systems (ADAS) to avoid collisions. It struggles, however, in conditions such as fog, heavy rain, or wind-blown dust, all of which can affect its performance and, in turn, safety.

 

Use of LiDAR in Early AVs

Many of the teams participating in the DARPA Grand Challenge races used some form of LiDAR to navigate the terrain.

 

Carnegie Mellon’s Red Team, led by William “Red” Whittaker and Chris Urmson, equipped “Sandstorm,” their heavily modified 1986 M998 “Humvee,” with several types of terrestrial LiDAR.

 

The team wanted to give the Humvee sensory tools similar to those humans rely on when driving. The main LiDAR had a range of 75 meters (82 yards) ahead, and three additional LiDAR sensors gave Sandstorm a wider field of view within 20 meters (22 yards).

 

Because those limited LiDAR systems struggled in dusty environments, such as when following other vehicles on a desert road, the CMU team also equipped Sandstorm with radar and stereo camera systems to detect obstacles.

 

Similarly, the Stanford University Racing team, led by Sebastian Thrun, equipped their vehicle, “Stanley,” with five LiDAR units, a color camera, and two radar sensors to identify large objects at long distances.

 

Velodyne and Early ADAS Systems

Founded in 1983 as an audio company, Silicon Valley-based Velodyne is regarded as a LiDAR pioneer for developing the first LiDAR sensors for industrial systems.

 

In 2004, Velodyne’s founders, brothers David and Bruce Hall, took part in the first DARPA Grand Challenge race. They discovered that the existing LiDAR technology, which scanned only a single, fixed line of sight, was not suitable for navigating a 3D environment, so Velodyne developed new sensors for the 2007 race.

 

During the third DARPA race, the 2007 Urban Challenge, Velodyne sold its LiDAR to at least seven of the competitors, including two each to the Carnegie Mellon and Stanford teams. The Velodyne system rotated 64 lasers, producing 1 million data points per second to create a 360° 3D map of the environment. Earlier systems produced 5,000 data points per second.
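
As a rough sketch of the geometry behind such a spinning multi-laser unit (illustrative angles and ranges, not Velodyne's actual packet format or processing code), each return can be converted from azimuth, elevation, and range into a Cartesian point of the 3D map:

```python
# Illustrative conversion of rotating-LiDAR returns into 3D points.
# The angles and ranges here are made up; a real HDL-64E-class unit
# streams data in its own format, which this sketch ignores.
import math

def polar_to_cartesian(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one laser return (range, azimuth, elevation) to x, y, z."""
    az = math.radians(azimuth_deg)   # rotation angle around the vertical axis
    el = math.radians(elevation_deg) # fixed tilt of the individual laser
    x = range_m * math.cos(el) * math.cos(az)
    y = range_m * math.cos(el) * math.sin(az)
    z = range_m * math.sin(el)
    return x, y, z

# One full rotation of a hypothetical 64-laser head, sampled every degree:
point_cloud = [
    polar_to_cartesian(range_m=20.0, azimuth_deg=az, elevation_deg=el)
    for az in range(360)
    for el in range(-24, 40)  # 64 evenly spread elevation angles (illustrative)
]
print(len(point_cloud))  # 360 azimuth steps x 64 lasers = 23,040 points per rotation
```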

 

After Google cofounder Larry Page started Chauffeur, the company’s autonomous vehicle project, Velodyne became the sole provider of LiDAR sensors for its AVs. In 2010, Chauffeur began testing self-driving cars on the streets of the San Francisco Bay Area using Velodyne’s LiDAR technology.

 

In 2016, Ford Motor Company and Chinese search engine Baidu committed to investing a combined $150 million in Velodyne. At the time, Velodyne was working with 25 self-driving car programs.

 

Velodyne’s main ADAS product is the HDL-64E, designed for obstacle detection and navigation of autonomous ground vehicles and marine vessels.

 

At CES 2020, Velodyne introduced the Velabit, a $100 LiDAR sensor designed to operate in many types of devices, including terrestrial vehicles and drones. According to the company’s website, “The sensor delivers the same technology and performance found on Velodyne’s full suite of state-of-the-art sensors and will be the catalyst for creating endless possibilities for new applications in a variety of industries.”

 

Other Vendors of LiDAR Technology

While Velodyne is a pioneer and a leading LiDAR vendor for ADAS and AVs, other companies are now making a dent in the market:

 

TriLumina: The company is best known for designing and manufacturing low-cost LiDAR sensors, mainly for automotive applications. It delivers inexpensive LiDAR illumination modules that are much smaller than alternative technologies; some are smaller than a dime. According to TriLumina’s website, “TriLumina semiconductor laser solutions enable low-cost LiDAR for automotive, ADAS, and other applications. Its new chips will advance time-of-flight (ToF) capabilities while reducing power requirements and size, and its fast pulse technology provides dramatic improvement in performance and form factor for LiDAR.”

 

Innoviz Technologies: An Israel-based company, Innoviz is earning a place among the industry leaders with its LiDAR solution. Innoviz also offers other technologies for ADAS, including a proprietary system, detector, and MEMS design that gives driverless vehicles stronger sensing capabilities, particularly in challenging conditions such as varying weather, bright direct sunlight, and multi-LiDAR environments.

 

Not All ADAS Systems Depend on LiDAR

While many of the largest automakers are developing new cars equipped with LiDAR for their high-end ADAS features, Tesla has ruled out the technology for its cars’ self-driving capabilities.

 

Tesla’s CEO, Elon Musk, has mocked LiDAR for years. “They’re all going to dump LiDAR,” Musk said in 2019 at an event showcasing Tesla’s self-driving technology. “Anyone relying on LiDAR is doomed.”

 

“LiDAR is really a shortcut,” added Tesla AI guru Andrej Karpathy. “It sidesteps the fundamental problems of visual recognition that is necessary for autonomy. It gives a false sense of progress and is ultimately a crutch.”

 

Instead of LiDAR, Tesla uses fast GPU-based computers to analyze the images captured by the car’s cameras, running them through algorithms that assign a distance estimate to each pixel. Much as the human brain judges depth with two eyes, such a system can estimate distance from a pair of cameras using the parallax effect.
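
The parallax idea can be illustrated with the standard stereo-vision relation between disparity and depth (a textbook formula with made-up numbers, not Tesla's actual pipeline):

```python
# Minimal sketch of depth from stereo disparity (illustrative values,
# not any real vehicle's camera calibration).

def depth_from_disparity(focal_length_px: float,
                         baseline_m: float,
                         disparity_px: float) -> float:
    """Classic pinhole-stereo relation: depth = f * B / d.

    focal_length_px: camera focal length in pixels
    baseline_m:      horizontal distance between the two cameras
    disparity_px:    horizontal shift of the same point between images
    """
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px

# Example: a 1000-pixel focal length, 30 cm baseline, and a 10-pixel
# disparity put the object roughly 30 m away; nearer objects shift more.
print(depth_from_disparity(1000.0, 0.30, 10.0))  # 30.0 (meters)
```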

 

Researchers at Cornell University argue that LiDAR could be optional for fully autonomous vehicles. Using stereo high-definition cameras mounted around the car, they have been able to produce an accurate 3D image of the surroundings.

 

The Cornell researchers converted the pixels from each stereo image pair into the kind of 3D point cloud that LiDAR sensors generate natively. They then fed this “pseudo-LiDAR” data into existing object-recognition algorithms that take a LiDAR point cloud as input.
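
A rough sketch of that conversion, assuming a standard pinhole camera model with placeholder intrinsics (this is not the Cornell team's code, just the general back-projection step):

```python
# Back-projecting a per-pixel depth map into a "pseudo-LiDAR" point cloud.
# The intrinsics (fx, fy, cx, cy) and depth values are illustrative only.
import numpy as np

def depth_map_to_point_cloud(depth_m: np.ndarray,
                             fx: float, fy: float,
                             cx: float, cy: float) -> np.ndarray:
    """Turn an HxW depth map (meters) into an Nx3 array of 3D points."""
    h, w = depth_m.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (us - cx) * z / fx   # pinhole back-projection along image x
    y = (vs - cy) * z / fy   # pinhole back-projection along image y
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# Tiny example: a 4x4 depth map of points all 10 m away.
cloud = depth_map_to_point_cloud(np.full((4, 4), 10.0),
                                 fx=1000.0, fy=1000.0, cx=2.0, cy=2.0)
print(cloud.shape)  # (16, 3): one 3D point per pixel, as a detector would consume
```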

 

“The self-driving car industry has been reluctant to move away from LiDAR, even with the high costs, given its excellent range accuracy — which is essential for safety around the car,” said Mark Campbell, director of the Sibley School of Mechanical and Aerospace Engineering and a co-author of the paper. “The dramatic improvement of range detection and accuracy, with the bird’s-eye representation of camera data, has the potential to revolutionize the industry.”

 

Still, Kilian Weinberger, a co-author of the paper, acknowledges that “there’s still a fair margin between LiDAR and non-LiDAR.” The Cornell team achieved 66% accuracy on one version of the KITTI benchmark, a real-world computer-vision benchmark developed by the Karlsruhe Institute of Technology and the Toyota Technological Institute at Chicago. Running the same algorithm on actual LiDAR point cloud data produced an accuracy of 86%.

 

Apart from cost, other reasons that Tesla and others have avoided LiDAR are power consumption, weight, and the impact on the design of the car itself. The HDL-64E, for example, weighs 12.7 kg (28 lbs.) without cabling and draws about 60 W of power, which could reduce the range of a pure electric vehicle.

 

Conclusion

New, smaller, low-cost LiDAR systems such as Velodyne’s Velarray and Velabit give car manufacturers a range of options for enabling ADAS features in their vehicles. As the technology advances, a combination of LiDAR and other sensors, paired with faster computing power, could bring increasing levels of self-driving capability to most cars of the future.
