Sensors and edge AI: How they work together to provide autonomous driving

What do self-driving cars, smart robots, autonomous machines, and unpiloted drones have in common? They all operate using edge computing technology. Edge computers are purpose-built computers intended to operate close to a data source – that is, they aren't reliant on remote processing power.

While this concept may seem simple, many engineers regard edge computing as the newest frontier in computing. As the technology improves, it may revolutionize many industries. Of course, even the most powerful computers are useless without data, and edge computing is no exception.

Here, we'll explore the top sensors used in autonomous-vehicle edge computing applications. We'll also look at how companies continually use iterative machine learning to train their edge computing devices.

What Is Edge Computing?

The computing world spans an enormous range of processing capabilities and methodologies. As a general rule of thumb, the processing power of computers and data centers is measured by the number of floating-point operations per second (FLOPS) they can perform.

In 2022, the most powerful computer on the planet is the Fugaku supercomputer in Japan, running at 442 petaflops, or 442,000 TOPS (tera operations per second, i.e., trillions of operations per second). Generally, computers that specialize in machine learning, artificial intelligence, and parallel computing are benchmarked in TOPS. Over time, supercomputer speeds have climbed steadily toward the exaflop (1,000,000 TOPS) range.
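
To make these unit relationships concrete, here is a minimal sketch, assuming the article's convention that 1 TOPS corresponds to 10^12 operations per second (one teraflop-equivalent):

```python
# Rough unit conversions, assuming 1 TOPS ~ 10^12 operations per second
# and 1 petaflop = 1,000 teraflops (the convention used in this article).

TERA = 1e12
PETA = 1e15
EXA = 1e18

fugaku_petaflops = 442                              # Fugaku's benchmarked speed
fugaku_tops = fugaku_petaflops * PETA / TERA
print(f"Fugaku: {fugaku_tops:,.0f} TOPS")           # 442,000 TOPS

exaflop_in_tops = EXA / TERA
print(f"1 exaflop = {exaflop_in_tops:,.0f} TOPS")   # 1,000,000 TOPS
```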

As a real-world reference, an iPhone 13 runs on Apple's A15 Bionic chip, which can achieve 15.8 TOPS. Keep in mind that today's phones rival the world's fastest supercomputers of roughly two decades ago.

Data centers are famed for both their incredible processing power and constant use for machine learning and neural network training applications. However, utilizing their processing speed requires a connection to the data center, whether direct or virtual.

Edge computing technology is heavily utilized in applications where a connection to a data center isn't possible or desirable. For example, if a large amount of real-time video data is collected on a moving vehicle, it may only be practical to run AI image recognition locally rather than at a data center.

This is because wireless internet connection speeds cannot support the necessary real-time data transfer using a standard cloud-computing data model.
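
A back-of-the-envelope estimate illustrates the gap. The camera count, resolution, frame rate, and uplink speed below are illustrative assumptions, not any specific vehicle's specification:

```python
# Back-of-the-envelope estimate of raw camera bandwidth on a vehicle.
# All figures below are illustrative assumptions, not a real sensor spec.

cameras = 8                 # assumed number of cameras
width, height = 1920, 1080  # assumed resolution per camera
fps = 30                    # assumed frame rate
bytes_per_pixel = 3         # uncompressed 8-bit RGB

raw_bps = cameras * width * height * fps * bytes_per_pixel * 8
print(f"Raw video: {raw_bps / 1e9:.1f} Gbit/s")     # ~11.9 Gbit/s

# A typical real-world cellular uplink is on the order of tens of Mbit/s,
# orders of magnitude short of what streaming this data would require.
typical_uplink_bps = 50e6
print(f"~{raw_bps / typical_uplink_bps:.0f}x an assumed 50 Mbit/s uplink")
```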

The Edge Advantage

Let's consider the example of a cyclist: if correct AI identification is the difference between the car avoiding the cyclist and colliding with them, then every measure should be taken to ensure obstacle-avoidance algorithms can identify and react as quickly as possible.

While a data center may theoretically identify the cyclist more effectively than a computer deployed at the "edge," an edge computer has virtually zero communication latency between the self-driving car, its sensors, and its control systems.
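
A simple latency budget shows why this matters. The latency figures here are illustrative assumptions, not measured values for any real system:

```python
# How far a vehicle travels while waiting on a detection result.
# Latency figures are illustrative assumptions, not measured values.

speed_mph = 65
speed_mps = speed_mph * 1609.34 / 3600       # ~29 m/s

cloud_round_trip_s = 0.100   # assumed network round trip to a remote data center
edge_inference_s = 0.020     # assumed on-vehicle inference time

for label, latency in [("cloud", cloud_round_trip_s), ("edge", edge_inference_s)]:
    meters = speed_mps * latency
    print(f"{label}: vehicle travels {meters:.1f} m before a result arrives")
```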

So, the term "edge computing" applies to any computational device capable of processing data at the source - or edge - rather than at a data center, workstation, or another fixed-location computer.

Edge computing enables real-time data processing: complex algorithms act on the data as it is generated. As such, edge computing is desirable when real-time data interpretation is needed and when a connection to external processing servers isn't feasible or secure.

However, edge computers don't have the processing power commonly found in a data center, nor do they have the energy budget of one. Instead, modern edge computers sustain processing speeds ranging from under 1 TOPS to a few hundred TOPS.

NVIDIA, an Arrow.com partner and industry leader in edge computing, offers a range of edge computing solutions, from the Jetson Nano, which delivers 0.472 TOPS, to the DRIVE AGX Pegasus, which delivers up to 320 TOPS. Check out our article on how to set up the NVIDIA Jetson Nano to learn more.
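
As a minimal sketch of what on-device inference on a Jetson-class edge computer can look like, the snippet below runs a pretrained image classifier on a single camera frame. It assumes PyTorch and torchvision are installed; the model choice (ResNet-18) and the frame path are illustrative placeholders, not a production perception stack:

```python
# Minimal sketch of on-device image classification, as might run on a
# Jetson-class edge computer. Assumes PyTorch and torchvision are installed;
# ResNet-18 and "frame.jpg" are illustrative placeholders.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()
preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

image = Image.open("frame.jpg")                # a single camera frame (placeholder)
batch = preprocess(image).unsqueeze(0).to(device)

with torch.no_grad():                          # inference only, no training overhead
    logits = model(batch)
print("Predicted class index:", logits.argmax(dim=1).item())
```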

Self-Driving Car Sensors

Given its real-time data processing capabilities, edge computing has naturally established itself as a pillar of autonomous vehicle technology. However, this data isn't generated by the computer itself but rather by the multitude of sensors that serve as an autonomous vehicle's peripheral "eyes" and "ears."

Sensor topology can vary widely among autonomous vehicles, even within the same sector. Take, for example, the wide range of sensor suites among top competitors in the personal autonomous vehicle space, such as Tesla, Cruise, and Aurora Innovations, none of which have achieved Level 5 self-driving capability.

To learn more about the different levels of autonomous driving AI, check out The 5 Levels of Autonomous Vehicle Technology.

Meanwhile, autonomous delivery robots, such as Nuro's vehicles, and other driverless platforms, such as Waymo's test vehicles, have come much closer to achieving full self-driving capability.

Most self-driving sensors are fundamentally similar - they collect data about the world around them to help pilot the vehicle. For example, the third-generation Nuro vehicle contains cameras, radar, Lidar, and thermal cameras to provide a complete, multi-layered view of the vehicle's surroundings.

Currently, Tesla vehicles utilize eight cameras, 12 ultrasonic sensors, and a forward radar system, but they rely much more heavily on camera visuals than Nuro vehicles do. Google's Waymo Driver primarily relies on Lidar and uses cameras and radar sensors to help map the world around it.
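
Each company's perception stack is proprietary, but a simplified sketch shows the shape of the problem: readings from several sensor types are labeled and combined into a single view of the scene. The class names, fields, and fusion rule below are hypothetical, for illustration only:

```python
# Hypothetical sketch of combining multi-sensor detections into one object list.
# Class names, fields, and the fusion rule are illustrative only.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str          # "camera", "radar", "lidar", or "thermal"
    label: str           # e.g. "cyclist"
    distance_m: float    # estimated range to the object
    confidence: float    # 0.0 - 1.0

def fuse(detections: list[Detection], min_confidence: float = 0.5) -> list[Detection]:
    """Keep confident detections that at least two sensor types agree on."""
    kept = []
    for d in detections:
        if d.confidence < min_confidence:
            continue
        corroborated = any(
            other.sensor != d.sensor and other.label == d.label
            for other in detections
        )
        if corroborated:
            kept.append(d)
    return kept

frame = [
    Detection("camera", "cyclist", 22.0, 0.91),
    Detection("radar",  "cyclist", 21.5, 0.80),
    Detection("lidar",  "sign",    40.0, 0.30),
]
print(fuse(frame))   # only the cyclist, confirmed by two sensor types
```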

How Machine Learning Trains AI in Self-Driving Cars

The value of the sensor data collected by self-driving cars and other autonomous vehicles depends on the compute methodologies downstream of the sensors themselves. In many ways, the most valuable intellectual property of companies like Tesla, Waymo, Aurora Innovations, and Nuro is the software and data infrastructure built to process and act on the sensor data.

The winner of the race to full autonomy won't be the company with the best sensor hardware, but the one that can use its sensor data the most accurately, effectively, and safely.

Today, all autonomous vehicles on the road utilize edge computing AI programs, which are typically trained using machine learning in data centers. These machine learning models are only made possible by the incredible computing power of modern data centers, which are capable of hundreds of petaflops.

The computing requirements of training these vast machine learning models far exceed the capabilities of edge computers. For this reason, data centers are typically used to train the algorithms that are then deployed to the edge.

As an analogy, it takes years for someone to learn advanced algebra, but once the algebra is understood, that person can complete algebra problems quickly and accurately.

Training an AI algorithm is similar: it can take hundreds of compute hours in a high-powered data center. Yet once the algorithm is trained, it can run quickly and accurately on much less computing power.
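
The sketch below illustrates this "train big, deploy small" workflow under stated assumptions: a tiny placeholder model is trained on random stand-in data (the expensive, data-center-scale step in practice), then exported to the ONNX format so a lightweight runtime on an edge device can run inference without the training infrastructure:

```python
# Sketch of the "train big, deploy small" workflow. The tiny model, random
# data, and file name are placeholders for illustration only.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 4))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Training loop: in practice, this is the data-center-scale step.
for _ in range(100):
    x = torch.randn(32, 64)                  # placeholder sensor features
    y = torch.randint(0, 4, (32,))           # placeholder labels
    loss = loss_fn(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# Export the trained weights so a lightweight runtime on the edge device
# can run inference without the training infrastructure.
model.eval()
torch.onnx.export(model, torch.randn(1, 64), "perception_head.onnx")
print("Exported model for edge deployment")
```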

How Autonomous Vehicles Implement and Use AI

Autonomous vehicle technology is entirely dependent on the success of artificial intelligence, the accuracy of vehicle sensors, and powerful machine learning data centers.

While it may take hundreds or thousands of petaflops of computing power to train machine learning models for autonomous vehicle applications, the resulting algorithms can run at the edge on hardware capable of just a few hundred TOPS. With the computing power modern edge computers provide, there is little doubt that fully autonomous vehicles are on the horizon.

