NVIDIA grant paves path to smarter, safer autonomous drones

ASU researcher Dajiang Suo receives a grant of high-powered workstations from NVIDIA to help advance his work related to autonomous drones.

Imagine that someone is reported missing during a natural disaster, and that one of the resources available to first responders is a drone equipped to locate the missing person quickly and efficiently.

Time is not always on the side of the searchers, but this type of assistance could change the way we come to the aid of people in these situations for the better.

Dajiang Suo, an assistant professor of engineering in The Polytechnic School, part of the Ira A. Fulton Schools of Engineering at Arizona State University, is trying to make that change happen through his work with drones.

Suo recently received a grant from NVIDIA to advance his current research. Last fall, four RTX PRO 6000 Blackwell Max-Q Workstations were donated to his lab. This equipment will enable large-scale simulations, faster model training and improved artificial intelligence, or AI, inference.

“The new server will be used to render diverse urban scenarios in which unmanned aerial vehicles, or UAVs, operate within simulated environments,” Suo says. “More importantly, it will support the development and training of AI models capable of recognizing distinctive electromagnetic signatures from civil structures to enable robust localization.”

Suo leads the Autonomous Sensing and Resilient Connectivity Lab, or ARC Lab, located in the Simulator building on ASU’s Polytechnic campus. The lab includes an indoor testing platform that can handle simulations at this scale.

Suo’s project, Diffusion-Enhanced Scalable Radio Frequency Sensing in Safe Autonomy, applies generative AI to help drones navigate safely when Global Positioning System, or GPS, signals are unavailable and visibility is limited for vision-based navigation.

Suo working in his lab with his doctoral students. Photographer: Erika Gronek/ASU

When vision-based navigation falls short

In cities, dense high-rise buildings, narrow street corridors and surrounding infrastructure can interfere with navigation signals, sometimes blocking or distorting GPS to the point where it is no longer reliable. To navigate safely, UAVs often rely on recognizing landmarks on the ground, similar to how people use familiar buildings, street signs or intersections to find their way.

Just as a person might orient themselves by spotting a corner coffee shop or a distinctive mural when walking through a city, a UAV can use camera images to detect and recognize ground features to understand where it is and where it should go.

However, this vision-based localization is easily disrupted. Fog, rain, low light, glare or repetitive architectural patterns can make recognizing landmarks difficult or impossible for onboard cameras. In these scenarios, radio frequency, or RF, sensing offers a powerful alternative. Onboard UAV radars transmit and receive signals reflected from ground structures, enabling researchers to extract their unique electromagnetic signatures.

Unlike cameras, RF signals are largely unaffected by lighting or weather, yet they can still capture distinctive signatures of landmarks, allowing UAVs to localize themselves even when visibility is poor.
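To make the idea concrete, here is a minimal sketch, in Python, of how matching an observed radar return against a library of known landmark signatures might work. The signatures, function names and correlation-based matching are illustrative assumptions, not details of Suo's system.

```python
import numpy as np

# Hypothetical library of landmark RF signatures, keyed by landmark ID.
# Each entry is a 1-D radar return profile recorded (or simulated)
# for a known structure under reference conditions.
signature_library = {
    "water_tower": np.sin(np.linspace(0, 8 * np.pi, 256)),
    "steel_bridge": np.sign(np.sin(np.linspace(0, 4 * np.pi, 256))),
    "parking_garage": np.linspace(-1, 1, 256),
}

def match_landmark(observed, library):
    """Return the landmark whose stored signature best matches the
    observed return, scored by normalized cross-correlation."""
    best_id, best_score = None, -np.inf
    obs = (observed - observed.mean()) / (observed.std() + 1e-9)
    for landmark_id, sig in library.items():
        ref = (sig - sig.mean()) / (sig.std() + 1e-9)
        score = float(np.dot(obs, ref) / len(ref))  # correlation in [-1, 1]
        if score > best_score:
            best_id, best_score = landmark_id, score
    return best_id, best_score

# Simulate a noisy observation of the water tower's signature.
rng = np.random.default_rng(0)
observed = signature_library["water_tower"] + 0.3 * rng.standard_normal(256)
landmark, score = match_landmark(observed, signature_library)
print(f"Best match: {landmark} (correlation {score:.2f})")
```

In a real system, a trained AI model would replace the hand-built library and correlation score, but the matching intuition is the same: compare what the radar sees against what it expects a known structure to look like.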

In the real world, radar signals are far more complicated than they appear in theory. When a drone sends out a radar signal and receives echoes from the ground, those signals are shaped by many factors, such as reflections from nearby buildings, differences between materials like concrete and metal, uneven surfaces, gaps between structures and changing environmental conditions.

These interactions can dramatically alter how a single landmark appears to the drone’s sensors. Because of this complexity, collecting sufficient real-world training data by repeatedly flying drones is nearly impossible, since no team could realistically test every possible combination of environments, materials and conditions.

Training drones in simulations

To overcome this challenge, Suo’s team uses advanced computer simulations that allow a drone to “fly” thousands of missions in virtual neighborhoods, each time under slightly different conditions. This approach exposes the drone’s navigation system to a wide range of signal variations, helping it recognize locations more reliably when real-world signals are noisy, distorted or unlike anything it has seen before.
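The sketch below illustrates this kind of randomization on a single landmark’s signature: each virtual flight perturbs reflectivity, multipath echoes and noise to yield thousands of training variants. The perturbation model and its parameters are assumptions made for illustration, not the lab’s actual simulator.

```python
import numpy as np

rng = np.random.default_rng(42)

def randomized_signature(base_signature, rng):
    """Produce one randomized variant of a landmark's radar signature,
    mimicking the environment-dependent effects described above."""
    sig = base_signature.copy()
    # Material reflectivity: scale the overall return strength.
    sig *= rng.uniform(0.5, 1.5)
    # Multipath from nearby buildings: add a delayed, attenuated echo.
    delay = rng.integers(5, 40)
    echo = np.zeros_like(sig)
    echo[delay:] = 0.3 * sig[:-delay]
    sig += echo
    # Surface roughness and weather: additive noise.
    sig += rng.normal(0.0, 0.1, size=sig.shape)
    return sig

base = np.sin(np.linspace(0, 8 * np.pi, 256))  # stand-in landmark signature
# Thousands of virtual "flights", each under slightly different conditions.
training_set = np.stack([randomized_signature(base, rng) for _ in range(5000)])
print(training_set.shape)  # (5000, 256) examples to train a recognizer on
```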

Four RTX PRO 6000 Blackwell Max-Q Workstations that were donated as part of the NVIDIA grant awarded to Suo. Photo courtesy of Shaozu Ding

Tasks under consideration include delivering life-saving packages, supporting emergency response and monitoring transportation networks during extreme events. Suo notes that his approach is one way to strike a balance between cost and safety when building these technologies.

“One thing we can do is put the drone through operations testing and use generative AI to build all the synthetic data in order to create those scenarios,” Suo says. “The drone will experience all of the necessary scenarios in this simulation to be able to function in a more cost-effective way.”

By shifting much of this work into simulation environments developed in his lab, Suo’s approach offers benefits beyond technical performance, and he is looking forward to what it can eventually provide for the community.

“Beyond technical advantages, this approach has an important community impact,” Suo says. “Large-scale UAV flight testing can be disruptive and intrusive to neighborhoods. By moving these iterations into a synthetic environment powered by NVIDIA’s computing support, the project minimizes noise, safety risks and community disturbance, while still preparing UAVs to operate reliably in complex urban settings.”


Joy Gaeraths

Joy Gaeraths joined the Ira A. Fulton Schools of Engineering marketing and communications team in February 2024 as the embedded communications specialist for The Polytechnic School.
