Most research initiatives promise a glimpse beyond the curtain of what is known, but for imaging specialist Suren Jayasuriya, a glimpse isn’t nearly enough.
Jayasuriya is an associate professor of electrical and computer engineering in the School of Electrical, Computer and Energy Engineering, part of the Ira A. Fulton Schools of Engineering at Arizona State University, with a joint appointment in The GAME School. His research spans computer vision and visual processing, with a focus on the field of non-line-of-sight sensing.
From self-driving cars to disaster relief, detecting objects beyond a sensor’s immediate field of view has become increasingly important, driving Jayasuriya to collaborate with researchers around the world to advance these capabilities.
His work has earned him recognition as a 2026 National Institute of Information and Communications Technology, or NICT, Foreign Research Invitation Fellow, an honor awarded to only about eight scholars worldwide each year.
The fellowship will support Jayasuriya’s two-month research visit to Chiba University in Japan during the summer of 2026, where he will work with longtime research collaborator Associate Professor Hiroyuki Kubo. Their collaboration has already resulted in several awards and published papers. Together, the two researchers will explore non-line-of-sight sensing principles through computational imaging.
Jayasuriya says he is honored to receive the award and advance his research.
“It feels like science fiction — but advances in algorithms, machine learning, computer graphics and vision have enabled this new capability of non-line-of-sight imaging,” he says. “Working with researchers at Chiba University will allow my students and me the opportunity to develop new research avenues in this problem space.”
Enabling robots to “see” and “hear” around corners
Non-line-of-sight sensing is a form of computational imaging that infers information from scenes hidden from direct view, such as objects around corners or behind obstacles. Rather than capturing a straightforward image, these systems analyze subtle reflections of light or sound bouncing off surfaces in the environment to detect, track and even reconstruct what lies beyond.
While the concept has advanced significantly in recent years, challenges remain in improving reliability, range and real-world performance — areas that Jayasuriya’s research seeks to address.
He and Kubo will explore how combining optical and acoustic sensing can strengthen non-line-of-sight sensing systems. By integrating light-based imaging with sound-based detection, they aim to develop hybrid approaches that allow autonomous platforms to effectively “see” and “hear” around corners.
Such advances could transform how robots and autonomous systems perceive complex environments, improving collision avoidance and enabling more effective search-and-rescue and disaster response efforts in low-visibility conditions.
By pushing the boundaries of what sensors can detect, this research seeks to make intelligent systems safer, more adaptable and better equipped to operate in uncertain conditions.
Kubo says that much of their past collaboration has centered on information processing techniques specialized for optical systems, but this new project explores opportunities in new domains.
“Global collaborations are important because different regions face different challenges,” Kubo says. “By sharing those challenges and combining our knowledge and perspectives, we can create value that would be difficult to achieve within a single country or institution. These collaborations also accelerate innovation and help train the next generation of researchers in an international environment.”

A hands-on, cross-cultural research experience
During this fellowship, Jayasuriya will engage with researchers in Kubo’s lab to generate new ideas and hypotheses, design experiments, analyze data and prepare joint scientific publications and presentations.
Additionally, Jayasuriya will supervise and mentor undergraduate and graduate students in the lab as they carry out experiments in computational imaging, participate in research discussions and refine their scientific writing and presentation skills in English.
The exchange will extend beyond his own visit as Jayasuriya plans to sponsor Omkar Vengurlekar, an ASU computer engineering doctoral student from his lab, to spend four weeks at Chiba University working directly with the research team, further strengthening the partnership.
The goal of the collaboration is to publish and present scientific work at leading international venues in computational imaging and photography, machine learning, computer vision and graphics. After the visit, the research teams will continue their collaboration remotely to ensure long-term success.
Vengurlekar says he is looking forward to the opportunity.
“I’m excited about the chance to exchange ideas, learn from a new research environment and see how these different threads of imaging research can inform one another,” Vengurlekar says. “Non-line-of-sight technology pushes us to rethink what an imaging system can measure and reconstruct from indirect light transport, which makes it both practically valuable and scientifically exciting.”
Advancing global collaboration at the Fulton Schools
The NICT Foreign Research Invitation Fellowship not only recognizes Jayasuriya’s contributions but also supports ASU’s growing reputation in computational imaging and robotics research.
The Fulton Schools has increasingly emphasized interdisciplinary and international collaboration in areas where art, design, engineering and computing intersect. By combining complementary expertise and perspectives, the ASU and Chiba University research teams are positioned to explore new sensing paradigms that would be difficult to achieve in isolation.
“We are still at the nascent stages of development, but I’m really excited about the future of advanced imaging systems,” Jayasuriya says, “especially when these systems can leverage machine intelligence to reveal hidden information not easily captured by the human eye.”
As robotics and autonomous systems become more integrated into everyday life, the ability to detect objects beyond direct sight will be increasingly important. Through this research partnership, Jayasuriya and his collaborators are working to improve how machines sense and interpret complex environments.