A team at the University of South Florida (USF) is working on new technology that could change how robots communicate with humans, especially in emergency situations. The project is led by Zhao Han, an assistant professor in the Bellini College of Artificial Intelligence, Cybersecurity and Computing at USF. The initiative has received a $411,578 grant from the National Science Foundation.
Han’s research focuses on projector-based augmented reality as a way for robots to display information directly on real-world surfaces. Unlike traditional projectors, which work best on flat, blank screens, this approach allows robots to project images and instructions onto complex environments such as piles of rubble or cluttered rooms.
“Traditionally, projectors only work well on flat, blank surfaces, like a movie screen. But those conditions rarely exist,” Han said. “An advantage of this work is that we will identify the textures and then adjust the projected image accordingly so when people see it, they’ll actually see the original image even though it is modified to work with the texture.”
The system combines computer vision, augmented reality, and artificial intelligence to help robots understand their surroundings. It enables them to find suitable spots for projecting images and adjust colors so messages remain visible even against patterned backgrounds.
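The article does not describe the team’s actual algorithm, but the core idea of compensating a projected image for surface texture can be sketched with a simple per-pixel model: assume the perceived brightness is roughly the projector output times the surface reflectance plus ambient light, then invert that model. The function name, the linear model, and the numbers below are illustrative assumptions, not the USF system.

```python
import numpy as np

def compensate_projection(desired, surface_reflectance, ambient=0.05):
    """Naive per-pixel radiometric compensation (illustrative sketch).

    Assumed model: perceived = projector_output * reflectance + ambient.
    Inverting it gives the projector image whose perceived result
    approximates the desired image. All values are floats in [0, 1].
    """
    compensated = (desired - ambient) / np.clip(surface_reflectance, 1e-3, None)
    # A projector cannot emit negative light or exceed full brightness,
    # so the ideal compensation is clamped to the displayable range.
    return np.clip(compensated, 0.0, 1.0)

# Example: a uniform mid-gray message projected onto a surface whose
# left half is darker (lower reflectance) than its right half.
desired = np.full((4, 4), 0.5)
reflectance = np.concatenate(
    [np.full((4, 2), 0.4), np.full((4, 2), 0.9)], axis=1
)
out = compensate_projection(desired, reflectance)
# The darker left half demands a brighter projector output than the right.
```

In practice, a real system also needs a calibrated camera to estimate reflectance, color (not just grayscale) compensation, and a way to handle regions too dark to compensate within the projector’s brightness limits, which is presumably why the robot also searches for suitable projection spots.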
In search and rescue operations where audio communication may be difficult or wireless signals unreliable, robots equipped with this technology could use light projections to guide rescuers through hazardous areas. Han explained: “During a rescue, if you talk, people may not be able to hear you and communicate. Power lines may also be down, but if we use a robotic projector, we can avoid those challenges.”
This approach does not require people to wear special glasses or carry devices; instead, robots can highlight paths or instructions directly onto the environment.
“We have the technology, but we are training the robot to better communicate with people,” Han said. “Trust is a big topic in our field. When we talk about the robot failures, we usually talk about trust, because when a robot fails, people don’t trust it anymore.”
Inside USF’s Reality, Autonomy, and Robot Experience (RARE) Lab, Han’s team tests these systems in different settings: simulated disaster zones for search and rescue drills; lecture halls filled with chairs; and mock messy homes.
Han added: “This project helps bridge the gap between robots and people. If robots can communicate clearly in messy, real-world environments, they can become more effective partners in everything from disaster response to daily living.”
The three-year project not only advances human-robot interaction but also provides students with practical experience in robotics research.
Ngoc Bao Dinh is one such student involved in the lab’s work. Dinh, a sophomore majoring in computer engineering at USF who aims for a career in agricultural robotics, helped refine mid-air fog screen systems used for visual guidance at construction sites or during rescues.
“Engineering should benefit the people, so I want to develop solutions that can be applied anywhere in the world to improve efficiency,” Dinh said. “I also learned skills that are used in the industry that wouldn’t be taught in classrooms. I discovered that I love the research environment where innovation and exploration are encouraged.”
Han plans to develop the work further by testing the projection system with other types of robots, including robotic dogs and humanoid machines designed to navigate stairs or unstable terrain during disasters.



