New AI app helps visually impaired users find everyday objects with greater speed

Source: Interesting Engineering
Author: Aamir Khollam
Published: 11/25/2025
Penn State researchers have developed NaviSense, a smartphone-based AI navigation app designed to help visually impaired users locate everyday objects more quickly and accurately. Unlike existing assistive tools that rely on preloaded object libraries or human support, NaviSense connects to external large language models (LLMs) and vision-language models (VLMs) to identify objects in real time from voice commands. This approach eliminates the need for static object databases, providing greater flexibility and responsiveness. The app also offers conversational feedback by asking clarifying questions, and features hand guidance, which directs users' hands toward objects using audio and haptic cues—a capability identified as a critical user need through extensive interviews with visually impaired individuals.
In controlled tests with 12 participants, NaviSense outperformed two commercial alternatives, reducing search times and improving object detection accuracy, and users reported a more satisfying experience. The system listens for spoken requests, filters out irrelevant items, and guides users precisely to their targets, addressing longstanding limitations in assistive navigation technology.
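The pipeline described above (spoken request, model-based identification, filtering, and directional guidance) can be sketched in simplified form. All function names, data structures, and thresholds below are hypothetical illustrations, not NaviSense's actual implementation; the LLM/VLM calls are replaced by local stand-ins to keep the sketch self-contained.

```python
# Hypothetical sketch of a NaviSense-style flow: parse a spoken request,
# locate the target among detected objects, and emit a direction cue.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    label: str
    x: float  # normalized horizontal position in the camera frame (0..1)
    y: float  # normalized vertical position in the camera frame (0..1)

def parse_request(utterance: str) -> str:
    """Stand-in for the LLM step: extract the requested object from speech.
    A real system would query an LLM; here we just take the last word."""
    return utterance.lower().rstrip("?.!").split()[-1]

def locate(target: str, detections: list[Detection]) -> Optional[Detection]:
    """Stand-in for the VLM step: filter detections down to the target,
    discarding irrelevant items in the frame."""
    for d in detections:
        if d.label == target:
            return d
    return None

def guidance_cue(d: Detection) -> str:
    """Map the target's frame position to a direction cue that would drive
    audio or haptic feedback (thresholds are illustrative)."""
    horiz = "left" if d.x < 0.4 else "right" if d.x > 0.6 else "center"
    vert = "up" if d.y < 0.4 else "down" if d.y > 0.6 else "level"
    return f"move {horiz}, {vert}"

# Example: the user asks for a mug; the frame contains two detected objects.
frame = [Detection("keys", 0.2, 0.5), Detection("mug", 0.8, 0.3)]
target = parse_request("Where is my mug?")
hit = locate(target, frame)
if hit:
    print(guidance_cue(hit))  # → move right, up
```

In the real app, the cue string would feed text-to-speech and haptic output rather than `print`, and detection would run continuously on camera frames.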
Tags
IoT, assistive-technology, AI, navigation-system, vision-language-models, smartphone-app, accessibility