In a groundbreaking development for assistive technology, researchers from the National University of Singapore’s School of Computing (NUS Computing) have introduced AiSee, a wearable device designed to assist visually impaired individuals. Powered by artificial intelligence (AI), AiSee aims to enhance the daily lives of people living with visual impairment.
For visually impaired people, performing everyday tasks like grocery shopping can be a daunting experience. Recognizing and identifying objects is crucial for making informed decisions; this is where AiSee steps in. Developed progressively over five years, AiSee offers a novel solution to this issue by harnessing state-of-the-art AI technologies.
Lead researcher of Project AiSee, Associate Professor Suranga Nanayakkara from the Department of Information Systems and Analytics at NUS Computing, emphasized the importance of a user-centric approach in developing AiSee. Unlike traditional designs that augment a pair of glasses with a camera, AiSee instead uses a discreet bone conduction headphone, sidestepping the stigma some users associate with camera-equipped glasses.
AiSee’s operation is simple and intuitive. Users only need to hold an object and activate the in-built camera to capture an image. With the assistance of AI, AiSee identifies the object and provides additional information when queried by the user.
Three key components
AiSee comprises three fundamental components:
The eye: Vision engine software
AiSee incorporates a micro-camera that captures the user’s field of view. The accompanying software component, referred to as the ‘vision engine’, extracts features such as text, logos, and labels from the captured images for processing.
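The article does not describe the vision engine’s internals, so the following is a minimal, hypothetical sketch of the kind of feature extraction it performs. The open-source Tesseract OCR engine (via the pytesseract wrapper) stands in for AiSee’s proprietary software, and the filename is a placeholder.

```python
# Illustrative sketch only -- AiSee's actual vision engine is not public.
# Pillow and pytesseract (a wrapper around the Tesseract OCR engine) stand
# in for the feature-extraction step described in the article.
from PIL import Image
import pytesseract


def extract_text_features(image_path: str) -> list[str]:
    """Pull candidate text features (labels, product names) from a photo."""
    image = Image.open(image_path)
    raw_text = pytesseract.image_to_string(image)
    # Keep non-empty lines as candidate features for downstream processing.
    return [line.strip() for line in raw_text.splitlines() if line.strip()]


if __name__ == "__main__":
    # "captured_frame.jpg" is a placeholder for a frame from the micro-camera.
    for feature in extract_text_features("captured_frame.jpg"):
        print(feature)
```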
The brain: AI-powered image processing unit and interactive Q&A system
After the user photographs an object of interest, AiSee processes and analyzes the image with cloud-based AI algorithms to identify it. Users can then pose follow-up questions about the object, and an integrated language model powers these interactive question-and-answer exchanges.
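Neither the cloud services nor their APIs are named in the article, so this is purely a hypothetical sketch of the capture–identify–ask loop; the endpoint URLs, the `identify_object` and `answer_question` helpers, and the JSON fields are illustrative assumptions, not AiSee’s actual interface.

```python
# Hypothetical sketch of a capture -> identify -> Q&A loop.
# The endpoint URLs below are placeholders, not real services.
import requests

VISION_ENDPOINT = "https://api.example.com/identify"  # placeholder
QA_ENDPOINT = "https://api.example.com/answer"        # placeholder


def identify_object(image_bytes: bytes) -> str:
    """Send a captured photo to a cloud vision model; return a text label."""
    response = requests.post(VISION_ENDPOINT, files={"image": image_bytes})
    response.raise_for_status()
    return response.json()["label"]


def answer_question(label: str, question: str) -> str:
    """Ask a language model a follow-up question about the identified object."""
    response = requests.post(
        QA_ENDPOINT, json={"object": label, "question": question}
    )
    response.raise_for_status()
    return response.json()["answer"]


if __name__ == "__main__":
    with open("captured_frame.jpg", "rb") as f:
        label = identify_object(f.read())
    print("Identified:", label)
    print(answer_question(label, "What are the ingredients?"))
```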
The speaker: Bone conduction sound system
AiSee’s headphone employs bone conduction technology, allowing sound transmission through the skull bones. This ensures visually impaired individuals receive auditory information while remaining aware of external sounds, such as conversations or traffic noise. Environmental sounds are essential for decision-making, especially in safety-critical situations.
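The bone conduction hardware itself cannot be shown in code, but the final software step, rendering an answer as speech, can be sketched with a generic offline text-to-speech library; pyttsx3 here is a stand-in for whatever audio stack AiSee actually uses, and the spoken sentence is a made-up example.

```python
# Minimal sketch of the audio-delivery step. pyttsx3 is an offline
# text-to-speech library used here as a stand-in for AiSee's bone
# conduction output.
import pyttsx3


def speak(text: str) -> None:
    """Read the system's answer aloud to the user."""
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()


speak("This appears to be a can of chickpeas. Best before March 2026.")
```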
Unlike many wearable assistive devices that require pairing with a smartphone, AiSee is a self-contained system that operates without any additional device.
Empowering the visually impaired
AiSee’s potential impact on the visually impaired community is significant. Individuals with visual impairment in Singapore currently lack access to assistive AI technology of this caliber. AiSee could empower them to independently perform tasks that would otherwise require assistance. Ongoing efforts to make AiSee more affordable and accessible include ergonomic design enhancements and a faster processing unit.
Mark Myres, a visually impaired NUS student who tested AiSee, praised its inclusivity. He highlighted that AiSee strikes a balance that benefits both blind and partially sighted users, and that the device’s versatility opens up new possibilities for a wide range of people.
Associate Professor Nanayakkara’s team is currently collaborating with SG Enable in Singapore to conduct user testing with visually impaired individuals. The insights gained from this testing will help fine-tune AiSee’s features and performance.