Grocery shopping can be challenging for people who are blind or visually impaired. Researchers from the School of Computing at the National University of Singapore (NUS Computing) have unveiled AiSee, an affordable wearable assistive device that uses artificial intelligence (AI) to help people with visual impairments “see” objects in their environment.
People with visual impairments face everyday challenges, particularly in identifying objects, a task essential to both simple and complex decision-making. Although advances in AI have dramatically improved visual recognition, deploying these cutting-edge technologies in the real world remains difficult and error-prone.
First developed in 2018 and refined over five years, AiSee aims to overcome these limitations by harnessing state-of-the-art AI technologies.
With AiSee, our aim is to empower users with more natural interaction. By following a human-centered design process, we found reasons to question the typical approach of using glasses augmented with a camera. People with visual impairment may be reluctant to wear glasses to avoid stigmatization. Therefore, we are proposing an alternative hardware that incorporates a discreet bone conduction headphone.
Suranga Nanayakkara, Project Lead Researcher and Associate Professor, Department of Information Systems and Analytics, National University of Singapore
To take a picture of an object, the user simply holds it and activates the built-in camera. AiSee then identifies the object with the help of AI and answers follow-up queries with additional information.
How does AiSee work?
AiSee comprises three key components:
- The eye: Vision engine computer software
A tiny camera built into AiSee records the user’s field of vision. This is known as the “vision engine computer,” or the software part of AiSee. From the captured image, the software can extract features for processing, including text, logos, and labels.
- The brain: AI-powered image processing unit and interactive Q&A system
AiSee uses advanced cloud-based artificial intelligence algorithms to process and analyze the captured image and identify the object of interest. The user can then pose a variety of questions to learn more about the object.
AiSee uses cutting-edge speech-to-text and text-to-speech processing to understand user queries and describe identified objects. Powered by a large language model, AiSee excels at interactive question-and-answer exchanges, quickly interpreting user queries and providing relevant, accurate responses.
Unlike most wearable assistive devices, which must be paired with a smartphone, AiSee operates as a stand-alone system that requires no additional device.
- The speaker: Bone conduction sound system
The AiSee headphone uses bone conduction, which transmits sound through the bones of the skull. This allows users to receive auditory information from the device while still hearing external sounds such as traffic noise and conversations. This is especially important for people who are blind or visually impaired, because ambient sound carries information critical to decision-making, particularly where safety is concerned.
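Taken together, the three components form a capture → recognize → answer → speak pipeline. The Python sketch below illustrates one plausible shape of that loop; every function name here (`capture_image`, `recognize_object`, `answer_query`, `speak`) is a hypothetical placeholder, not AiSee's actual API, and the vision engine and language model are stubbed with canned responses for illustration.

```python
# Hypothetical sketch of a capture -> recognize -> Q&A -> speak loop,
# modelled loosely on the three components described above. All names
# are illustrative; the real vision engine and language model are
# cloud services, stubbed out here with fixed example data.

def capture_image() -> bytes:
    """Stub for the built-in camera: returns raw image bytes."""
    return b"<jpeg bytes of the grasped object>"

def recognize_object(image: bytes) -> dict:
    """Stub for the cloud vision engine: extracts labels, text, and logos."""
    return {"label": "instant noodles", "text": ["chicken flavour"], "logo": "BrandX"}

def answer_query(obj: dict, question: str) -> str:
    """Stub for the LLM-backed Q&A step: answers questions about the object."""
    if "flavour" in question.lower():
        return f"It is {obj['text'][0]}."
    return f"This is {obj['label']} by {obj['logo']}."

def speak(text: str) -> str:
    """Stub for text-to-speech output over the bone-conduction headphone."""
    return f"[audio] {text}"

def aisee_loop(question: str) -> str:
    """One interaction: photograph the held object, identify it, answer, speak."""
    image = capture_image()
    obj = recognize_object(image)
    answer = answer_query(obj, question)
    return speak(answer)
```

In a real system, `recognize_object` and `answer_query` would be network calls to cloud vision and language-model services, and `speak` would drive the bone-conduction transducer; the stand-alone design described above implies the device itself handles these calls without a paired smartphone.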
At present, visually impaired people in Singapore do not have access to assistive AI technology of this level of sophistication. Therefore, we believe that AiSee has the potential to empower visually impaired people to independently accomplish tasks that currently require assistance. Our next step is to make AiSee affordable and accessible to the masses. To achieve this, we are making further enhancements, including a more ergonomic design and a faster processing unit.
Suranga Nanayakkara, Project Lead Researcher and Associate Professor, Department of Information Systems and Analytics, National University of Singapore
“A lot of time, assistive devices seem very targeted at totally blind people or visually impaired people. I think AiSee is a good balance. Both visually impaired and blind people could get a lot of benefits from this,” noted NUS student Mark Myres, who helped to test AiSee as a visually impaired user.
User Testing and Further Enhancements
Assoc Prof Nanayakkara and his team are in talks with SG Enable in Singapore to conduct user testing with visually impaired people. The findings will help refine and optimize AiSee's functionality. In addition, B.P. De Silva Holdings Pte Ltd has donated S$150,000 to support the project.
B.P. De Silva Holdings' decision to support AiSee's development stems from its commitment to corporate social responsibility and a broader goal of promoting accessibility and inclusivity. Its philanthropy reflects a belief in the transformative potential of technology to address societal challenges and build a more just and inclusive world.
Innovative solutions enabled by assistive technologies can change the lives of persons with disabilities, whether in supporting them to live more independently or lowering barriers to employment. As the focal agency and sector enabler for disability and inclusion in Singapore, SG Enable is happy to work with partners like NUS and B.P. De Silva Holdings Pte Ltd to leverage technology to empower persons with disabilities.
Ms Ku Geok Boon, Chief Executive Officer, SG Enable
Beyond this project, SG Enable hopes to work with NUS to explore how artificial intelligence, human-computer interaction (HCI), and assistive technology can expand the technological options available to people with disabilities.
AiSee: AI-powered ‘eye’ for visually impaired people to ‘see’ objects
Video Credit: National University of Singapore