Artificial Intelligence helps people with visual impairment

Artificial Intelligence (AI) developer Jagadish K. Mahendran and his team designed a backpack with voice-activated artificial intelligence technology that can help people living with visual impairments navigate and perceive the world around them, Intel said in a statement.

It added that the backpack helps detect traffic signs, suspended obstacles, crosswalks, moving objects and changing elevations, all while running on a low-power interactive device.

“Last year, when I met with a friend who lives with a visual impairment, I was struck by the irony that while I teach robots to see, there are many people who can’t see and need help. This motivated me to build the visual assistance system with OpenCV’s AI Kit with Depth (OAK-D), powered by Intel,” said Mahendran, of the University of Georgia’s Institute for Artificial Intelligence.

The World Health Organization estimates that globally, 285 million people live with visual impairment. At the moment, visual assistance systems for navigation are limited, ranging from voice-assisted smartphone applications based on the Global Positioning System (GPS) to camera-enabled smart cane solutions. These systems lack the depth perception needed to facilitate independent navigation, the company noted.

How it works

The system is housed inside a backpack containing a central computing unit, similar to a laptop computer. A camera is concealed in a vest, and a battery capable of providing approximately eight hours of use is hidden in a fanny pack. A Luxonis OAK-D spatial artificial intelligence camera can be attached to either the vest or the fanny pack and connected to the computing unit in the backpack. Three small holes in the vest serve as viewing windows for the OAK-D, which is placed inside the vest.

The OAK-D unit is an AI device running on the Intel Movidius VPU and the Intel Distribution of OpenVINO toolkit for on-chip edge AI inference. It can run advanced neural networks while providing accelerated computer vision capabilities and real-time depth mapping from its stereo pair, as well as color information from a single 4K camera, Intel added.
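To illustrate the depth-mapping idea, the sketch below shows the classic pinhole stereo relation a camera like the OAK-D relies on: the farther an object is, the smaller the pixel offset (disparity) between the left and right views. The focal length and baseline constants here are illustrative assumptions, not the OAK-D's actual calibration.

```python
# Hypothetical stereo-depth sketch; constants are assumed, not real OAK-D values.

FOCAL_LENGTH_PX = 880.0   # assumed focal length, in pixels
BASELINE_M = 0.075        # assumed distance between the two stereo lenses, in meters

def depth_from_disparity(disparity_px: float) -> float:
    """Pinhole stereo relation: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return FOCAL_LENGTH_PX * BASELINE_M / disparity_px

# A distant object shifts only a little between the two views...
print(round(depth_from_disparity(11.0), 2))  # 6.0 meters
# ...while a nearby object shifts a lot.
print(round(depth_from_disparity(66.0), 2))  # 1.0 meter
```

Running this per pixel over the whole image is what produces the real-time depth map the article describes; dedicated hardware like a vision processing unit makes that feasible at low power.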

A Bluetooth-enabled headset allows the user to interact with the system through voice queries and commands, and the system responds with verbal feedback. As the user moves around, the system audibly relays information about common obstacles, including signs, tree branches, and pedestrians. It also warns of nearby crosswalks, sidewalks, stairs, and driveways, Intel said.

Stay tuned! Visit our blog. New articles every week.
