
App is a high-tech seeing eye dog

The Seeing AI app uses object recognition and natural language processing to describe the surroundings of blind and visually impaired users.

One of the most exciting applications of computer vision and image recognition is helping blind and visually impaired people understand the world around them. We have already seen BlindTool use image recognition technology to identify 3D objects and describe them aloud for blind users, and now the Seeing AI app, created by developers at Microsoft, could take these abilities even further by providing a real-time auditory description of the user's surroundings.

https://www.youtube.com/watch?v=3WP7Id8SxYQ

The Seeing AI app can be used via a smartphone or Pivothead smart glasses. The user begins by holding up the phone's camera, or tapping the glasses, to prompt the device to 'look' at the environment. The app then provides an auditory description of what it sees, through either an earpiece or a small speaker. It uses natural language processing to describe the surroundings, covering everything from objects to text to the expressions on companions' faces.
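To make the "look, describe, speak" loop described above concrete, here is a minimal sketch of how such a pipeline could be wired together with off-the-shelf components. This is not Microsoft's Seeing AI implementation; the choice of a pretrained BLIP captioning model, the pyttsx3 speech engine, and the describe_and_speak helper are all illustrative assumptions.

```python
# Illustrative sketch only: capture a frame, generate a natural-language
# description, and speak it aloud. Not the actual Seeing AI code.

from transformers import pipeline  # pip install transformers pillow
import pyttsx3                     # pip install pyttsx3

# A pretrained image-captioning model stands in for the app's vision/NLP stack.
captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")

# A local text-to-speech engine stands in for the earpiece or small speaker.
speech = pyttsx3.init()

def describe_and_speak(image_path: str) -> str:
    """Generate and speak a description of a single captured frame."""
    result = captioner(image_path)            # e.g. [{"generated_text": "a man riding a skateboard"}]
    description = result[0]["generated_text"]
    speech.say(description)
    speech.runAndWait()
    return description

if __name__ == "__main__":
    # In an assistive app this frame would come from the phone camera or
    # smart glasses; here it is simply a file on disk.
    print(describe_and_speak("snapshot.jpg"))
```

In a real-time system this loop would run continuously on camera frames rather than on a single saved image, but the basic flow of recognition followed by spoken output is the same.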


The app was recently unveiled at the Microsoft Build conference but is still in development. Could these capabilities also be used in education, or for language translation while traveling?