Assistive technology recognizes objects in the wearer's immediate environment, as well as hand gestures, delivering audio information through bone conduction.
The Navatar app has already offered one way for people with sight problems to find their way around buildings using GPS-like technology. Now OrCam is a glasses-mounted device that recognizes objects in the wearer's immediate environment, as well as hand gestures, and delivers audio information through bone conduction.
The small camera and computer mount onto any pair of glasses and can detect objects in the user's field of view. Designed for those with partial vision, the device lets wearers interact by pointing to objects, text, places and even people they want to identify. The information is then delivered as audio via bone conduction, so interaction with the device remains private and discreet. It can be used to find out the name of a product in the supermarket, read text that is too small to see, determine whether the approaching bus is the right one, or check whether a friend has arrived at a meeting place. Users can also teach the device to recognize objects they use regularly. The video below shows the OrCam in action:
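To give a flavour of the kind of pipeline involved, here is a minimal sketch of camera-based object recognition with spoken output, using off-the-shelf libraries (OpenCV for capture, a pretrained torchvision classifier, and pyttsx3 for offline speech). This is purely illustrative and makes no claim about how the OrCam itself is implemented.

```python
# Illustrative sketch only: capture one frame, classify it, and speak the label.
# Assumes a default webcam and the torchvision/pyttsx3 libraries; this is not
# the OrCam's actual pipeline.
import cv2                                   # camera capture
import torch
from torchvision import models
from torchvision.models import ResNet18_Weights
import pyttsx3                               # offline text-to-speech

weights = ResNet18_Weights.DEFAULT
model = models.resnet18(weights=weights).eval()
preprocess = weights.transforms()            # resize/normalize as the model expects
labels = weights.meta["categories"]          # ImageNet class names

capture = cv2.VideoCapture(0)                # open the default camera
ok, frame = capture.read()
capture.release()

if ok:
    # OpenCV returns BGR pixels; convert to RGB and to a (C, H, W) tensor
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    tensor = preprocess(torch.from_numpy(rgb).permute(2, 0, 1))
    with torch.no_grad():
        scores = model(tensor.unsqueeze(0)).softmax(dim=1)
    best = scores.argmax().item()

    engine = pyttsx3.init()                  # speak the most likely label aloud
    engine.say(f"I see a {labels[best]}")
    engine.runAndWait()
```

A wearable device would of course run continuously on-device, recognize text and faces as well as objects, and route the audio through bone conduction rather than a speaker, but the recognize-then-speak loop above captures the basic idea.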
Although developed for the visually impaired, it’s possible that such a device could be useful for those with other conditions, such as dementia or dyslexia. Are there other ways computer vision can be used to help those with poor sight?
Image source: Pixabay