Automated system translates between sign language and English

A tech company is building the world’s first automated sign language translation solution to enable deaf and hearing people to communicate almost anywhere.

Sign language consists of far more than hand movements, and the difficulty in translating it with technology lies in capturing the full extent of that non-verbal communication. Hungarian company SignAll has taken up the challenge and is creating the world’s first fully automated sign language translation system. The company has already built a working prototype and is now looking for partners for its pilot program.

The current SignAll system uses three webcams and a depth sensor, all connected to an ordinary computer and screen. The system has been designed to be modular so that it can be upgraded as the technology improves over time. While signing clearly relies on hand movements, the language also incorporates spacing, rhythm, intonation and facial expressions, and it is those aspects that technology struggles to capture. The depth sensor is placed at chest height while the three cameras surround the signer, allowing continuous tracking of the full range of hand and arm movements.
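
SignAll has not published the internals of its capture pipeline, so the following Python sketch is only a rough illustration of how frames from three webcams and a chest-height depth sensor might be gathered into a single record per time step. The class name, camera indices and depth-sensor placeholder are assumptions made for the example, not SignAll’s actual code.

    # Hypothetical sketch (not SignAll's actual code) of a capture loop that
    # pulls frames from three webcams and a chest-height depth sensor,
    # the hardware setup described above.
    from dataclasses import dataclass
    import time

    import cv2
    import numpy as np


    @dataclass
    class CaptureFrame:
        timestamp: float
        rgb_views: list          # one image per webcam surrounding the signer
        depth_map: np.ndarray    # chest-height depth sensor output


    def open_cameras(indices=(0, 1, 2)):
        # Open the three webcams; the indices are illustrative.
        return [cv2.VideoCapture(i) for i in indices]


    def read_depth_sensor():
        # Placeholder: a real system would query the depth sensor's SDK here.
        return np.zeros((480, 640), dtype=np.float32)


    def capture_step(cameras):
        # Grab one frame from every camera and pair it with a depth map,
        # so hand and arm movement can be tracked continuously in 3D.
        views = []
        for cam in cameras:
            ok, frame = cam.read()
            views.append(frame if ok else None)
        return CaptureFrame(time.time(), views, read_depth_sensor())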

The computer connected to the system then runs a natural language processing module that translates the signer’s movements into English, rendered as fully formed, grammatically correct sentences for a hearing person to read. The company is currently looking for organizations that regularly serve deaf and hard-of-hearing customers to test the system in real-time, real-life situations. SignAll is particularly interested in hearing from businesses and organizations such as schools, banks, restaurants, other retail spaces, local governments and transportation centers.
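
As a rough, hypothetical illustration of why such a module is needed, the toy Python example below maps a recognized sign sequence (written as glosses) to an English sentence. The gloss vocabulary and reordering rule are invented for the example and do not reflect SignAll’s actual approach.

    # Illustrative only: a toy version of the kind of restructuring a natural
    # language processing module must perform, turning a recognized sign
    # sequence (written as glosses) into a grammatical English sentence.
    # The gloss vocabulary and word-order rule here are hypothetical.

    GLOSS_TO_ENGLISH = {
        "IX-1": "I",
        "GO": "went",
        "STORE": "to the store",
        "YESTERDAY": "yesterday",
    }

    # Signed languages often front time markers and topics, so the module
    # cannot simply substitute words; it must also reorder them into the
    # subject-verb-object pattern English expects.
    ENGLISH_ORDER = ["IX-1", "GO", "STORE", "YESTERDAY"]


    def glosses_to_sentence(glosses):
        ordered = [g for g in ENGLISH_ORDER if g in glosses]
        words = [GLOSS_TO_ENGLISH[g] for g in ordered]
        sentence = " ".join(words)
        return sentence[0].upper() + sentence[1:] + "."


    print(glosses_to_sentence(["YESTERDAY", "STORE", "IX-1", "GO"]))
    # prints: I went to the store yesterday.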

Many smart cities and businesses have already recognized the need to improve accessibility and inclusivity. A bank that offers webcam advice sessions recently added a sign language option to its mortgage information meetings. And a university team has designed an app that uses the university’s information system and voice navigation to create personalized routes and alerts for blind students. How could existing technologies be used in new ways to help communities that are currently underserved?