Researchers develop Kinect-based tool to translate sign language into text

Researchers using Microsoft Kinect have developed a new computer system that translates gestures in sign language into text.

Published: 24 Jul 2013 | Last Updated: 24 Jul 2013

Researchers from the Chinese Academy of Sciences (CAS) and Microsoft Research Asia have developed a device that can translate sign language into text in real time. The device will help deaf and non-deaf people communicate, and could also create job opportunities for the disabled.

CAS has used the Microsoft Kinect device to translate typed text into sign language and sign language back into text, allowing a two-way conversation. The software will be free for individuals, and aims to overcome communication barriers between deaf and non-deaf persons by letting them chat freely.

Kinect, as you probably know, is a 3D motion sensing input device from Microsoft. The software tracks hand gestures and then uses a 3D motion-trajectory alignment to recognize the word being signed. The project was recently put up on display at the Faculty Summit 2013 in Washington.
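The article does not spell out the alignment algorithm the researchers use. A common technique for matching a gesture trajectory, which varies in speed from signer to signer, against stored word templates is dynamic time warping (DTW); the sketch below is an illustrative assumption, not the team's actual method. It assumes each gesture is an (N, 3) array of 3D hand positions captured over time:

```python
import numpy as np

def dtw_distance(traj_a, traj_b):
    """Dynamic time warping distance between two 3D trajectories,
    each an (N, 3) array of hand positions sampled over time."""
    n, m = len(traj_a), len(traj_b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two 3D points,
            # plus the cheapest way to have aligned the prefixes.
            d = np.linalg.norm(traj_a[i - 1] - traj_b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a point in A
                                 cost[i, j - 1],      # skip a point in B
                                 cost[i - 1, j - 1])  # match the points
    return cost[n, m]

def recognize_sign(observed, templates):
    """Return the template word whose trajectory best aligns
    with the observed gesture (templates: word -> trajectory)."""
    return min(templates, key=lambda word: dtw_distance(observed, templates[word]))
```

Because DTW stretches and compresses the time axis while comparing trajectories, a word signed slowly still aligns closely with a faster template of the same word, which is why this family of methods suits gesture recognition.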

The system works in two modes: Translation Mode, which translates physical hand or body movements into text or speech, and Communication Mode, which lets a person using American Sign Language (ASL) converse with someone communicating in typed English. A 3D avatar generates sign-language gestures from the typed words. The system is capable of translating full sentences, not just individual words.

According to the researchers, the system is still a work in progress. At present it can only translate ASL, but the team is confident it will soon support other sign languages as well.

Guobin Wu of Microsoft Research Asia stated in a blog post about the research, "We ultimately hope this work can provide a daily interaction tool to bridge the gap between the hearing and the deaf and hard of hearing in the near future."

Source: Mashable