Virtual Sign Language Robot Arm


Supattra Plermkamon
Nattapong Saejew
Thawatchai Thanee

Abstract

Sign language is an important language used in communication by deaf and mute persons. Currently, few people know how to use sign language. However, sign languages have become an essential part of the technological revolution. This research presents a new scheme for hand posture selection and recognition based on an artificial neural network. Users can interact with the model through a keyboard and mouse in off-line operation, translating words into virtual sign language so that deaf and mute persons can communicate in a friendly manner.

The virtual sign language robot arm model consists of an animated half human body, two arms and two hands, each with five fingers. The virtual sign language robot arm was produced with the SolidWorks program, and SimMechanics was used to control the movement of the virtual hand gestures/postures. The V-Realm Builder program was used to improve and decorate the display part of this model so that it appears realistic.
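
The abstract names the modelling and display tools but not how they are connected in software. As a minimal sketch only, a VRML scene exported from V-Realm Builder could be driven from MATLAB with the Virtual Reality (Simulink 3D Animation) toolbox as shown below; the world file name robot_arm.wrl and the joint node name IndexFinger_Joint1 are hypothetical and not taken from the paper.

    % Minimal sketch (not the paper's implementation): animating one finger joint
    % of a VRML model exported from V-Realm Builder.
    w = vrworld('robot_arm.wrl');              % hypothetical world file name
    open(w);
    view(w);                                   % open the built-in VRML viewer
    finger = vrnode(w, 'IndexFinger_Joint1');  % hypothetical Transform node name
    finger.rotation = [0 0 1 pi/4];            % bend the joint 45 degrees about z
    vrdrawnow;                                 % push the change to the viewer
    pause(2);
    close(w);
    delete(w);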

The testing results can be divided into two cases of hand movement: single and double hand movement. The right hand and both hands were used to display sign language for single and double hand movement, respectively. For single hand movement, the model can learn, recognize and classify the alphabet, vowels and numbers of the Thai language using a two-layer feed-forward neural network trained with the LVQ method, and it animates the signs with a sound track according to the word the user requests. It can display 28 words of sign language very quickly and with 100% accuracy. For double hand movement, it can display the sign language of verbs, pronouns, days and months, and a tale, with 18 words, 15 words, 19 words and 1 story respectively, but it cannot learn, recognize and classify these words. The time taken for the tale is 180 seconds with 25 signs. The virtual sign language robot arm can accurately display sign language; however, for more general purposes, this model has to be developed further in the field of on-line communication with human beings.
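
The abstract does not detail the recognition step. As a minimal sketch only, a 28-word sign classifier could be built in MATLAB with an LVQ network (a competitive layer followed by a linear layer) from the Neural Network Toolbox, as shown below; the 12-element feature vectors, dummy data, epoch count and network size are assumptions, not values from the paper.

    % Minimal sketch (not the paper's implementation): classifying one of 28
    % single-hand sign words with an LVQ network.
    X = rand(12, 28);                 % dummy features, one column per sign word
    classIdx = 1:28;                  % one class per word (alphabet/vowel/number)
    T = full(ind2vec(classIdx));      % one-hot target vectors
    net = lvqnet(28);                 % LVQ network sized to the 28 classes
    net.trainParam.epochs = 200;
    net = train(net, X, T);
    Y = vec2ind(net(X));              % predicted index selects the sign to animate
    fprintf('Recognition accuracy: %.0f%%\n', 100 * mean(Y == classIdx));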

Article Details

How to Cite
Plermkamon, S., Saejew, N., & Thanee, T. (2013). Virtual Sign Language Robot Arm. Engineering and Applied Science Research, 32(5), 701–714. Retrieved from https://ph01.tci-thaijo.org/index.php/easr/article/view/6227
Section
ORIGINAL RESEARCH