Gesture-to-Speech Conversion Using Flex Sensors, MPU6050 and Python
Vaibhav Mehra1, Aakash Choudhury2, Rishu Ranjan Choubey3
1Vaibhav Mehra*, Electrical and Electronics Engineering, Vellore Institute of Technology, Vellore, India.
2Aakash Choudhury, Electrical and Electronics Engineering, Vellore Institute of Technology, Vellore, India.
3Rishu Ranjan Choubey, Electrical and Electronics Engineering, Vellore Institute of Technology, Vellore, India.
Manuscript received on July 11, 2019. | Revised Manuscript received on August 20, 2019. | Manuscript published on August 30, 2019. | PP: 4686-4690 | Volume-8 Issue-6, August 2019. | Retrieval Number: F9167088619/2019©BEIESP | DOI: 10.35940/ijeat.F9167.088619
© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Abstract: Communicating through hand gestures is one of the most common forms of non-verbal and visual communication adopted by the speech-impaired population around the world. The problem at present is that most people cannot comprehend hand gestures or convert them to spoken language quickly enough for a listener to follow. A large fraction of India’s population is speech impaired, and learning to communicate in sign language is not an easy task. This problem demands a better solution, one that can help the speech-impaired converse without difficulty and thereby reduce the communication gap they face. This paper proposes an idea that helps remove, or at least reduce, this gap between speech-impaired and unimpaired people. Research in this area mostly focuses on image-processing approaches; this paper instead adopts a cheaper and more user-friendly approach. The idea is to make a glove, worn by a speech-impaired person, that converts sign language into speech and text. Our prototype uses an Arduino Uno microcontroller interfaced with flex sensors and an accelerometer–gyroscope sensor to read hand gestures. Furthermore, we incorporate an algorithm for better interpretation of the sensor data, producing more accurate results. Thereafter, we use Python to interface the Arduino Uno with a microprocessor and finally convert the recognized gesture into speech. The prototype has been calibrated in accordance with American Sign Language (ASL).
Keywords: Arduino Uno, Flex sensors, MPU6050, Python, Text to speech.
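The pipeline described in the abstract — flex-sensor readings streamed from the Arduino Uno over serial, matched against calibrated ASL gesture patterns in Python, then spoken aloud — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the bend threshold, the gesture table entries, the serial port name, and the choice of pyserial and pyttsx3 as the serial and text-to-speech libraries are all assumptions.

```python
# Hypothetical PC-side sketch of the gesture-to-speech flow from the paper:
# Arduino streams five flex-sensor ADC values per line; Python classifies
# the bend pattern as an ASL gesture and speaks the matched label.
# All specific values below (threshold, patterns, port) are illustrative.

# Bend pattern per finger: 1 = bent past the threshold, 0 = straight.
# Tuple order assumed: (thumb, index, middle, ring, pinky).
GESTURES = {
    (0, 1, 1, 1, 1): "A",          # fist with thumb alongside
    (1, 0, 0, 0, 0): "B",          # fingers extended, thumb across palm
    (0, 0, 0, 0, 0): "open hand",
}

def classify(flex_readings, threshold=600):
    """Map five raw 10-bit ADC flex readings to a gesture label, or None."""
    pattern = tuple(1 if r > threshold else 0 for r in flex_readings)
    return GESTURES.get(pattern)

def speak_gestures(port="/dev/ttyACM0"):
    """Read comma-separated readings from the Arduino and speak each match."""
    import serial    # pyserial, assumed installed
    import pyttsx3   # offline text-to-speech engine, assumed installed
    engine = pyttsx3.init()
    with serial.Serial(port, 9600, timeout=1) as link:
        while True:
            line = link.readline().decode(errors="ignore").strip()
            if not line:
                continue
            readings = [int(x) for x in line.split(",")[:5]]
            label = classify(readings)
            if label:
                engine.say(label)
                engine.runAndWait()
```

In a real prototype the MPU6050 orientation data would be appended to the same serial line and folded into the classification key, so that gestures differing only in hand orientation can be distinguished.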