A Hand Gesture Recognition System for Deaf-Mute Individuals

Rugia Said Kamaleldeen
Dr. Ebtihal H.G. Yousif

Sudan University of Science and Technology


Article Fingerprint

Research ID: HU7D1


Abstract

A deaf-mute individual typically uses gestures to convey ideas to others; however, most people find this gesture language hard to understand. The purpose of this project is to develop a computer-based system, implemented in MATLAB, that recognizes 26 gestures from American Sign Language (ASL), enabling deaf-mute individuals to communicate with others through their natural hand gestures. The proposed system is composed of five modules: preprocessing, hand segmentation, feature extraction, sign recognition, and sign-to-voice conversion. The ASL dataset was self-collected: 260 images in total of hand gestures from male and female volunteers of varying ages and skin colors, captured against different backgrounds and in different postures with an ordinary phone camera. Segmentation is performed by converting each image to the Hue-Saturation-Value (HSV) color space and applying the Color Thresholder app. Features are extracted with a Bag-of-Features (BOF) approach built on the Speeded-Up Robust Features (SURF) algorithm. The K-Nearest Neighbor (KNN) and Support Vector Machine (SVM) algorithms are then used for gesture recognition, and the recognized gesture is converted into voice.
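The HSV-based segmentation step described in the abstract can be sketched as follows. The paper itself uses MATLAB's Color Thresholder app; this Python/NumPy sketch only illustrates the underlying idea, and the threshold values (`h_max`, `s_min`, `v_min`) are illustrative assumptions, not the thresholds used in the paper.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorized RGB -> HSV for a float image in [0, 1]; H, S, V all in [0, 1]."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc = img.max(axis=-1)
    minc = img.min(axis=-1)
    v = maxc
    delta = maxc - minc
    # Saturation; guard against division by zero for black pixels.
    s = np.where(maxc > 0, delta / np.where(maxc > 0, maxc, 1), 0.0)
    # Hue; guard against division by zero for gray pixels (delta == 0).
    safe = np.where(delta > 0, delta, 1)
    h = np.zeros_like(maxc)
    h = np.where(maxc == r, ((g - b) / safe) % 6, h)
    h = np.where(maxc == g, (b - r) / safe + 2, h)
    h = np.where(maxc == b, (r - g) / safe + 4, h)
    h = np.where(delta == 0, 0.0, h) / 6.0
    return np.stack([h, s, v], axis=-1)

def skin_mask(img, h_max=0.14, s_min=0.15, v_min=0.35):
    """Binary mask of pixels whose HSV values fall inside an assumed skin range."""
    hsv = rgb_to_hsv(img)
    return (hsv[..., 0] <= h_max) & (hsv[..., 1] >= s_min) & (hsv[..., 2] >= v_min)
```

In practice a fixed hue/saturation/value box is sensitive to lighting and skin tone, which is presumably why the dataset deliberately varies both.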
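The KNN recognition step can likewise be sketched: after the BOF stage, each image becomes a fixed-length histogram of visual-word occurrences, and a query is labeled by a majority vote among its nearest training histograms. This is a minimal NumPy sketch under those assumptions (toy 2-bin histograms, Euclidean distance, `k=3`), not the paper's MATLAB implementation.

```python
import numpy as np

def knn_predict(train_hists, train_labels, query_hist, k=3):
    """Predict a sign label for one bag-of-features histogram by a
    majority vote among its k nearest training histograms."""
    dists = np.linalg.norm(train_hists - query_hist, axis=1)  # Euclidean distance
    nearest = np.argsort(dists)[:k]                           # indices of k closest
    labels, counts = np.unique(train_labels[nearest], return_counts=True)
    return labels[np.argmax(counts)]                          # most common label wins

# Toy training set: two gesture classes with distinct histogram shapes.
train = np.array([[0.9, 0.1], [0.8, 0.2], [0.1, 0.9], [0.2, 0.8]])
labels = np.array(["A", "B", "B", "B"])[[0, 0, 1, 2]]  # -> ["A", "A", "B", "B"]
```

A real BOF histogram would have one bin per visual word (often hundreds), but the voting logic is unchanged.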

References

17 references cited in this article:
  1. Shweta Shinde, Rajesh Autee, Vitthal Bhosale (2016). Real time two way communication approach for hearing impaired and dumb person based on image processing.
  2. J. Singha, K. Das (2013). Recognition of Indian sign language in live video.
  3. K. Gautam, A. Kaushi (2017). American sign language recognition system using image processing method.
  4. National Institute of Health (2014). "American Sign Language."
  5. S. O'Hara, B. Draper (2011). Introduction to the bag of features paradigm for image classification and retrieval.
  6. Bhaskar Anand, Prashant Shah (2016). Face Recognition using SURF Features and Classifier.
  7. M. Kakde, A. Rawate (2016). Hand gesture recognition system for deaf and dumb people using PCA.
  8. Shreyashi Sawant, M. Kumbhar (2014). Real time Sign Language Recognition using PCA.
  9. Jagdish L. Raheja, Sadab (2015). Android based portable hand sign recognition system.
  10. Mayuresh Keni, A. Marathe (2013). Sign language recognition system.
  11. Mahmoud Zaki, Alaa Abdo, Mahmoud Hamdy, E. Saad (2015). Arabic alphabet and numbers sign language recognition.
  12. S. Dogic, G. Karli (2014). Sign language recognition using neural networks.
  13. A. El-Alfi, R. Adly, H. Ibrahim (2014). Building Real-Time Translation of Arabic Sign Language System using Mechatronic Approach.
  14. Sawant Pramada, D. Saylee, N. Pranita, N. Samiksha, M. Vaidya (2013). Intelligent Sign Language Recognition Using Image Processing.
  15. Naoreen (2014). A systematic survey of skin detection algorithms, applications and issues.
  16. Jonathan Allen, M. Sharon Hunnicutt, Dennis Klatt (1987). From Text to Speech: The MITalk System.
  17. R. Ashi, A. Ameri. Introduction to Graphical User Interface (GUI) MATLAB.

Funding

No external funding was declared for this work.

Conflict of Interest

The authors declare no conflict of interest.

Ethical Approval

No ethics committee approval was required for this article type.

Data Availability

Not applicable for this article.

How to Cite This Article

Kamaleldeen, Rugia Said. 2021. "A Hand Gesture Recognition System for Deaf-Mute Individuals". Global Journal of Medical Research - K: Interdisciplinary, Volume 21, Issue K3.

Download Citation

Journal Specifications

Crossref Journal DOI 10.17406/gjmra

Print ISSN 0975-5888

e-ISSN 2249-4618

Keywords
Classification
GJMR-K Classification: NLMC Code: WV 280
Version of record

v1.2

Issue date

April 24, 2021

Language

English

Article Metrics

Total Views: 1971
Total Downloads: 951

