A sign language interpreter using live video feed from the camera.
Sign Language Gesture Recognition From Video Sequences Using RNN And CNN
Simple sign language alphabet recognizer using Python, OpenCV, and TensorFlow to train an Inception model (CNN classifier).
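As a rough illustration of this kind of pipeline (not this repo's actual code), the sketch below grabs a webcam frame with OpenCV and classifies it with a pretrained Keras CNN; the model file, input size, and A–Z label set are assumptions.

```python
# A rough sketch: classify one webcam frame with a pretrained Keras CNN.
# Model file name, input size, and label set are assumed, not from the repo.
import cv2
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("alphabet_cnn.h5")      # hypothetical model file
labels = [chr(c) for c in range(ord("A"), ord("Z") + 1)]   # assumed A-Z classes

cap = cv2.VideoCapture(0)
ret, frame = cap.read()
if ret:
    roi = cv2.resize(frame, (224, 224))                    # assumed input size
    x = np.expand_dims(roi.astype("float32") / 255.0, axis=0)
    probs = model.predict(x)[0]
    print("Predicted letter:", labels[int(np.argmax(probs))])
cap.release()
```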
This repo contains the official code of our work SAM-SLR, which won the CVPR 2021 Challenge on Large Scale Signer Independent Isolated Sign Language Recognition.
Isolated & continuous sign language recognition using CNN+LSTM / 3D CNN / GCN / encoder-decoder models
Indian Sign Language recognition using OpenCV
Real-time recognition of German Sign Language (DGS) with MediaPipe
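A minimal sketch of the MediaPipe Hands step such a project typically relies on: extract 21 hand landmarks per frame, which a downstream gesture classifier (not shown) would map to DGS signs. The image path is a placeholder.

```python
# A minimal sketch of landmark extraction with MediaPipe Hands; the gesture
# classifier that would run on top of these features is not shown.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
with mp_hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    image = cv2.imread("sign.jpg")                          # placeholder image path
    if image is not None:
        results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            # 21 (x, y, z) coordinates, normalised to the image size
            features = [(lm.x, lm.y, lm.z) for lm in hand.landmark]
            print(f"Extracted {len(features)} landmarks")
```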
Simple Sign Language Detector
A computer-vision-based gesture detection system that automatically detects the number of fingers shown as a hand gesture and lets you control simple button-pressing games with your hand gestures.
Android application which uses feature extraction algorithms and machine learning (SVM) to recognise and translate static sign language gestures.
A simple sign language detection web app built using Next.js and TensorFlow.js. Winner of the 2020 Congressional App Challenge. Developed by Mahesh Natamai and Arjun Vikram.
Sign Language Alphabet Detection and Recognition using YOLOv8
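For context, running a YOLOv8 detector with the Ultralytics API looks roughly like the sketch below; the weights file and image path are hypothetical, not taken from this repo.

```python
# A hedged sketch of single-image detection with the Ultralytics YOLOv8 API;
# "asl_alphabet.pt" and "hand.jpg" are hypothetical file names.
from ultralytics import YOLO

model = YOLO("asl_alphabet.pt")                # hypothetical trained weights
results = model("hand.jpg")                    # placeholder image path
for box in results[0].boxes:
    cls_id = int(box.cls[0])
    print("Detected letter:", model.names[cls_id], "confidence:", float(box.conf[0]))
```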
Sign Language Translator enables hearing-impaired users to communicate efficiently in sign language; the application translates the gestures into text/speech. The user trains the model by recording their own sign language gestures. Internally it uses MobileNet and a KNN classifier to classify the gestures.
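The repo itself is a TensorFlow.js web app, but the MobileNet-features + KNN idea it describes can be sketched as a Python analogue with Keras and scikit-learn; the training arrays and gesture labels below are placeholders.

```python
# Illustrative Python analogue of the MobileNet + KNN approach: MobileNet as a
# fixed feature extractor plus a scikit-learn KNN classifier. Data is a stand-in.
import numpy as np
import tensorflow as tf
from sklearn.neighbors import KNeighborsClassifier

base = tf.keras.applications.MobileNet(weights="imagenet", include_top=False, pooling="avg")

def embed(images):
    """Map a batch of 224x224 RGB images to MobileNet feature vectors."""
    x = tf.keras.applications.mobilenet.preprocess_input(images.astype("float32"))
    return base.predict(x)

train_images = np.random.rand(20, 224, 224, 3) * 255         # stand-in for recorded frames
train_labels = ["hello"] * 10 + ["thanks"] * 10               # hypothetical gesture labels

knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(embed(train_images), train_labels)

query = np.random.rand(1, 224, 224, 3) * 255                  # stand-in for a live frame
print("Predicted gesture:", knn.predict(embed(query))[0])
```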
🙌 A collection of awesome Sign Language projects and resources 🤟
A Sign Language Learning Platform where people who know sign language can come and practice it, and people who don't can learn it.
SAM-SLR-v2 is an improved version of SAM-SLR for sign language recognition.
Real-time Sign Language Gesture Recognition Using 1DCNN + Transformers on MediaPipe landmarks
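A minimal Keras sketch of a 1D convolution followed by a self-attention (Transformer-style) block over landmark sequences; the sequence length, feature size (21 landmarks × 3 coordinates), and class count are assumptions, not the repo's actual architecture.

```python
# A minimal Keras sketch: 1D-CNN plus a self-attention block over landmark
# sequences. Shapes and class count are assumed, not taken from the repo.
import tensorflow as tf
from tensorflow.keras import layers

seq_len, n_features, n_classes = 30, 63, 10                  # assumed shapes

inputs = layers.Input(shape=(seq_len, n_features))
x = layers.Conv1D(64, kernel_size=3, padding="same", activation="relu")(inputs)
attn = layers.MultiHeadAttention(num_heads=4, key_dim=16)(x, x)   # self-attention
x = layers.LayerNormalization()(x + attn)
x = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(n_classes, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```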
Helps deaf and mute people communicate with hearing people using hand-gesture-to-speech conversion. This code uses depth maps from the Kinect camera and techniques like convex hull + contour mapping to recognise 5 hand signs.
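A minimal OpenCV sketch of the convex hull + contour idea, applied to a binary hand mask (a thresholded depth map in the original project); counting convexity defects roughly approximates the number of extended fingers. The mask path and the defect-depth threshold are assumptions.

```python
# A minimal OpenCV sketch of convex hull + contour analysis on a binary hand
# mask. The mask path and the defect-depth threshold are assumptions.
import cv2

mask = cv2.imread("hand_mask.png", cv2.IMREAD_GRAYSCALE)     # placeholder binary mask
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    hand = max(contours, key=cv2.contourArea)                # largest blob = the hand
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    fingers = 0
    if defects is not None:
        for i in range(defects.shape[0]):
            start, end, farthest, depth = defects[i, 0]
            if depth > 10000:                                # assumed depth threshold
                fingers += 1
    print("Estimated extended fingers:", fingers + 1 if fingers else 0)
```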
Easy_sign is an open-source Russian sign language recognition project that uses a small CPU model for predictions and is designed for easy deployment via Streamlit.