SignAI: Real-Time Sign Language Recognition

September–December 2020

SignAI is an interactive ASL learning application that uses a Leap Motion device to track hand gestures and a K-Nearest Neighbours (KNN) model to recognize ASL digits (0–9) in real time. The UI adapts tutorial difficulty and pace to user performance (an 85% accuracy threshold).

Requirements: A Leap Motion device is required to run the main application (PredictGestures.html) and most features of this project. The Leap Motion controller must be connected and set up before use.


Tech Stack

All dependencies are loaded from CDNs in the HTML files; there is no build step or server.

| Role | Technology |
| --- | --- |
| Hand tracking | Leap Motion JavaScript SDK v0.6.3 |
| Graphics / canvas | p5.js v1.1.9 |
| Numeric arrays | numjs |
| Classifier | ml5.js KNNClassifier v0.4.3 |

Project Structure

Main application

| File | Purpose |
| --- | --- |
| PredictGestures.html | Main ASL learning app: sign-in, real-time gesture recognition with a KNN trained on the included training data, hand-position feedback (arrows), adaptive tutorial (scaffolding), and quiz. |
| PredictGestures.js | Leap frame handling, 120-D feature extraction per frame, KNN training from train*.js data, X/Y/Z centering, state logic (no hand / uncentered / centered), tutorial and quiz flow. |
| prepareToDraw.js | p5.js setup: canvas, loading ASL digit images, direction arrows, and the “place hand” image (from external URLs). |

Pages that use Leap Motion

| File | Purpose |
| --- | --- |
| Record.html | Records hand data from the Leap: 100 frames in an array of shape [5, 4, 6, numSamples] (5 fingers × 4 bones × 6 coordinates). Logs the data to the browser console for copying into training files. |
| Record.js | Leap loop, InteractionBox normalization, bone drawing, frame logging. |
| Del01.html | Demo: a circle drawn at the index fingertip (Leap + p5). |
| Del102.html | Demo: the hand skeleton drawn as lines (Leap + p5). |
| leapDrawCircle.js | Index-finger circle drawing for Del01. |
| leapDrawLines.js | Hand-bone line drawing for Del102. |
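Record.js maps raw Leap coordinates into the [0, 1] range using the controller's InteractionBox. The following is only an illustrative re-implementation of that mapping, not the SDK call Record.js actually uses:

```javascript
// Illustrative InteractionBox-style normalization: map a raw millimetre
// coordinate into [0, 1] relative to the box, clamping values outside it.
// (Record.js relies on the SDK's own normalization; this just shows the idea.)
function normalizeCoord(value, boxCenter, boxSize) {
  const n = (value - boxCenter) / boxSize + 0.5;
  return Math.min(1, Math.max(0, n));
}

// A fingertip 60 mm above the centre of a 120 mm-tall box maps to 1.0.
const ny = normalizeCoord(260, 200, 120); // → 1
```

Normalizing this way makes the recorded samples independent of where the hand sat inside the Leap's tracking volume.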

Pages that do not use Leap Motion

| File | Purpose |
| --- | --- |
| Predict.html | KNN demo on the Iris dataset (train/test split, 2D circles). Uses ml5 + numjs + p5. |
| Predict.js | Iris KNN training and testing. |
| Playback.html | Renders two pre-recorded hand poses from static arrays in Playback.js. |
| Playback.js | Defines oneFrameOfData and anotherFrameOfData and draws them. |

Training data

  • train*.js — Each file defines one global array, shape [5, 4, 6, numSamples]: normalized (x, y, z) for base and tip of each bone per finger per frame. The digit (0–9) is indicated by the number in the filename (e.g. train0He.js → 0, train7Manian.js → 7).
  • PredictGestures.html includes these scripts: train0He, train0ReckordGroten, train1, train2Bongard, train2Sheboy, train2Liu, train3Bongard, train4Bongard, train4Beattie, train4Socia, train4OBrien, train4Makovsky, train5Bongard, train6Bongard, train7Bongard, train7Vega, train7Manian, train8Bongard, train9Bongard. The KNN is trained on all of them when the page loads.
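Since the label is just the digit embedded in the filename, a hypothetical helper (not part of the repo) applying that convention might look like:

```javascript
// Hypothetical helper (illustrative, not in the repo): recover the digit
// label that the train*.js naming convention encodes.
function labelFromFilename(name) {
  const match = name.match(/^train(\d)/);
  return match ? Number(match[1]) : null;
}

labelFromFilename("train0He.js");     // → 0
labelFromFilename("train7Manian.js"); // → 7
```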

Leap Motion Setup

Before running PredictGestures.html, Record.html, Del01.html, or Del102.html, you need to install and configure the Leap Motion device:

  1. Download and install the Leap Motion SDK V2 from the Leap Motion developer site (under the "V2" section).
    • Windows 10 users: A manual fix may be required as described on the download page. If you encounter issues, you can try skipping the manual fix; the app may still work.
  2. Plug in the controller and run the Leap Motion Visualizer (as shown in the SDK documentation).
  3. Wait for recognition — It may take up to a minute for your computer to recognize the device after it is plugged in.
  4. Verify it works — When you see your hand displayed on the Visualizer screen (either visualization style is fine), the device is ready. The drawn output should correspond to your hand motion.
Note: Leap Motion tracking is not perfect; some hand movements may be captured incorrectly, which can affect gesture recognition in the app.

How to Run

To run the main application, connect a Leap Motion device to your computer, open PredictGestures.html in a browser, sign in, center your hand over the Leap, and follow the on-screen digit prompts and quiz.

Other pages:

  • Record.html — Connect the Leap to record hand data; copy the logged array from the browser console into a new train*.js file if you add training data.
  • Del01.html, Del102.html — Connect the Leap to run the finger/skeleton demos.
  • Predict.html, Playback.html — No Leap required; they run with only the browser.

Data and Model

  • Features: Each frame is normalized with the Leap InteractionBox and flattened to 120 values (5 fingers × 4 bones × 6 coordinates).
  • Preprocessing: Before classification, the hand is centered in X, Y, and Z (mean shifted to 0.5).
  • Training: On load, PredictGestures.js iterates over every included train*.js array, reshapes each frame to (1, 120), and adds it to the ml5 KNNClassifier with the label from the filename (0–9).
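The flattening and centering steps above can be sketched in plain JavaScript. Function names here are illustrative (PredictGestures.js may organize this differently), and the layout assumption is that values are interleaved as repeating (x, y, z) triples:

```javascript
// Sketch of the preprocessing described above (illustrative names;
// assumes coordinates are interleaved as repeating (x, y, z) triples).

// Flatten one frame of shape [5 fingers][4 bones][6 coords] to 120 values.
function flattenFrame(frame) {
  return frame.flat(2);
}

// Shift the per-axis mean to 0.5 so the hand is centred in X, Y, and Z.
function centerFeatures(features) {
  const sums = [0, 0, 0];
  features.forEach((v, i) => { sums[i % 3] += v; });
  const means = sums.map(s => s / (features.length / 3));
  return features.map((v, i) => v - means[i % 3] + 0.5);
}
```

With a full frame this yields a 120-value vector, which is then reshaped to (1, 120) and added to the KNNClassifier with its digit label.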

Presentation

Video presentation for this project:

Watch on YouTube

