
✦ HandDraw.AI

A premium, gesture-powered HTML5 drawing canvas driven by MediaPipe AI.
Ditch the mouse. Drop the stylus. Draw in thin air with real-time hand tracking, right in your browser.


✨ Features

  • 🪄 Invisible Ink to Reality: Uses @mediapipe/tasks-vision to track 21 3D hand landmarks in real time with near-zero latency.
  • 🎨 Infinite Creativity: Choose from 12+ vibrant colors or pick your own custom gradient. Includes 3 dynamic tools (Pen, Soft Brush, Thick Marker).
  • 🤚 Intuitive Gestures:
    • ☝️ One Finger (or Pinch): Draw
    • ✌️ Two Fingers: Hover / Move Cursor
    • 🤟 Three Fingers: Erase Mode
    • 🖐️ Open Palm (Hold): Clear whole canvas
  • 💫 Premium Glassmorphic UI: Deep space themes, animated glowing cursors, smooth SVG-like bezier interpolation, and real-time FPS counters.
  • ↩️ Full Canvas Control: Complete Undo/Redo history management (up to 50 strokes of memory).
  • 💾 Export & Share: Instantly save your masterpieces directly as .png files to your computer.
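The gesture mapping above can be driven by a simple classifier over the 21 tracked landmarks. Here is a minimal sketch of that idea (the function names and the "extended finger" heuristic are illustrative, not the app's actual code):

```typescript
// Sketch: map the number of extended fingers to a drawing mode.
// Landmark indices follow MediaPipe's hand model (0 = wrist, 8 = index tip, ...).
type Point = { x: number; y: number };
type Mode = "draw" | "hover" | "erase" | "clear" | "idle";

// A finger counts as "extended" when its tip is farther from the wrist
// than its PIP joint — a rough heuristic; production code needs more care.
function isExtended(landmarks: Point[], tip: number, pip: number): boolean {
  const wrist = landmarks[0];
  const d = (p: Point) => Math.hypot(p.x - wrist.x, p.y - wrist.y);
  return d(landmarks[tip]) > d(landmarks[pip]);
}

function classifyGesture(landmarks: Point[]): Mode {
  // Tip/PIP index pairs for index, middle, ring, and pinky fingers.
  const fingers: [number, number][] = [[8, 6], [12, 10], [16, 14], [20, 18]];
  const count = fingers.filter(([t, p]) => isExtended(landmarks, t, p)).length;
  switch (count) {
    case 1: return "draw";   // ☝️ one finger
    case 2: return "hover";  // ✌️ two fingers
    case 3: return "erase";  // 🤟 three fingers
    case 4: return "clear";  // 🖐️ open palm (held)
    default: return "idle";
  }
}
```

In the real app this would run per video frame on the landmarks returned by the HandLandmarker, with extra logic for the pinch gesture and for holding the open palm before clearing.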

⚡ Quick Start (Local Development)

HandDraw.AI runs its computer vision entirely on the client (GPU-accelerated in the browser), so all you need locally is Node.js and a modern browser with a webcam.

  1. Clone the repo

    git clone https://github.com/PrakashWebDevX/Hand-Gesture-Drawing.git
    cd Hand-Gesture-Drawing/hand-gesture-app
  2. Install Dependencies

    npm install
  3. Start the Development Server

    npm run dev

    Your app will be live at http://localhost:3000. Grant the browser permission to use your webcam, and start drawing!


💻 Keyboard Shortcuts

Speed up your workflow using your other hand on the keyboard:

  • Ctrl + Z : Undo the last stroke
  • Ctrl + Y : Redo
  • Ctrl + S : Save canvas as PNG
  • Backspace / Delete : Quick clear
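The shortcut table above boils down to a small key-to-action mapper. A sketch of that logic, written as a pure function so it stays testable outside the browser (the names are illustrative, not the app's actual code):

```typescript
type CanvasAction = "undo" | "redo" | "save" | "clear" | null;

// Map the relevant fields of a keyboard event to a canvas action.
function shortcutAction(e: { key: string; ctrlKey: boolean }): CanvasAction {
  const k = e.key.toLowerCase();
  if (e.ctrlKey && k === "z") return "undo";
  if (e.ctrlKey && k === "y") return "redo";
  if (e.ctrlKey && k === "s") return "save"; // remember e.preventDefault()
  if (e.key === "Backspace" || e.key === "Delete") return "clear";
  return null;
}
```

In the app itself this would be wired up with `window.addEventListener("keydown", ...)`, calling `preventDefault()` on Ctrl+S so the browser's own save dialog doesn't open.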

🚀 One-Click Deployment to Vercel

This app was architected specifically to be deployed as a static/client-side App Router Next.js build on Vercel.

Why Vercel?

Traditional computer-vision apps rely on heavy Python Flask servers (cv2.VideoCapture), which fail in serverless cloud environments. HandDraw.AI avoids this by shipping the MediaPipe machine-learning payload entirely to the browser via WebAssembly (WASM).

How to deploy:

  1. Push this repository to your GitHub account.
  2. Go to Vercel.com and click Add New Project.
  3. Import Hand-Gesture-Drawing from GitHub.
  4. CRITICAL: Set the Root Directory to hand-gesture-app and keep the framework as Next.js.
  5. Click Deploy. Vercel will automatically configure the required COEP/COOP headers and publish your live app!
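If you deploy somewhere that does not set the COEP/COOP headers for you, a Next.js config along these lines would add them. This is a hypothetical sketch, not this repo's actual `next.config.js` — check the repository's own config first:

```javascript
// next.config.js — hypothetical sketch; the repo's real config may differ.
/** @type {import('next').NextConfig} */
const nextConfig = {
  async headers() {
    return [
      {
        source: "/(.*)", // apply to every route
        headers: [
          { key: "Cross-Origin-Embedder-Policy", value: "require-corp" },
          { key: "Cross-Origin-Opener-Policy", value: "same-origin" },
        ],
      },
    ];
  },
};

module.exports = nextConfig;
```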

🏗️ Architecture

  • Framework: Next.js 15+ (App Router)
  • Computer Vision: Google MediaPipe Tasks Vision (HandLandmarker)
  • Canvas Engine: Native HTML5 <canvas> with Custom Bézier Path Smoothing equations.
  • Styling: Custom CSS + Tailwind CSS utilities in a Dark Glassmorphism aesthetic.
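The Bézier path smoothing mentioned above is commonly implemented by drawing quadratic curves between successive midpoints of the raw cursor samples, using each raw sample as the control point so the curve stays continuous. A generic sketch of that technique (not necessarily the repo's exact equations):

```typescript
type Pt = { x: number; y: number };

const mid = (a: Pt, b: Pt): Pt => ({ x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 });

// Evaluate a quadratic Bézier at t with control point c:
// B(t) = (1-t)^2 * a + 2(1-t)t * c + t^2 * b
function quadBezier(a: Pt, c: Pt, b: Pt, t: number): Pt {
  const u = 1 - t;
  return {
    x: u * u * a.x + 2 * u * t * c.x + t * t * b.x,
    y: u * u * a.y + 2 * u * t * c.y + t * t * b.y,
  };
}

// Smooth a raw polyline: each segment runs midpoint-to-midpoint, with the
// raw sample in between acting as the control point.
function smoothStroke(raw: Pt[], steps = 8): Pt[] {
  if (raw.length < 3) return raw.slice();
  const out: Pt[] = [raw[0]];
  for (let i = 1; i < raw.length - 1; i++) {
    const a = mid(raw[i - 1], raw[i]);
    const b = mid(raw[i], raw[i + 1]);
    for (let s = 1; s <= steps; s++) out.push(quadBezier(a, raw[i], b, s / steps));
  }
  out.push(raw[raw.length - 1]);
  return out;
}
```

On a real `<canvas>`, the same idea is usually drawn directly with `ctx.quadraticCurveTo(raw[i].x, raw[i].y, b.x, b.y)` rather than sampling points explicitly.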

Built with ❤️ by PrakashWebDevX
