A premium, gesture-powered HTML5 drawing canvas powered by MediaPipe AI.
Ditch the mouse. Drop the stylus. Draw in thin air with real-time hand tracking right in your browser.
- 🪄 Invisible Ink to Reality: Uses `@mediapipe/tasks-vision` to track 21 3D hand landmarks in real time with near-zero latency.
- 🎨 Infinite Creativity: Choose from 12+ vibrant colors or pick your own custom gradient. Includes 3 dynamic tools (`Pen`, `Soft Brush`, `Thick Marker`).
- 🤚 Intuitive Gestures:
  - ☝️ One Finger (or Pinch): Draw
  - ✌️ Two Fingers: Hover / Move Cursor
  - 🤟 Three Fingers: Erase Mode
  - 🖐️ Open Palm (Hold): Clear the whole canvas
- 💫 Premium Glassmorphic UI: Deep space themes, animated glowing cursors, smooth SVG-like bezier interpolation, and real-time FPS counters.
- ↩️ Full Canvas Control: Complete undo/redo state management with a history of up to 50 strokes.
- 💾 Export & Share: Instantly save your masterpieces as `.png` files to your computer.
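The gesture table above can be sketched as a simple classifier from the number of raised fingers (plus a pinch flag) to a drawing mode. This is an illustrative sketch, not the app's actual API — `GestureMode` and `classifyGesture` are hypothetical names:

```typescript
// Map raised-finger count (from the 21 hand landmarks) to a drawing mode.
// Names here are illustrative, not the project's real identifiers.
type GestureMode = "draw" | "hover" | "erase" | "clear" | "idle";

function classifyGesture(raisedFingers: number, isPinching = false): GestureMode {
  if (isPinching) return "draw";   // pinch also draws
  switch (raisedFingers) {
    case 1: return "draw";         // ☝️ one finger
    case 2: return "hover";        // ✌️ two fingers: move cursor
    case 3: return "erase";        // 🤟 three fingers
    case 5: return "clear";        // 🖐️ open palm (held)
    default: return "idle";
  }
}
```

In practice the open-palm "clear" would additionally be gated on a hold timer so a brief palm flash doesn't wipe the canvas.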
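The 50-stroke undo/redo behavior described above can be modeled as two stacks with a capped history. A minimal sketch, assuming a simple `Stroke` shape (the real app's stroke data will differ):

```typescript
// Capped undo/redo history: oldest strokes fall off past the limit,
// and any new stroke invalidates the redo branch.
type Stroke = { color: string; points: [number, number][] };

class StrokeHistory {
  private undoStack: Stroke[] = [];
  private redoStack: Stroke[] = [];
  private readonly limit = 50; // matches the 50-stroke memory above

  push(stroke: Stroke): void {
    this.undoStack.push(stroke);
    if (this.undoStack.length > this.limit) this.undoStack.shift(); // drop oldest
    this.redoStack = [];
  }

  undo(): Stroke | undefined {
    const s = this.undoStack.pop();
    if (s) this.redoStack.push(s);
    return s;
  }

  redo(): Stroke | undefined {
    const s = this.redoStack.pop();
    if (s) this.undoStack.push(s);
    return s;
  }

  get size(): number {
    return this.undoStack.length;
  }
}
```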
Because HandDraw.AI processes advanced computer vision entirely on the client-side GPU, it requires a modern browser and up-to-date tooling.
- Clone the repo

  ```bash
  git clone https://github.com/PrakashWebDevX/Hand-Gesture-Drawing.git
  cd Hand-Gesture-Drawing/hand-gesture-app
  ```

- Install dependencies

  ```bash
  npm install
  ```

- Start the development server

  ```bash
  npm run dev
  ```
Your app will be live at `http://localhost:3000`. Grant the browser permission to use your webcam, and start drawing!
Speed up your workflow using your other hand on the keyboard:
- `Ctrl + Z`: Undo the last stroke
- `Ctrl + Y`: Redo
- `Ctrl + S`: Save canvas as PNG
- `Backspace` / `Delete`: Quick clear
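The shortcut table above boils down to a small dispatcher from keyboard events to actions. A sketch with hypothetical names (`ShortcutAction`, `resolveShortcut` are not the app's real identifiers), written as a pure function so it is framework-agnostic:

```typescript
// Resolve a keydown event to one of the shortcut actions listed above.
type ShortcutAction = "undo" | "redo" | "save" | "clear" | null;

function resolveShortcut(e: { key: string; ctrlKey: boolean }): ShortcutAction {
  const key = e.key.toLowerCase();
  if (e.ctrlKey && key === "z") return "undo";
  if (e.ctrlKey && key === "y") return "redo";
  if (e.ctrlKey && key === "s") return "save"; // real handler must call e.preventDefault()
  if (e.key === "Backspace" || e.key === "Delete") return "clear";
  return null;
}
```

In a component this would be wired up via `window.addEventListener("keydown", ...)`, calling `preventDefault()` for `Ctrl + S` so the browser's own save dialog doesn't open.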
This app was architected specifically to deploy as a static, client-side Next.js (App Router) build on Vercel.
Older computer-vision apps rely on heavy Python Flask servers (`cv2.VideoCapture`), which fail in serverless cloud environments. HandDraw.AI avoids this by shipping the MediaPipe machine-learning payload entirely to the browser via WebAssembly (WASM).
- Push this repository to your GitHub account.
- Go to Vercel.com and click Add New Project.
- Import `Hand-Gesture-Drawing` from GitHub.
- CRITICAL: Set the Root Directory to `hand-gesture-app` and keep the framework as Next.js.
- Click Deploy. Vercel will automatically configure the required COEP/COOP headers and present your live app to the web!
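If you host somewhere that does not set the COEP/COOP headers for you, they can be added in the Next.js config. A hedged sketch (assumed to live at `hand-gesture-app/next.config.ts`; adjust to your setup) — cross-origin isolation is what enables `SharedArrayBuffer`, which some WASM builds of MediaPipe use:

```typescript
// next.config.ts — only needed if your host does NOT inject these headers.
import type { NextConfig } from "next";

const nextConfig: NextConfig = {
  async headers() {
    return [
      {
        source: "/(.*)", // apply to every route
        headers: [
          { key: "Cross-Origin-Opener-Policy", value: "same-origin" },
          { key: "Cross-Origin-Embedder-Policy", value: "require-corp" },
        ],
      },
    ];
  },
};

export default nextConfig;
```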
- Framework: Next.js 15+ (App Router)
- Computer Vision: Google MediaPipe Tasks Vision (`HandLandmarker`)
- Canvas Engine: Native HTML5 `<canvas>` with custom Bézier path-smoothing equations.
- Styling: Custom CSS + Tailwind CSS utilities in a dark glassmorphism aesthetic.
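The Bézier smoothing mentioned above is commonly done with the midpoint trick: draw quadratic segments whose endpoints are the midpoints between successive raw points, using each raw point as the control point. A sketch under that assumption (`Point`, `toQuadraticSegments` are illustrative names, not the project's actual code):

```typescript
// Turn a jittery list of tracked points into quadraticCurveTo arguments.
type Point = { x: number; y: number };
type Segment = { control: Point; end: Point }; // args for ctx.quadraticCurveTo

function midpoint(a: Point, b: Point): Point {
  return { x: (a.x + b.x) / 2, y: (a.y + b.y) / 2 };
}

function toQuadraticSegments(points: Point[]): Segment[] {
  const segments: Segment[] = [];
  for (let i = 1; i < points.length - 1; i++) {
    // Raw point i is the control point; the curve ends at the midpoint
    // of points i and i+1, so consecutive segments join smoothly.
    segments.push({ control: points[i], end: midpoint(points[i], points[i + 1]) });
  }
  return segments;
}
```

A render loop would `ctx.moveTo` the first point, then call `ctx.quadraticCurveTo(s.control.x, s.control.y, s.end.x, s.end.y)` for each segment.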
Built with ❤️ by PrakashWebDevX