
Gesture Controlled Mouse

 

Mouse Controller using Hand Gesture


About   |   How To Use   |   Features   |   Technologies   |   Requirements   |   Starting   |   Made By   |   Author


🎯 About

This is a hand-gesture-based cursor controller: the cursor moves as you move your hand in front of the primary camera. It uses MediaPipe's state-of-the-art machine-learning pipeline to infer 21 three-dimensional hand landmarks per frame.
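As a rough sketch of how those 21 landmarks can drive the click gesture described below: the indices 4 (thumb tip) and 20 (little-finger tip) follow MediaPipe's published hand-landmark numbering, while the `0.07` distance threshold in normalized image coordinates is an assumed value, not taken from this project's notebook.

```python
import math

# MediaPipe hand-landmark indices (from the official hand-landmark numbering)
THUMB_TIP, INDEX_TIP, PINKY_TIP = 4, 8, 20

def distance_3d(a, b):
    """Euclidean distance between two (x, y, z) landmark tuples."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def is_click(landmarks, threshold=0.07):
    """Click gesture: thumb tip touching the little-finger tip.
    `threshold` is an assumed value in normalized image coordinates."""
    return distance_3d(landmarks[THUMB_TIP], landmarks[PINKY_TIP]) < threshold

# Toy frame: 21 landmarks, with thumb and pinky tips nearly touching.
frame = [(0.0, 0.0, 0.0)] * 21
frame[THUMB_TIP] = (0.50, 0.50, 0.0)
frame[PINKY_TIP] = (0.52, 0.51, 0.0)
print(is_click(frame))  # True: the tips are only ~0.022 apart
```

In the real pipeline, `landmarks` would come from MediaPipe's hand-tracking output (each landmark exposes normalized `x`, `y`, `z` fields) rather than a hand-built list.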

🎯 How To Use

Basic interface:

(demo: basic interface)

Moving the cursor with index-finger movement:

(demo: cursor following the index finger)

Clicking on the screen by touching the thumb & little finger:

(demo: click gesture)

✨ Features

✔️ Uses Open Computer Vision (OpenCV)
✔️ Tracks the hand and fingertips efficiently
✔️ Moves the cursor according to the hand gesture and the position of the index finger in the camera frame
✔️ Clicks when the thumb and little finger touch each other
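The cursor-movement feature above amounts to mapping the index fingertip's normalized camera coordinate to a screen pixel. A minimal sketch, assuming a 1920x1080 screen and exponential smoothing to damp camera jitter (the `alpha=0.3` smoothing factor is an assumed value, not taken from the notebook):

```python
def to_screen(nx, ny, screen_w=1920, screen_h=1080):
    """Map a normalized camera coordinate (0..1, origin top-left)
    to a pixel position on an assumed 1920x1080 screen."""
    x = min(max(nx, 0.0), 1.0) * (screen_w - 1)  # clamp, then scale
    y = min(max(ny, 0.0), 1.0) * (screen_h - 1)
    return int(x), int(y)

def smooth(prev, new, alpha=0.3):
    """Exponential moving average over successive cursor positions."""
    return (prev[0] + alpha * (new[0] - prev[0]),
            prev[1] + alpha * (new[1] - prev[1]))

# Example: index fingertip detected at the center of the camera frame
print(to_screen(0.5, 0.5))  # (959, 539)
```

In practice the resulting pixel position would be forwarded to an automation library each frame, e.g. PyAutoGUI's `moveTo` and `click`.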

🚀 Technologies

The following tools were used in this project:

- Python
- OpenCV
- MediaPipe
- Jupyter Notebook

✅ Requirements

Before starting, you need to have Git installed, along with Python and the deep-learning libraries listed in requirements.txt.

🏁 Starting

```bash
# Clone this project
$ git clone https://github.com/UtkarshPrajapati/Gesture-Controlled-Mouse.git

# Access
$ cd Gesture-Controlled-Mouse

# Install dependencies
$ pip install -r requirements.txt

# Run the project (the filename contains a space, so quote it)
$ jupyter nbconvert --execute "Mouse Controller.ipynb"
```

📝 Made By

Made with ❤️ by Utkarsh Prajapati
