Virtual Sign Language Interpreter

Scenario: Virtual Sign Language Interpreter

Designation: Gesture Recognition Specialist

Task: Create a hand gesture recognition model that detects specific alphabet gestures (e.g., B, A, P, Y) for use in a virtual sign language interpreter. This tool will assist users in communicating through hand gestures. Using the MPU6050 sensor, collect data to classify these letters accurately. The MPU6050 is a 6-axis motion tracking device capable of detecting changes in motion, acceleration, and rotation. Deploy the model on an ESP32 device. The solution must be original, effective, and deployable, built entirely during the hackathon.
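
As a starting point, a minimal Arduino sketch for the ESP32 that reads raw accelerometer and gyroscope samples from the MPU6050 over I2C and streams them as CSV for offline labelling might look like the following. The register addresses (0x6B, 0x3B) come from the MPU6050 datasheet; the wiring, sample rate, and output format are illustrative assumptions, not this repository's actual code.

```cpp
#include <Arduino.h>
#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;   // MPU6050 default I2C address (AD0 low)

void setup() {
  Serial.begin(115200);
  Wire.begin();                  // ESP32 defaults: SDA = GPIO21, SCL = GPIO22
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);              // PWR_MGMT_1 register
  Wire.write(0x00);              // clear the sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop() {
  // Burst-read 14 bytes starting at ACCEL_XOUT_H (0x3B):
  // accel XYZ, temperature, gyro XYZ, each a big-endian int16.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);   // repeated start, keep the bus
  Wire.requestFrom(MPU_ADDR, (uint8_t)14);

  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();
  Wire.read(); Wire.read();      // discard on-chip temperature
  int16_t gx = (Wire.read() << 8) | Wire.read();
  int16_t gy = (Wire.read() << 8) | Wire.read();
  int16_t gz = (Wire.read() << 8) | Wire.read();

  // One CSV line per sample so a host script can log labelled gesture data
  Serial.printf("%d,%d,%d,%d,%d,%d\n", ax, ay, az, gx, gy, gz);
  delay(10);                     // roughly 100 Hz sampling
}
```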

Ground Rules:

1. A minimum of four different entities must be classified.

2. Competitors are free to modify their problem statements as creatively as they wish, provided they inform one of the event organisers.

3. Judging criteria: model accuracy, output, and creativity.

4. The model must be able to classify the data in real time (one possible inference loop is sketched after this list).
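
For the real-time requirement, one lightweight approach is to buffer a short window of samples, reduce it to a small feature vector, and assign the nearest pre-computed class centroid. The sketch below illustrates that idea; the window size, per-axis-mean features, nearest-centroid classifier, and onSample() hook (called once per sample from the reading loop above) are all assumptions for illustration, not the model this repository actually uses.

```cpp
#include <Arduino.h>
#include <math.h>

const int WINDOW = 50;       // 50 samples at ~100 Hz = a 0.5 s gesture window
const int AXES = 6;          // ax, ay, az, gx, gy, gz
const int NUM_CLASSES = 4;   // minimum of four letters, e.g. B, A, P, Y

const char* labels[NUM_CLASSES] = {"B", "A", "P", "Y"};

// Per-class feature centroids (mean of each axis), learned offline from the
// collected data. Zero-filled placeholder here, not trained values.
float centroids[NUM_CLASSES][AXES] = {};

float windowBuf[WINDOW][AXES];
int sampleCount = 0;

// Return the index of the centroid closest to the feature vector
// (squared Euclidean distance; no sqrt needed for comparison).
int classify(const float features[AXES]) {
  int best = 0;
  float bestDist = INFINITY;
  for (int c = 0; c < NUM_CLASSES; c++) {
    float d = 0;
    for (int a = 0; a < AXES; a++) {
      float diff = features[a] - centroids[c][a];
      d += diff * diff;
    }
    if (d < bestDist) { bestDist = d; best = c; }
  }
  return best;
}

// Call once per sensor sample from loop(); emits a label every full window.
void onSample(const float sample[AXES]) {
  for (int a = 0; a < AXES; a++) windowBuf[sampleCount][a] = sample[a];
  if (++sampleCount < WINDOW) return;
  sampleCount = 0;

  // Feature vector: per-axis mean over the window.
  float features[AXES] = {0};
  for (int i = 0; i < WINDOW; i++)
    for (int a = 0; a < AXES; a++) features[a] += windowBuf[i][a];
  for (int a = 0; a < AXES; a++) features[a] /= WINDOW;

  Serial.println(labels[classify(features)]);
}
```

Per-axis means alone may not separate letters whose gestures differ mainly in motion dynamics; adding per-axis variance or min/max to the feature vector is a cheap extension that keeps the loop fast enough for real-time use on the ESP32.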

About

Project for Edge Fusion Showdown at Anokha 2024
