Scenario: Virtual Sign Language Interpreter
Designation: Gesture Recognition Specialist
Task: Create a hand gesture recognition model that detects specific alphabet gestures (e.g., B, A, P, Y) for use in a virtual sign language interpreter. This tool will assist users in communicating through hand gestures. Using the MPU6050 sensor, collect data to classify these letters accurately. The MPU6050 is a 6-axis motion-tracking device capable of detecting changes in motion, acceleration, and rotation. Deploy the model on an ESP32 device. The solution must be original, effective, and deployable, and must be built entirely during the hackathon.
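A possible first step, not prescribed by the brief, is a data-collection sketch: an Arduino-style program for the ESP32 that wakes the MPU6050 over I2C and streams raw 6-axis samples as CSV over serial, so labelled recordings of each letter gesture can be captured on a PC. The pin assignment (SDA on GPIO21, SCL on GPIO22) and the ~100 Hz sampling rate are assumptions; 0x68 is the sensor's default I2C address.

```cpp
// Minimal data-collection sketch (assumed wiring: SDA -> GPIO21, SCL -> GPIO22).
// Streams raw accelerometer/gyroscope readings over serial for offline labelling.
#include <Arduino.h>
#include <Wire.h>

const int MPU_ADDR = 0x68;            // default I2C address of the MPU6050

// Read one big-endian 16-bit value from the I2C receive buffer.
int16_t read16() {
  int16_t hi = Wire.read();
  return (hi << 8) | Wire.read();
}

void setup() {
  Serial.begin(115200);
  Wire.begin(21, 22);                 // SDA, SCL (assumed pins)
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                   // PWR_MGMT_1 register
  Wire.write(0);                      // clear the sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop() {
  // Point at ACCEL_XOUT_H (0x3B) and burst-read 14 bytes:
  // accel X/Y/Z, temperature, gyro X/Y/Z (two bytes each, big-endian).
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 14, true);

  int16_t ax = read16(), ay = read16(), az = read16();
  read16();                           // discard temperature
  int16_t gx = read16(), gy = read16(), gz = read16();

  // One CSV line per sample; the gesture label is added on the PC side.
  Serial.printf("%d,%d,%d,%d,%d,%d\n", ax, ay, az, gx, gy, gz);
  delay(10);                          // ~100 Hz sampling rate (assumed)
}
```

Each recording session would be tagged with the letter being signed; those labelled windows of samples then form the training set for the classifier.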
Ground Rules:
1. A minimum of four different entities must be classified.
2. Competitors may modify their problem statements as creatively as they wish, provided they inform any of the event organisers of the changes.
3. Judging criteria: model accuracy, quality of the output, and creativity.
4. The model must be able to classify sensor data in real time (see the inference sketch after this list).
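Ground rule 4 calls for on-device, real-time classification. One way to sketch that loop, assuming the same wiring and sampling rate as the collection sketch above, is a windowed nearest-centroid classifier: average each axis over a short window and pick the closest class mean. The readSample helper, window length, and centroid values below are hypothetical placeholders; the centroids would be replaced by statistics learned from the collected data, or the whole decision rule swapped for a small neural network exported with a framework such as TensorFlow Lite for Microcontrollers.

```cpp
// Illustrative real-time inference sketch: window the 6-axis stream, average
// each axis into a feature vector, and assign the nearest class centroid.
#include <Arduino.h>
#include <Wire.h>
#include <math.h>

const int MPU_ADDR = 0x68;
const int WINDOW = 50;                         // samples per gesture window (assumed)
const int N_CLASSES = 4;                       // the four letters to classify
const char LABELS[N_CLASSES] = {'A', 'B', 'P', 'Y'};

// Per-class mean feature vectors (ax, ay, az, gx, gy, gz) in raw sensor units.
// Placeholder numbers only; replace with values learned from the training data.
float CENTROIDS[N_CLASSES][6] = {
  { 2000, 15000,  3000,  100,  -50,   30},
  {16000,  1000,  -500,   20,   10,    5},
  { -800, -9000, 12000, -200,  300,  -80},
  { 7000,  7000,  7000,  500,  500,  500},
};

// Burst-read one ax,ay,az,gx,gy,gz sample (same register layout as above).
void readSample(float s[6]) {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                            // ACCEL_XOUT_H
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 14, true);
  for (int k = 0; k < 6; k++) {
    if (k == 3) { Wire.read(); Wire.read(); }  // skip the temperature bytes
    int16_t hi = Wire.read();
    int16_t lo = Wire.read();
    s[k] = (int16_t)((hi << 8) | lo);
  }
}

char classifyWindow() {
  float mean[6] = {0};
  float s[6];
  for (int i = 0; i < WINDOW; i++) {           // average each axis over the window
    readSample(s);
    for (int k = 0; k < 6; k++) mean[k] += s[k] / WINDOW;
    delay(10);                                 // ~100 Hz sampling (assumed)
  }
  int best = 0;
  float bestDist = INFINITY;
  for (int c = 0; c < N_CLASSES; c++) {        // nearest-centroid decision rule
    float d = 0;
    for (int k = 0; k < 6; k++) {
      float diff = mean[k] - CENTROIDS[c][k];
      d += diff * diff;
    }
    if (d < bestDist) { bestDist = d; best = c; }
  }
  return LABELS[best];
}

void setup() {
  Serial.begin(115200);
  Wire.begin(21, 22);                          // assumed SDA/SCL pins
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B); Wire.write(0);             // wake the MPU6050
  Wire.endTransmission(true);
}

void loop() {
  Serial.println(classifyWindow());            // one prediction per ~0.5 s window
}
```

With a 50-sample window at roughly 100 Hz, the sketch emits about two predictions per second, which keeps the interpreter responsive while still capturing the motion of a full gesture.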