🎥 Real-Time Confidence Detection using Facial & Hand Gesture 🤖👐
Image: Real-time confidence score overlay with face and hand tracking
📍 Project Overview
This project uses MediaPipe to analyze real-time webcam video, detect face and hand landmarks, and compute a confidence score from facial expressions, head movement, lip motion, and hand gestures. The combined score is displayed on screen and updated as you move.
✨ Features
Face Landmark Detection: Detects key facial landmarks such as the eyes and mouth, and estimates head orientation.
Hand Landmark Detection: Tracks the movements of hands and calculates the confidence based on gesture fluidity.
Real-Time Feedback: Displays a confidence percentage in real-time based on the user's facial and hand movements.
Dynamic Updates: Updates the confidence score frame by frame using webcam input.
🛠️ Technologies Used
| Technology | Description |
| --- | --- |
| MediaPipe | Detects facial and hand landmarks |
| JavaScript | Handles real-time video processing and rendering |
| HTML/CSS | Structure and styling of the user interface |
| Canvas API | Renders the face and hand landmarks on video frames |
Real-Time Face and Hand Detection
Face and hand landmarks overlaid on a live video stream.
Confidence Calculation
Confidence score displayed in real-time, influenced by facial expressions and hand gestures.
Hand Gesture Tracking
Shows hand gesture detection and movement calculation.
🚀 Getting Started
Follow these steps to run the project locally.

Prerequisites
A computer with a working webcam
A web browser that supports JavaScript and MediaPipe
Basic knowledge of HTML, CSS, and JavaScript
Installation Steps:
Clone the Repository:
```bash
git clone https://github.com/MorolShohan/Confidence-Detection-Using-Mediapipe-Facemash.git
```
Navigate to the project directory:
```bash
cd Confidence-Detection-Using-Mediapipe-Facemash
```
Open the HTML file: Open confidence.html in your preferred browser. Note that some browsers block webcam access for pages opened directly from the filesystem, so you may need to serve the folder from a local web server instead.
📐 Project Structure
```bash
├── confidence.html   # Main file
├── style.css         # Basic styling for the interface
├── confidence.js     # Business logic (landmark processing and scoring)
├── README.md         # Project documentation
```
📊 How it Works
Face and Hand Detection

The FaceMesh API from MediaPipe detects and tracks facial landmarks (such as the eyes, mouth, and nose), while the Hands API tracks hand gestures and positions. A minimal setup sketch is shown below.
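The exact wiring lives in confidence.js; the sketch below shows one common way to drive both solutions from a webcam with MediaPipe's JavaScript Solutions API. The CDN URLs and the `onFaceResults`/`onHandResults` handler names are illustrative, not necessarily what the repository uses.

```javascript
// Assumes the MediaPipe solution scripts are already loaded via <script> tags,
// e.g. @mediapipe/face_mesh, @mediapipe/hands, and @mediapipe/camera_utils.
const videoElement = document.querySelector('video');

const faceMesh = new FaceMesh({
  locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/face_mesh/${f}`,
});
faceMesh.setOptions({ maxNumFaces: 1, minDetectionConfidence: 0.5, minTrackingConfidence: 0.5 });
faceMesh.onResults(onFaceResults); // handler name is illustrative

const hands = new Hands({
  locateFile: (f) => `https://cdn.jsdelivr.net/npm/@mediapipe/hands/${f}`,
});
hands.setOptions({ maxNumHands: 2, minDetectionConfidence: 0.5, minTrackingConfidence: 0.5 });
hands.onResults(onHandResults); // handler name is illustrative

// Push every webcam frame through both solutions.
const camera = new Camera(videoElement, {
  onFrame: async () => {
    await faceMesh.send({ image: videoElement });
    await hands.send({ image: videoElement });
  },
  width: 640,
  height: 480,
});
camera.start();
```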
Confidence Calculation

Smile Detection: Scores confidence from the width/height ratio of the mouth.
Blink Detection: Tracks blink frequency and adjusts confidence accordingly.
Lip Movement: Monitors lip motion to detect speaking.
Hand Gesture Movement: Measures the speed and smoothness of hand motion.

A rough sketch of the facial metrics follows.
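As an illustration of these signals (not the repository's exact code), the smile and blink metrics can be computed from distances between FaceMesh landmarks. The indices follow the published FaceMesh topology; the thresholds are illustrative.

```javascript
// Distance between two normalized landmarks ({x, y, z} in 0..1 coordinates).
const dist = (a, b) => Math.hypot(a.x - b.x, a.y - b.y);

// Smile: a wide mouth relative to its opening height suggests a smile.
// 61/291 = mouth corners, 13/14 = inner upper/lower lip.
function smileScore(lm) {
  const ratio = dist(lm[61], lm[291]) / Math.max(dist(lm[13], lm[14]), 1e-6);
  return Math.min(Math.max((ratio - 2) / 6, 0), 1); // clamp to 0..1 (thresholds illustrative)
}

// Blink: eye aspect ratio for the left eye (159/145 = lids, 33/133 = corners).
function eyeOpenness(lm) {
  return dist(lm[159], lm[145]) / dist(lm[33], lm[133]);
}

let blinkCount = 0;
let eyeClosed = false;
function updateBlinks(lm) {
  const open = eyeOpenness(lm);
  if (open < 0.15 && !eyeClosed) { eyeClosed = true; blinkCount++; } // eye just closed
  else if (open > 0.25) { eyeClosed = false; }                       // eye reopened
}
```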
Final Confidence Score

The individual signals from the face and hand analysis are combined into a single score that is displayed in real time, giving dynamic feedback as the user moves; see the sketch below.
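A minimal sketch of the combination and display step, assuming each per-signal score has already been normalized to 0..1; the weights are illustrative, not the project's actual values.

```javascript
// Blend the individual signals into one percentage (weights illustrative).
function combinedConfidence({ smile, blinkSteadiness, lipActivity, handSmoothness }) {
  const raw = 0.4 * smile + 0.2 * blinkSteadiness + 0.2 * lipActivity + 0.2 * handSmoothness;
  return Math.round(raw * 100);
}

// Overlay the score on each frame with the Canvas API.
function drawScore(ctx, score) {
  ctx.font = '24px sans-serif';
  ctx.fillStyle = score >= 60 ? 'lime' : 'orange';
  ctx.fillText(`Confidence: ${score}%`, 10, 30);
}
```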
📦 Future Enhancements
Add support for detecting multiple faces.
Improve gesture recognition for more complex hand movements.
Introduce additional facial features like eyebrow movement analysis.
Enhance visualization with more detailed overlays and confidence breakdown.
🧑‍💻 Contributing
Contributions are welcome! Feel free to open an issue, or contribute by creating a pull request:
Fork the repository
Create a new branch (git checkout -b feature/your-feature-name)
Commit your changes (git commit -am 'Add new feature')
Push to the branch (git push origin feature/your-feature-name)
Create a Pull Request
📜 License
This project is licensed under the MIT License. Feel free to use, modify, and distribute this project as per the terms of the license.
💬 Contact
For any questions or suggestions, feel free to reach out:
Email: shohan.aiubcse@gmail.com
GitHub: ShohanMorol
⭐ Acknowledgments
Sazzad Hossain Sir
Assistant Professor, Department of Computer Science
Faculty of Science and Technology
American International University-Bangladesh
