
Sciphit-Hackathon-Project ✋🤖

A gesture-based communication system designed to interpret human gestures and translate them into relevant outputs, helping visually impaired and speech-impaired individuals communicate effectively.

This project was developed as part of the Sciphit Hackathon and leverages state-of-the-art computer vision and deep learning tools for real-time gesture recognition.


✨ Features

  • Real-time hand gesture recognition using MediaPipe Holistic pipelines.
  • Accurate classification with TensorFlow deep learning models.
  • Computer vision powered by OpenCV for live video stream processing.
  • Accessibility-focused: enables gesture-based communication for visually impaired and speech-impaired users.
  • Scalable system for future integration with assistive devices.
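To make the recognition pipeline concrete, here is a minimal sketch of the kind of preprocessing such a system typically applies before classification: converting raw hand landmarks (as MediaPipe reports them, 21 normalized (x, y) points per hand) into a translation- and scale-invariant feature vector. The function name and exact normalization are illustrative assumptions, not the repository's actual code.

```python
import math

def landmarks_to_features(landmarks):
    """Turn 21 (x, y) hand landmarks into a translation- and
    scale-invariant feature vector for a gesture classifier.

    `landmarks` is a list of (x, y) tuples in normalized image
    coordinates (hypothetical preprocessing step)."""
    # Translate so the wrist (landmark 0) sits at the origin.
    wx, wy = landmarks[0]
    rel = [(x - wx, y - wy) for x, y in landmarks]
    # Scale by the largest wrist-to-landmark distance so hand size
    # and camera distance do not change the features.
    scale = max(math.hypot(x, y) for x, y in rel) or 1.0
    return [coord / scale for point in rel for coord in point]

# A dummy hand: wrist at (0.5, 0.5), other landmarks fanned diagonally.
dummy_hand = [(0.5, 0.5)] + [(0.5 + 0.01 * i, 0.5 - 0.01 * i) for i in range(1, 21)]
features = landmarks_to_features(dummy_hand)
# 21 landmarks x 2 coordinates = 42 features, all within [-1, 1]
```

Normalizing like this lets the TensorFlow classifier focus on hand shape rather than where the hand appears in the frame.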

🛠️ Tech Stack

  • Python 🐍
  • TensorFlow 🔥
  • MediaPipe 🎯
  • OpenCV 👀
  • Holistic Pipelines

🚀 Installation & Setup

  1. Clone this repository:

    git clone https://github.com/Gupta-4388/Sciphit-Hackathon-Project.git
    cd Sciphit-Hackathon-Project
  2. Create a virtual environment (optional but recommended):

    # On Linux / Mac
    python -m venv venv
    source venv/bin/activate
    
    # On Windows
    python -m venv venv
    venv\Scripts\activate
    
  3. Install dependencies:

    pip install -r requirements.txt
  4. Run the application:

    python main.py
    
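The requirements.txt referenced above is not reproduced in this README. Based on the tech stack listed earlier, it would contain at least the following packages (package names only; this is an assumption, not the repository's actual file):

```text
tensorflow
mediapipe
opencv-python
```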

📂 Project Structure

    Sciphit-Hackathon-Project/
    │── main.py            # Entry point for running the system
    │── models/            # Trained ML/DL models
    │── data/              # Dataset (if included or linked)
    │── utils/             # Helper scripts
    │── requirements.txt   # Dependencies
    │── README.md          # Project documentation

🎯 Use Cases

  1. Helps visually impaired and speech-impaired individuals communicate.
  2. Can be extended to sign language recognition systems.
  3. Useful in human-computer interaction (HCI) applications.
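For the first use case, the final step of such a system is usually a small mapping from the classifier's predicted gesture label to an output message, gated by a confidence threshold so that uncertain predictions produce no output. The labels, messages, and threshold below are hypothetical; the real ones depend on the gestures the TensorFlow model in models/ was trained on.

```python
# Hypothetical label-to-message table (not the repository's actual labels).
GESTURE_MESSAGES = {
    "thumbs_up": "Yes",
    "thumbs_down": "No",
    "open_palm": "Hello",
    "fist": "Stop",
}

def gesture_to_message(label, confidence, threshold=0.8):
    """Map a recognized gesture label to an output message,
    ignoring predictions below the confidence threshold."""
    if confidence < threshold:
        return None  # too uncertain; emit nothing
    return GESTURE_MESSAGES.get(label, "Unknown gesture")

print(gesture_to_message("open_palm", confidence=0.95))  # Hello
print(gesture_to_message("fist", confidence=0.40))       # None
```

Thresholding like this keeps a live video feed from spamming spurious messages on every ambiguous frame.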

📸 Demo

(Demo screenshot available in the repository.)

🏆 Hackathon

This project was built as part of the Sciphit Hackathon, showcasing AI-powered assistive technology to make communication more inclusive.

🤝 Contributing

Contributions are welcome! Feel free to fork this repo, open issues, and submit pull requests.
