This project detects trash-throwing actions using computer vision. It uses YOLOv8 pose estimation to identify human poses characteristic of throwing trash.
- Pose Estimation: Uses YOLOv8 for detecting human poses.
- Action Detection: Identifies the action of throwing trash.
- Result Visualization: Provides visual outputs of the detection process.
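The repository does not document the exact rule it uses to turn pose keypoints into a "throwing" decision, but a common approach is a simple geometric test on the keypoints, e.g. a wrist rising above its shoulder. The sketch below illustrates that idea on the COCO-17 keypoint layout that YOLOv8 pose models output; `is_throwing_pose` is a hypothetical helper for illustration, not the project's actual function.

```python
# Illustrative heuristic on COCO-17 keypoints given as (x, y) pairs.
# Indices follow the COCO order used by YOLOv8 pose models:
# 5/6 = left/right shoulder, 9/10 = left/right wrist.
LEFT_SHOULDER, RIGHT_SHOULDER = 5, 6
LEFT_WRIST, RIGHT_WRIST = 9, 10

def is_throwing_pose(keypoints):
    """Return True if either wrist is raised above its shoulder.

    `keypoints` is a sequence of 17 (x, y) pairs in image coordinates,
    where the y-axis points downward (smaller y = higher in the frame).
    This is an assumed rule for illustration, not the repo's exact logic.
    """
    for wrist, shoulder in ((LEFT_WRIST, LEFT_SHOULDER),
                            (RIGHT_WRIST, RIGHT_SHOULDER)):
        _, wrist_y = keypoints[wrist]
        _, shoulder_y = keypoints[shoulder]
        if wrist_y < shoulder_y:  # wrist higher than shoulder in the frame
            return True
    return False
```

In practice the keypoints would come from the pose model, e.g. `results = YOLO("yolov8n-pose.pt")(frame)` followed by `results[0].keypoints.xy` in the Ultralytics API.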
- Clone the repository:
  `git clone https://github.com/bhaskarshukla002/Trash-Throwing-Action-Detection.git`
- Navigate to the project directory:
  `cd Trash-Throwing-Action-Detection`
- Install the required dependencies:
  `pip install opencv-python numpy torch ultralytics`
- Jupyter Notebook: open `main.ipynb` to run the detection process step by step.
- Python script: run `python python_implementation.py`.
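Whichever entry point is used, a per-frame pose test alone tends to fire on single noisy frames. A common mitigation, shown here as an assumption since the repository does not document its smoothing step, is to require the condition to hold for several consecutive frames before reporting a throw:

```python
class ActionDebouncer:
    """Report an action only after it has been seen in `min_frames`
    consecutive frames -- a typical smoothing step for per-frame pose
    heuristics (illustrative; not the repository's exact logic)."""

    def __init__(self, min_frames=5):
        self.min_frames = min_frames
        self.streak = 0  # length of the current run of positive frames

    def update(self, detected_this_frame):
        """Feed one frame's boolean result; return the smoothed verdict."""
        self.streak = self.streak + 1 if detected_this_frame else 0
        return self.streak >= self.min_frames
```

In a video loop this would wrap the per-frame pose check, so an alert is raised only after a sustained throwing pose.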
- `main.ipynb`: Jupyter Notebook for interactive development and testing.
- `python_implementation.py`: Python script for action detection.
- `result images/`: Directory containing result images.
- `videos/`: Directory containing input video files.
- `resultdemo.mp4`: Demo video showcasing the detection results.
- `yolov8n-pose.pt`: Pre-trained YOLOv8 pose-estimation model weights.
Contributions are welcome! Please submit a pull request or open an issue to discuss any changes.
This project is licensed under the MIT License.
For any inquiries, please contact the repository owner at [email protected].
For more details, visit the repository.