|
|
|
|
|
|
|
{ |
|
|
|
|
"nbformat": 4, |
|
|
|
|
"nbformat_minor": 0, |
|
|
|
|
"metadata": { |
|
|
|
|
"colab": { |
|
|
|
|
"provenance": [], |
|
|
|
|
"gpuType": "T4" |
|
|
|
|
}, |
|
|
|
|
"kernelspec": { |
|
|
|
|
"name": "python3", |
|
|
|
|
"display_name": "Python 3" |
|
|
|
|
}, |
|
|
|
|
"language_info": { |
|
|
|
|
"name": "python" |
|
|
|
|
}, |
|
|
|
|
"accelerator": "GPU" |
|
|
|
|
"cells": [ |
|
|
|
|
{ |
|
|
|
|
"cell_type": "markdown", |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "PN1cAxdvd61e" |
|
|
|
|
}, |
|
|
|
|
"source": [ |
|
|
|
|
"<div align=\"center\">\n", |
|
|
|
|
"\n", |
|
|
|
|
" <a href=\"https://ultralytics.com/yolov8\" target=\"_blank\">\n", |
|
|
|
|
" <img width=\"1024\", src=\"https://raw.githubusercontent.com/ultralytics/assets/main/yolov8/banner-yolov8.png\"></a>\n", |
|
|
|
|
"\n", |
|
|
|
|
" [中文](https://docs.ultralytics.com/zh/) | [한국어](https://docs.ultralytics.com/ko/) | [日本語](https://docs.ultralytics.com/ja/) | [Русский](https://docs.ultralytics.com/ru/) | [Deutsch](https://docs.ultralytics.com/de/) | [Français](https://docs.ultralytics.com/fr/) | [Español](https://docs.ultralytics.com/es/) | [Português](https://docs.ultralytics.com/pt/) | [Türkçe](https://docs.ultralytics.com/tr/) | [Tiếng Việt](https://docs.ultralytics.com/vi/) | [हिन्दी](https://docs.ultralytics.com/hi/) | [العربية](https://docs.ultralytics.com/ar/)\n", |
|
|
|
|
"\n", |
|
|
|
|
" <a href=\"https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml\"><img src=\"https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml/badge.svg\" alt=\"Ultralytics CI\"></a>\n", |
|
|
|
|
" <a href=\"https://console.paperspace.com/github/ultralytics/ultralytics\"><img src=\"https://assets.paperspace.io/img/gradient-badge.svg\" alt=\"Run on Gradient\"/></a>\n", |
|
|
|
|
" <a href=\"https://colab.research.google.com/github/ultralytics/ultralytics/blob/main/examples/object_counting.ipynb\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"></a>\n", |
|
|
|
|
" <a href=\"https://www.kaggle.com/ultralytics/yolov8\"><img src=\"https://kaggle.com/static/images/open-in-kaggle.svg\" alt=\"Open In Kaggle\"></a>\n", |
|
|
|
|
" <a href=\"https://ultralytics.com/discord\"><img alt=\"Discord\" src=\"https://img.shields.io/discord/1089800235347353640?logo=discord&logoColor=white&label=Discord&color=blue\"></a>\n", |
|
|
|
|
"\n", |
|
|
|
|
"Welcome to the Ultralytics YOLOv8 🚀 notebook! <a href=\"https://github.com/ultralytics/ultralytics\">YOLOv8</a> is the latest version of the YOLO (You Only Look Once) AI models developed by <a href=\"https://ultralytics.com\">Ultralytics</a>. This notebook serves as the starting point for exploring the various resources available to help you get started with YOLOv8 and understand its features and capabilities.\n", |
|
|
|
|
"\n", |
|
|
|
|
"YOLOv8 models are fast, accurate, and easy to use, making them ideal for various object detection and image segmentation tasks. They can be trained on large datasets and run on diverse hardware platforms, from CPUs to GPUs.\n", |
|
|
|
|
"\n", |
|
|
|
|
"We hope that the resources in this notebook will help you get the most out of YOLOv8. Please browse the YOLOv8 <a href=\"https://docs.ultralytics.com/guides/object-counting/\"> Object Counting Docs</a> for details, raise an issue on <a href=\"https://github.com/ultralytics/ultralytics\">GitHub</a> for support, and join our <a href=\"https://ultralytics.com/discord\">Discord</a> community for questions and discussions!\n", |
|
|
|
|
"\n", |
|
|
|
|
"</div>" |
|
|
|
|
] |
|
|
|
|
}, |
|
|
|
|
"cells": [ |
|
|
|
|
{ |
|
|
|
|
"cell_type": "markdown", |
|
|
|
|
"source": [ |
|
|
|
|
"<div align=\"center\">\n", |
|
|
|
|
"\n", |
|
|
|
|
" <a href=\"https://ultralytics.com/yolov8\" target=\"_blank\">\n", |
|
|
|
|
" <img width=\"1024\", src=\"https://raw.githubusercontent.com/ultralytics/assets/main/yolov8/banner-yolov8.png\"></a>\n", |
|
|
|
|
"\n", |
|
|
|
|
" [中文](https://docs.ultralytics.com/zh/) | [한국어](https://docs.ultralytics.com/ko/) | [日本語](https://docs.ultralytics.com/ja/) | [Русский](https://docs.ultralytics.com/ru/) | [Deutsch](https://docs.ultralytics.com/de/) | [Français](https://docs.ultralytics.com/fr/) | [Español](https://docs.ultralytics.com/es/) | [Português](https://docs.ultralytics.com/pt/) | [Türkçe](https://docs.ultralytics.com/tr/) | [Tiếng Việt](https://docs.ultralytics.com/vi/) | [हिन्दी](https://docs.ultralytics.com/hi/) | [العربية](https://docs.ultralytics.com/ar/)\n", |
|
|
|
|
"\n", |
|
|
|
|
" <a href=\"https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml\"><img src=\"https://github.com/ultralytics/ultralytics/actions/workflows/ci.yaml/badge.svg\" alt=\"Ultralytics CI\"></a>\n", |
|
|
|
|
" <a href=\"https://console.paperspace.com/github/ultralytics/ultralytics\"><img src=\"https://assets.paperspace.io/img/gradient-badge.svg\" alt=\"Run on Gradient\"/></a>\n", |
|
|
|
|
" <a href=\"https://colab.research.google.com/github/ultralytics/ultralytics/blob/main/examples/object_counting.ipynb\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"></a>\n", |
|
|
|
|
" <a href=\"https://www.kaggle.com/ultralytics/yolov8\"><img src=\"https://kaggle.com/static/images/open-in-kaggle.svg\" alt=\"Open In Kaggle\"></a>\n", |
|
|
|
|
" <a href=\"https://ultralytics.com/discord\"><img alt=\"Discord\" src=\"https://img.shields.io/discord/1089800235347353640?logo=discord&logoColor=white&label=Discord&color=blue\"></a>\n", |
|
|
|
|
"\n", |
|
|
|
|
"Welcome to the Ultralytics YOLOv8 🚀 notebook! <a href=\"https://github.com/ultralytics/ultralytics\">YOLOv8</a> is the latest version of the YOLO (You Only Look Once) AI models developed by <a href=\"https://ultralytics.com\">Ultralytics</a>. This notebook serves as the starting point for exploring the various resources available to help you get started with YOLOv8 and understand its features and capabilities.\n", |
|
|
|
|
"\n", |
|
|
|
|
"YOLOv8 models are fast, accurate, and easy to use, making them ideal for various object detection and image segmentation tasks. They can be trained on large datasets and run on diverse hardware platforms, from CPUs to GPUs.\n", |
|
|
|
|
"\n", |
|
|
|
|
"We hope that the resources in this notebook will help you get the most out of YOLOv8. Please browse the YOLOv8 <a href=\"https://docs.ultralytics.com/guides/object-counting/\"> Object Counting Docs</a> for details, raise an issue on <a href=\"https://github.com/ultralytics/ultralytics\">GitHub</a> for support, and join our <a href=\"https://ultralytics.com/discord\">Discord</a> community for questions and discussions!\n", |
|
|
|
|
"\n", |
|
|
|
|
"</div>" |
|
|
|
|
], |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "PN1cAxdvd61e" |
|
|
|
|
} |
|
|
|
|
}, |
|
|
|
|
{ |
|
|
|
|
"cell_type": "markdown", |
|
|
|
|
"source": [ |
|
|
|
|
"# Setup\n", |
|
|
|
|
"\n", |
|
|
|
|
"Pip install `ultralytics` and [dependencies](https://github.com/ultralytics/ultralytics/blob/main/pyproject.toml) and check software and hardware.\n", |
|
|
|
|
"\n", |
|
|
|
|
"[![PyPI - Version](https://img.shields.io/pypi/v/ultralytics?logo=pypi&logoColor=white)](https://pypi.org/project/ultralytics/) [![Downloads](https://static.pepy.tech/badge/ultralytics)](https://pepy.tech/project/ultralytics) [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/ultralytics?logo=python&logoColor=gold)](https://pypi.org/project/ultralytics/)" |
|
|
|
|
], |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "o68Sg1oOeZm2" |
|
|
|
|
} |
|
|
|
|
}, |
|
|
|
|
{ |
|
|
|
|
"cell_type": "code", |
|
|
|
|
"execution_count": 1, |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "9dSwz_uOReMI", |
|
|
|
|
"outputId": "fd3bab88-2f25-46c0-cae9-04d2beedc0c1", |
|
|
|
|
"colab": { |
|
|
|
|
"base_uri": "https://localhost:8080/" |
|
|
|
|
} |
|
|
|
|
}, |
|
|
|
|
"outputs": [ |
|
|
|
|
{ |
|
|
|
|
"output_type": "stream", |
|
|
|
|
"name": "stdout", |
|
|
|
|
"text": [ |
|
|
|
|
"Ultralytics YOLOv8.2.18 🚀 Python-3.10.12 torch-2.2.1+cu121 CUDA:0 (Tesla T4, 15102MiB)\n", |
|
|
|
|
"Setup complete ✅ (2 CPUs, 12.7 GB RAM, 29.8/78.2 GB disk)\n" |
|
|
|
|
] |
|
|
|
|
} |
|
|
|
|
], |
|
|
|
|
"source": [ |
|
|
|
|
"%pip install ultralytics\n", |
|
|
|
|
"import ultralytics\n", |
|
|
|
|
"ultralytics.checks()" |
|
|
|
|
] |
|
|
|
|
}, |
|
|
|
|
|
|
|
|
{ |
|
|
|
|
"cell_type": "markdown", |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "m7VkxQ2aeg7k" |
|
|
|
|
}, |
|
|
|
|
"source": [ |
|
|
|
|
"# Object Counting using Ultralytics YOLOv8 🚀\n", |
|
|
|
|
"\n", |
|
|
|
|
"## What is Object Counting?\n", |
|
|
|
|
"\n", |
|
|
|
|
"Object counting with [Ultralytics YOLOv8](https://github.com/ultralytics/ultralytics/) involves accurate identification and counting of specific objects in videos and camera streams. YOLOv8 excels in real-time applications, providing efficient and precise object counting for various scenarios like crowd analysis and surveillance, thanks to its state-of-the-art algorithms and deep learning capabilities.\n", |
|
|
|
|
"\n", |
|
|
|
|
"## Advantages of Object Counting?\n", |
|
|
|
|
"\n", |
|
|
|
|
"- **Resource Optimization:** Object counting facilitates efficient resource management by providing accurate counts, and optimizing resource allocation in applications like inventory management.\n", |
|
|
|
|
"- **Enhanced Security:** Object counting enhances security and surveillance by accurately tracking and counting entities, aiding in proactive threat detection.\n", |
|
|
|
|
"- **Informed Decision-Making:** Object counting offers valuable insights for decision-making, optimizing processes in retail, traffic management, and various other domains.\n", |
|
|
|
|
"\n", |
|
|
|
|
"## Real World Applications\n", |
|
|
|
|
"\n", |
|
|
|
|
"| Logistics | Aquaculture |\n", |
|
|
|
|
"|:-------------------------------------------------------------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------:|\n", |
|
|
|
|
"| ![Conveyor Belt Packets Counting Using Ultralytics YOLOv8](https://github.com/RizwanMunawar/ultralytics/assets/62513924/70e2d106-510c-4c6c-a57a-d34a765aa757) | ![Fish Counting in Sea using Ultralytics YOLOv8](https://github.com/RizwanMunawar/ultralytics/assets/62513924/c60d047b-3837-435f-8d29-bb9fc95d2191) |\n", |
|
|
|
|
"| Conveyor Belt Packets Counting Using Ultralytics YOLOv8 | Fish Counting in Sea using Ultralytics YOLOv8 |\n" |
|
|
|
|
] |
|
|
|
|
}, |
|
|
|
|
{ |
|
|
|
|
"cell_type": "code", |
|
|
|
|
"execution_count": null, |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "Cx-u59HQdu2o" |
|
|
|
|
}, |
|
|
|
|
"outputs": [], |
|
|
|
|
"source": [ |
|
|
|
|
"import cv2\n", |
|
|
|
|
"\n", |
|
|
|
|
"from ultralytics import YOLO, solutions\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Load the pre-trained YOLOv8 model\n", |
|
|
|
|
"model = YOLO(\"yolov8n.pt\")\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Open the video file\n", |
|
|
|
|
"cap = cv2.VideoCapture(\"path/to/video/file.mp4\")\n", |
|
|
|
|
"assert cap.isOpened(), \"Error reading video file\"\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Get video properties: width, height, and frames per second (fps)\n", |
|
|
|
|
"w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Define points for a line or region of interest in the video frame\n", |
|
|
|
|
"line_points = [(20, 400), (1080, 400)] # Line coordinates\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Specify classes to count, for example: person (0) and car (2)\n", |
|
|
|
|
"classes_to_count = [0, 2] # Class IDs for person and car\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Initialize the video writer to save the output video\n", |
|
|
|
|
"video_writer = cv2.VideoWriter(\"object_counting_output.avi\", cv2.VideoWriter_fourcc(*\"mp4v\"), fps, (w, h))\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Initialize the Object Counter with visualization options and other parameters\n", |
|
|
|
|
"counter = solutions.ObjectCounter(\n", |
|
|
|
|
" view_img=True, # Display the image during processing\n", |
|
|
|
|
" reg_pts=line_points, # Region of interest points\n", |
|
|
|
|
" classes_names=model.names, # Class names from the YOLO model\n", |
|
|
|
|
" draw_tracks=True, # Draw tracking lines for objects\n", |
|
|
|
|
" line_thickness=2, # Thickness of the lines drawn\n", |
|
|
|
|
")\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Process video frames in a loop\n", |
|
|
|
|
"while cap.isOpened():\n", |
|
|
|
|
" success, im0 = cap.read()\n", |
|
|
|
|
" if not success:\n", |
|
|
|
|
" print(\"Video frame is empty or video processing has been successfully completed.\")\n", |
|
|
|
|
" break\n", |
|
|
|
|
"\n", |
|
|
|
|
" # Perform object tracking on the current frame, filtering by specified classes\n", |
|
|
|
|
" tracks = model.track(im0, persist=True, show=False, classes=classes_to_count)\n", |
|
|
|
|
"\n", |
|
|
|
|
" # Use the Object Counter to count objects in the frame and get the annotated image\n", |
|
|
|
|
" im0 = counter.start_counting(im0, tracks)\n", |
|
|
|
|
"\n", |
|
|
|
|
" # Write the annotated frame to the output video\n", |
|
|
|
|
" video_writer.write(im0)\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Release the video capture and writer objects\n", |
|
|
|
|
"cap.release()\n", |
|
|
|
|
"video_writer.release()\n", |
|
|
|
|
"\n", |
|
|
|
|
"# Close all OpenCV windows\n", |
|
|
|
|
"cv2.destroyAllWindows()" |
|
|
|
|
] |
|
|
|
|
}, |
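{
  "cell_type": "markdown",
  "metadata": {},
  "source": [
    "## Region-Based Counting (Optional)\n",
    "\n",
    "The example above counts objects as they cross a two-point line. The [Object Counting Docs](https://docs.ultralytics.com/guides/object-counting/) also describe counting objects inside a polygonal region. The cell below is a minimal sketch of that variant, assuming `reg_pts` accepts a four-point polygon with the same `solutions.ObjectCounter` arguments used above; the video path and region coordinates are illustrative placeholders."
  ]
},
{
  "cell_type": "code",
  "execution_count": null,
  "metadata": {},
  "outputs": [],
  "source": [
    "import cv2\n",
    "\n",
    "from ultralytics import YOLO, solutions\n",
    "\n",
    "# Minimal sketch: region-based counting (video path and coordinates are placeholders)\n",
    "model = YOLO(\"yolov8n.pt\")\n",
    "cap = cv2.VideoCapture(\"path/to/video/file.mp4\")\n",
    "assert cap.isOpened(), \"Error reading video file\"\n",
    "\n",
    "# Get video properties: width, height, and frames per second (fps)\n",
    "w, h, fps = (int(cap.get(x)) for x in (cv2.CAP_PROP_FRAME_WIDTH, cv2.CAP_PROP_FRAME_HEIGHT, cv2.CAP_PROP_FPS))\n",
    "\n",
    "# Four-point polygon (a thin rectangle here) instead of the two-point line used above\n",
    "region_points = [(20, 400), (1080, 400), (1080, 360), (20, 360)]\n",
    "\n",
    "# Initialize the video writer to save the output video\n",
    "video_writer = cv2.VideoWriter(\"region_counting_output.avi\", cv2.VideoWriter_fourcc(*\"mp4v\"), fps, (w, h))\n",
    "\n",
    "# Initialize the Object Counter with a polygonal region\n",
    "counter = solutions.ObjectCounter(\n",
    "    view_img=True,  # Display the image during processing\n",
    "    reg_pts=region_points,  # Polygonal region instead of a line\n",
    "    classes_names=model.names,  # Class names from the YOLO model\n",
    "    draw_tracks=True,  # Draw tracking lines for objects\n",
    "    line_thickness=2,  # Thickness of the lines drawn\n",
    ")\n",
    "\n",
    "# Process video frames in a loop\n",
    "while cap.isOpened():\n",
    "    success, im0 = cap.read()\n",
    "    if not success:\n",
    "        break\n",
    "    tracks = model.track(im0, persist=True, show=False)  # Track all classes by default\n",
    "    im0 = counter.start_counting(im0, tracks)  # Count objects inside the region\n",
    "    video_writer.write(im0)\n",
    "\n",
    "# Release the video capture and writer objects, then close OpenCV windows\n",
    "cap.release()\n",
    "video_writer.release()\n",
    "cv2.destroyAllWindows()"
  ]
},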
|
|
|
|
{ |
|
|
|
|
"cell_type": "markdown", |
|
|
|
|
"metadata": { |
|
|
|
|
"id": "QrlKg-y3fEyD" |
|
|
|
|
}, |
|
|
|
|
"source": [ |
|
|
|
|
"# Additional Resources\n", |
|
|
|
|
"\n", |
|
|
|
|
"## Community Support\n", |
|
|
|
|
"\n", |
|
|
|
|
"For more information on counting objects with Ultralytics, you can explore the comprehensive [Ultralytics Object Counting Docs](https://docs.ultralytics.com/guides/object-counting/). This guide covers everything from basic concepts to advanced techniques, ensuring you get the most out of counting and visualization.\n", |
|
|
|
|
"\n", |
|
|
|
|
"## Ultralytics ⚡ Resources\n", |
|
|
|
|
"\n", |
|
|
|
|
"At Ultralytics, we are committed to providing cutting-edge AI solutions. Here are some key resources to learn more about our company and get involved with our community:\n", |
|
|
|
|
"\n", |
|
|
|
|
"- [Ultralytics HUB](https://ultralytics.com/hub): Simplify your AI projects with Ultralytics HUB, our no-code tool for effortless YOLO training and deployment.\n", |
|
|
|
|
"- [Ultralytics Licensing](https://ultralytics.com/license): Review our licensing terms to understand how you can use our software in your projects.\n", |
|
|
|
|
"- [About Us](https://ultralytics.com/about): Discover our mission, vision, and the story behind Ultralytics.\n", |
|
|
|
|
"- [Join Our Team](https://ultralytics.com/work): Explore career opportunities and join our team of talented professionals.\n", |
|
|
|
|
"\n", |
|
|
|
|
"## YOLOv8 🚀 Resources\n", |
|
|
|
|
"\n", |
|
|
|
|
"YOLOv8 is the latest evolution in the YOLO series, offering state-of-the-art performance in object detection and image segmentation. Here are some essential resources to help you get started with YOLOv8:\n", |
|
|
|
|
"\n", |
|
|
|
|
"- [GitHub](https://github.com/ultralytics/ultralytics): Access the YOLOv8 repository on GitHub, where you can find the source code, contribute to the project, and report issues.\n", |
|
|
|
|
"- [Docs](https://docs.ultralytics.com/): Explore the official documentation for YOLOv8, including installation guides, tutorials, and detailed API references.\n", |
|
|
|
|
"- [Discord](https://ultralytics.com/discord): Join our Discord community to connect with other users, share your projects, and get help from the Ultralytics team.\n", |
|
|
|
|
"\n", |
|
|
|
|
"These resources are designed to help you leverage the full potential of Ultralytics' offerings and YOLOv8. Whether you're a beginner or an experienced developer, you'll find the information and support you need to succeed." |
|
|
|
|
] |
|
|
|
|
} |
|
|
|
|
], |
|
|
|
|
"metadata": { |
|
|
|
|
"accelerator": "GPU", |
|
|
|
|
"colab": { |
|
|
|
|
"gpuType": "T4", |
|
|
|
|
"provenance": [] |
|
|
|
|
}, |
|
|
|
|
"kernelspec": { |
|
|
|
|
"display_name": "Python 3", |
|
|
|
|
"name": "python3" |
|
|
|
|
}, |
|
|
|
|
"language_info": { |
|
|
|
|
"name": "python" |
|
|
|
|
} |
|
|
|
|
}, |
|
|
|
|
"nbformat": 4, |
|
|
|
|
"nbformat_minor": 0 |
|
|
|
|
} |
|
|
|
|