From 6ebbe17bd82498140e5001ef8d39dfc023163a5d Mon Sep 17 00:00:00 2001
From: Muhammad Rizwan Munawar
Date: Tue, 22 Oct 2024 22:50:08 +0500
Subject: [PATCH] Add YOLO publication notice in Docs (#17095)

Co-authored-by: UltralyticsAssistant
Co-authored-by: Glenn Jocher
---
 docs/en/models/yolo11.md | 6 +++++-
 docs/en/models/yolov5.md | 6 +++++-
 docs/en/models/yolov8.md | 4 ++++
 3 files changed, 14 insertions(+), 2 deletions(-)

diff --git a/docs/en/models/yolo11.md b/docs/en/models/yolo11.md
index 8baf2dd725..fe9115f2ed 100644
--- a/docs/en/models/yolo11.md
+++ b/docs/en/models/yolo11.md
@@ -8,9 +8,13 @@ keywords: YOLO11, state-of-the-art object detection, YOLO series, Ultralytics, c
 
 ## Overview
 
+!!! tip "Ultralytics YOLO11 Publication"
+
+    Ultralytics has not published a formal research paper for YOLO11 due to the rapidly evolving nature of the models. We focus on advancing the technology and making it easier to use, rather than producing static documentation. For the most up-to-date information on YOLO architecture, features, and usage, please refer to our [GitHub repository](https://github.com/ultralytics/ultralytics) and [documentation](https://docs.ultralytics.com).
+
 YOLO11 is the latest iteration in the [Ultralytics](https://www.ultralytics.com/) YOLO series of real-time object detectors, redefining what's possible with cutting-edge [accuracy](https://www.ultralytics.com/glossary/accuracy), speed, and efficiency. Building upon the impressive advancements of previous YOLO versions, YOLO11 introduces significant improvements in architecture and training methods, making it a versatile choice for a wide range of [computer vision](https://www.ultralytics.com/glossary/computer-vision-cv) tasks.
 
-![Ultralytics YOLO11 Comparison Plots](hhttps://raw.githubusercontent.com/ultralytics/assets/refs/heads/main/yolo/performance-comparison.png)
+![Ultralytics YOLO11 Comparison Plots](https://raw.githubusercontent.com/ultralytics/assets/refs/heads/main/yolo/performance-comparison.png)
 
 
diff --git a/docs/en/models/yolov5.md b/docs/en/models/yolov5.md
index 8ff1c36ec0..91c562a44e 100644
--- a/docs/en/models/yolov5.md
+++ b/docs/en/models/yolov5.md
@@ -4,7 +4,11 @@ description: Explore YOLOv5u, an advanced object detection model with optimized
 keywords: YOLOv5, YOLOv5u, object detection, Ultralytics, anchor-free, pre-trained models, accuracy, speed, real-time detection
 ---
 
-# YOLOv5
+# Ultralytics YOLOv5
+
+!!! tip "Ultralytics YOLOv5 Publication"
+
+    Ultralytics has not published a formal research paper for YOLOv5 due to the rapidly evolving nature of the models. We focus on advancing the technology and making it easier to use, rather than producing static documentation. For the most up-to-date information on YOLO architecture, features, and usage, please refer to our [GitHub repository](https://github.com/ultralytics/ultralytics) and [documentation](https://docs.ultralytics.com).
 
 ## Overview
 
diff --git a/docs/en/models/yolov8.md b/docs/en/models/yolov8.md
index 036cd305a1..c8e4397d15 100644
--- a/docs/en/models/yolov8.md
+++ b/docs/en/models/yolov8.md
@@ -6,6 +6,10 @@ keywords: YOLOv8, real-time object detection, YOLO series, Ultralytics, computer
 
 # Ultralytics YOLOv8
 
+!!! tip "Ultralytics YOLOv8 Publication"
+
+    Ultralytics has not published a formal research paper for YOLOv8 due to the rapidly evolving nature of the models. We focus on advancing the technology and making it easier to use, rather than producing static documentation. For the most up-to-date information on YOLO architecture, features, and usage, please refer to our [GitHub repository](https://github.com/ultralytics/ultralytics) and [documentation](https://docs.ultralytics.com).
+
 ## Overview
 
 YOLOv8 is the latest iteration in the YOLO series of real-time object detectors, offering cutting-edge performance in terms of accuracy and speed. Building upon the advancements of previous YOLO versions, YOLOv8 introduces new features and optimizations that make it an ideal choice for various [object detection](https://www.ultralytics.com/glossary/object-detection) tasks in a wide range of applications.