Update HTTP to HTTPS (#7548)

Signed-off-by: Glenn Jocher <glenn.jocher@ultralytics.com>
Branch: main
Glenn Jocher committed 1 year ago (via GitHub)
parent 83165ffe9c
commit 0da13831cf
GPG Key ID: 4AEE18F83AFDEB23 (no known key found for this signature in database)
  1. README.md (6 lines changed)
  2. README.zh-CN.md (6 lines changed)
  3. docs/ar/tasks/detect.md (2 lines changed)
  4. docs/ar/tasks/pose.md (2 lines changed)
  5. docs/ar/tasks/segment.md (2 lines changed)
  6. docs/de/tasks/detect.md (2 lines changed)
  7. docs/de/tasks/pose.md (2 lines changed)
  8. docs/de/tasks/segment.md (2 lines changed)
  9. docs/en/datasets/classify/imagenet.md (2 lines changed)
  10. docs/en/datasets/classify/imagenette.md (2 lines changed)
  11. docs/en/datasets/detect/globalwheat2020.md (4 lines changed)
  12. docs/en/robots.txt (22 lines changed)
  13. docs/en/tasks/detect.md (2 lines changed)
  14. docs/en/tasks/pose.md (2 lines changed)
  15. docs/en/tasks/segment.md (2 lines changed)
  16. docs/en/yolov5/tutorials/tips_for_best_training_results.md (2 lines changed)
  17. docs/en/yolov5/tutorials/train_custom_data.md (2 lines changed)
  18. docs/es/tasks/detect.md (2 lines changed)
  19. docs/es/tasks/pose.md (2 lines changed)
  20. docs/es/tasks/segment.md (2 lines changed)
  21. docs/fr/tasks/detect.md (2 lines changed)
  22. docs/fr/tasks/pose.md (2 lines changed)
  23. docs/fr/tasks/segment.md (2 lines changed)
  24. docs/hi/tasks/detect.md (2 lines changed)
  25. docs/hi/tasks/pose.md (2 lines changed)
  26. docs/hi/tasks/segment.md (2 lines changed)
  27. docs/ja/tasks/detect.md (2 lines changed)
  28. docs/ja/tasks/pose.md (2 lines changed)
  29. docs/ja/tasks/segment.md (2 lines changed)
  30. docs/ko/tasks/detect.md (2 lines changed)
  31. docs/ko/tasks/pose.md (2 lines changed)
  32. docs/ko/tasks/segment.md (2 lines changed)
  33. docs/pt/tasks/detect.md (2 lines changed)
  34. docs/pt/tasks/pose.md (2 lines changed)
  35. docs/pt/tasks/segment.md (2 lines changed)
  36. docs/ru/tasks/detect.md (2 lines changed)
  37. docs/ru/tasks/pose.md (2 lines changed)
  38. docs/ru/tasks/segment.md (2 lines changed)
  39. docs/zh/tasks/detect.md (2 lines changed)
  40. docs/zh/tasks/pose.md (2 lines changed)
  41. docs/zh/tasks/segment.md (2 lines changed)
  42. tests/test_python.py (2 lines changed)
  43. ultralytics/cfg/datasets/Argoverse.yaml (2 lines changed)
  44. ultralytics/cfg/datasets/GlobalWheat2020.yaml (2 lines changed)
  45. ultralytics/cfg/datasets/coco-pose.yaml (2 lines changed)
  46. ultralytics/cfg/datasets/coco.yaml (2 lines changed)
  47. ultralytics/data/scripts/get_coco.sh (2 lines changed)
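The change in this commit is mechanical: `http://` links are upgraded to `https://` for a fixed set of hosts known to serve HTTPS. A rewrite like this can be sketched in Python; the host list and helper names below are illustrative, not the exact tooling used for the PR:

```python
import re
from pathlib import Path

# Illustrative subset of the hosts touched by this PR; all serve HTTPS.
HOSTS = ("cocodataset.org", "image-net.org", "global-wheat.com",
         "docs.ultralytics.com", "karpathy.github.io")

# Match http:// only when followed by a whitelisted host (optionally www.).
PATTERN = re.compile(r"http://((?:www\.)?(?:%s))" % "|".join(re.escape(h) for h in HOSTS))

def upgrade(text: str) -> str:
    """Rewrite http:// links to https:// for the whitelisted hosts only."""
    return PATTERN.sub(r"https://\1", text)

def upgrade_file(path: Path) -> bool:
    """Apply the rewrite in place; return True if the file changed."""
    old = path.read_text(encoding="utf-8")
    new = upgrade(old)
    if new != old:
        path.write_text(new, encoding="utf-8")
    return new != old
```

Scoping the regex to known hosts avoids accidentally rewriting links to servers that are still HTTP-only.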

@@ -131,7 +131,7 @@ See [Detection Docs](https://docs.ultralytics.com/tasks/detect/) for usage examp
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset. <br>Reproduce by `yolo val detect data=coco.yaml device=0`
+- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset. <br>Reproduce by `yolo val detect data=coco.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. <br>Reproduce by `yolo val detect data=coco.yaml batch=1 device=0|cpu`
</details>
@@ -165,7 +165,7 @@ See [Segmentation Docs](https://docs.ultralytics.com/tasks/segment/) for usage e
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset. <br>Reproduce by `yolo val segment data=coco-seg.yaml device=0`
+- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset. <br>Reproduce by `yolo val segment data=coco-seg.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. <br>Reproduce by `yolo val segment data=coco-seg.yaml batch=1 device=0|cpu`
</details>
@@ -183,7 +183,7 @@ See [Pose Docs](https://docs.ultralytics.com/tasks/pose/) for usage examples wit
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO Keypoints val2017](http://cocodataset.org) dataset. <br>Reproduce by `yolo val pose data=coco-pose.yaml device=0`
+- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO Keypoints val2017](https://cocodataset.org) dataset. <br>Reproduce by `yolo val pose data=coco-pose.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. <br>Reproduce by `yolo val pose data=coco-pose.yaml batch=1 device=0|cpu`
</details>

@@ -133,7 +133,7 @@ Ultralytics 提供了 YOLOv8 的交互式笔记本,涵盖训练、验证、跟
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- **mAP<sup>val</sup>** 值是基于单模型单尺度在 [COCO val2017](http://cocodataset.org) 数据集上的结果。 <br>通过 `yolo val detect data=coco.yaml device=0` 复现
+- **mAP<sup>val</sup>** 值是基于单模型单尺度在 [COCO val2017](https://cocodataset.org) 数据集上的结果。 <br>通过 `yolo val detect data=coco.yaml device=0` 复现
- **速度** 是使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 实例对 COCO val 图像进行平均计算的。 <br>通过 `yolo val detect data=coco.yaml batch=1 device=0|cpu` 复现
</details>
@@ -167,7 +167,7 @@ Ultralytics 提供了 YOLOv8 的交互式笔记本,涵盖训练、验证、跟
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- **mAP<sup>val</sup>** 值是基于单模型单尺度在 [COCO val2017](http://cocodataset.org) 数据集上的结果。 <br>通过 `yolo val segment data=coco-seg.yaml device=0` 复现
+- **mAP<sup>val</sup>** 值是基于单模型单尺度在 [COCO val2017](https://cocodataset.org) 数据集上的结果。 <br>通过 `yolo val segment data=coco-seg.yaml device=0` 复现
- **速度** 是使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 实例对 COCO val 图像进行平均计算的。 <br>通过 `yolo val segment data=coco-seg.yaml batch=1 device=0|cpu` 复现
</details>
@@ -185,7 +185,7 @@ Ultralytics 提供了 YOLOv8 的交互式笔记本,涵盖训练、验证、跟
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- **mAP<sup>val</sup>** 值是基于单模型单尺度在 [COCO Keypoints val2017](http://cocodataset.org) 数据集上的结果。 <br>通过 `yolo val pose data=coco-pose.yaml device=0` 复现
+- **mAP<sup>val</sup>** 值是基于单模型单尺度在 [COCO Keypoints val2017](https://cocodataset.org) 数据集上的结果。 <br>通过 `yolo val pose data=coco-pose.yaml device=0` 复现
- **速度** 是使用 [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) 实例对 COCO val 图像进行平均计算的。 <br>通过 `yolo val pose data=coco-pose.yaml batch=1 device=0|cpu` 复现
</details>

@@ -41,7 +41,7 @@ Task التعرف على الكائنات هو عبارة عن تعرف على
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- قيم mAP<sup>val</sup> تنطبق على مقياس نموذج واحد-مقياس واحد على مجموعة بيانات [COCO val2017](http://cocodataset.org).
+- قيم mAP<sup>val</sup> تنطبق على مقياس نموذج واحد-مقياس واحد على مجموعة بيانات [COCO val2017](https://cocodataset.org).
<br>اعيد حسابها بواسطة `yolo val detect data=coco.yaml device=0`
- السرعةتمت متوسطة على صور COCO val باستخدام [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)
instance.

@@ -40,7 +40,7 @@ keywords: Ultralytics، YOLO، YOLOv8، تقدير الوضعية ، كشف نق
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- تعتبر القيم **mAP<sup>val</sup>** لنموذج واحد ومقياس واحد فقط على [COCO Keypoints val2017](http://cocodataset.org)
+- تعتبر القيم **mAP<sup>val</sup>** لنموذج واحد ومقياس واحد فقط على [COCO Keypoints val2017](https://cocodataset.org)
مجموعة البيانات.
<br>يمكن إعادة إنتاجه بواسطة `يولو val pose data=coco-pose.yaml device=0`
- يتم حساب **السرعة** من خلال متوسط صور COCO val باستخدام [المروحة الحرارية Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)

@@ -41,7 +41,7 @@ keywords: yolov8 ، فصل الأشكال الفردية ، Ultralytics ، مج
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- تُستخدم قيم **mAP<sup>val</sup>** لنموذج واحد وحجم واحد على مجموعة بيانات [COCO val2017](http://cocodataset.org).
+- تُستخدم قيم **mAP<sup>val</sup>** لنموذج واحد وحجم واحد على مجموعة بيانات [COCO val2017](https://cocodataset.org).
<br>يمكن إعادة إنتاجها باستخدام `yolo val segment data=coco.yaml device=0`
- **تُحسب السرعة** كمتوسط على صور COCO val باستخدام [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)
instance.

@@ -41,7 +41,7 @@ Hier werden die vortrainierten YOLOv8 Detect Modelle gezeigt. Detect, Segment un
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- **mAP<sup>val</sup>** Werte sind für Single-Modell Single-Scale auf dem [COCO val2017](http://cocodataset.org) Datensatz.
+- **mAP<sup>val</sup>** Werte sind für Single-Modell Single-Scale auf dem [COCO val2017](https://cocodataset.org) Datensatz.
<br>Reproduzieren mit `yolo val detect data=coco.yaml device=0`
- **Geschwindigkeit** gemittelt über COCO Val Bilder mit einer [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)-Instanz.
<br>Reproduzieren mit `yolo val detect data=coco128.yaml batch=1 device=0|cpu`

@@ -42,7 +42,7 @@ Hier werden vortrainierte YOLOv8 Pose-Modelle gezeigt. Erkennungs-, Segmentierun
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69,2 | 90,2 | 1607,1 | 3,73 | 69,4 | 263,2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71,6 | 91,2 | 4088,7 | 10,04 | 99,1 | 1066,4 |
-- **mAP<sup>val</sup>** Werte gelten für ein einzelnes Modell mit einfacher Skala auf dem [COCO Keypoints val2017](http://cocodataset.org)-Datensatz.
+- **mAP<sup>val</sup>** Werte gelten für ein einzelnes Modell mit einfacher Skala auf dem [COCO Keypoints val2017](https://cocodataset.org)-Datensatz.
<br>Zu reproduzieren mit `yolo val pose data=coco-pose.yaml device=0`.
- **Geschwindigkeit** gemittelt über COCO-Validierungsbilder mit einer [Amazon EC2 P4d](https://aws.amazon.com/de/ec2/instance-types/p4/)-Instanz.
<br>Zu reproduzieren mit `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`.

@@ -41,7 +41,7 @@ Hier werden vortrainierte YOLOv8 Segment-Modelle gezeigt. Detect-, Segment- und
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- Die **mAP<sup>val</sup>**-Werte sind für ein einzelnes Modell, einzelne Skala auf dem [COCO val2017](http://cocodataset.org)-Datensatz.
+- Die **mAP<sup>val</sup>**-Werte sind für ein einzelnes Modell, einzelne Skala auf dem [COCO val2017](https://cocodataset.org)-Datensatz.
<br>Zum Reproduzieren nutzen Sie `yolo val segment data=coco.yaml device=0`
- Die **Geschwindigkeit** ist über die COCO-Validierungsbilder gemittelt und verwendet eine [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)-Instanz.
<br>Zum Reproduzieren `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`

@@ -21,7 +21,7 @@ The ImageNet dataset is organized using the WordNet hierarchy. Each node in the
## ImageNet Large Scale Visual Recognition Challenge (ILSVRC)
-The annual [ImageNet Large Scale Visual Recognition Challenge (ILSVRC)](http://image-net.org/challenges/LSVRC/) has been an important event in the field of computer vision. It has provided a platform for researchers and developers to evaluate their algorithms and models on a large-scale dataset with standardized evaluation metrics. The ILSVRC has led to significant advancements in the development of deep learning models for image classification, object detection, and other computer vision tasks.
+The annual [ImageNet Large Scale Visual Recognition Challenge (ILSVRC)](https://image-net.org/challenges/LSVRC/) has been an important event in the field of computer vision. It has provided a platform for researchers and developers to evaluate their algorithms and models on a large-scale dataset with standardized evaluation metrics. The ILSVRC has led to significant advancements in the development of deep learning models for image classification, object detection, and other computer vision tasks.
## Applications

@@ -6,7 +6,7 @@ keywords: ImageNette dataset, Ultralytics, YOLO, Image classification, Machine L
# ImageNette Dataset
-The [ImageNette](https://github.com/fastai/imagenette) dataset is a subset of the larger [Imagenet](http://www.image-net.org/) dataset, but it only includes 10 easily distinguishable classes. It was created to provide a quicker, easier-to-use version of Imagenet for software development and education.
+The [ImageNette](https://github.com/fastai/imagenette) dataset is a subset of the larger [Imagenet](https://www.image-net.org/) dataset, but it only includes 10 easily distinguishable classes. It was created to provide a quicker, easier-to-use version of Imagenet for software development and education.
## Key Features

@@ -6,7 +6,7 @@ keywords: Ultralytics, YOLO, Global Wheat Head Dataset, wheat head detection, pl
# Global Wheat Head Dataset
-The [Global Wheat Head Dataset](http://www.global-wheat.com/) is a collection of images designed to support the development of accurate wheat head detection models for applications in wheat phenotyping and crop management. Wheat heads, also known as spikes, are the grain-bearing parts of the wheat plant. Accurate estimation of wheat head density and size is essential for assessing crop health, maturity, and yield potential. The dataset, created by a collaboration of nine research institutes from seven countries, covers multiple growing regions to ensure models generalize well across different environments.
+The [Global Wheat Head Dataset](https://www.global-wheat.com/) is a collection of images designed to support the development of accurate wheat head detection models for applications in wheat phenotyping and crop management. Wheat heads, also known as spikes, are the grain-bearing parts of the wheat plant. Accurate estimation of wheat head density and size is essential for assessing crop health, maturity, and yield potential. The dataset, created by a collaboration of nine research institutes from seven countries, covers multiple growing regions to ensure models generalize well across different environments.
## Key Features
@@ -88,4 +88,4 @@ If you use the Global Wheat Head Dataset in your research or development work, p
}
```
-We would like to acknowledge the researchers and institutions that contributed to the creation and maintenance of the Global Wheat Head Dataset as a valuable resource for the plant phenotyping and crop management research community. For more information about the dataset and its creators, visit the [Global Wheat Head Dataset website](http://www.global-wheat.com/).
+We would like to acknowledge the researchers and institutions that contributed to the creation and maintenance of the Global Wheat Head Dataset as a valuable resource for the plant phenotyping and crop management research community. For more information about the dataset and its creators, visit the [Global Wheat Head Dataset website](https://www.global-wheat.com/).

@@ -1,12 +1,12 @@
User-agent: *
-Sitemap: http://docs.ultralytics.com/sitemap.xml
-Sitemap: http://docs.ultralytics.com/ar/sitemap.xml
-Sitemap: http://docs.ultralytics.com/de/sitemap.xml
-Sitemap: http://docs.ultralytics.com/es/sitemap.xml
-Sitemap: http://docs.ultralytics.com/fr/sitemap.xml
-Sitemap: http://docs.ultralytics.com/hi/sitemap.xml
-Sitemap: http://docs.ultralytics.com/ja/sitemap.xml
-Sitemap: http://docs.ultralytics.com/ko/sitemap.xml
-Sitemap: http://docs.ultralytics.com/pt/sitemap.xml
-Sitemap: http://docs.ultralytics.com/ru/sitemap.xml
-Sitemap: http://docs.ultralytics.com/zh/sitemap.xml
+Sitemap: https://docs.ultralytics.com/sitemap.xml
+Sitemap: https://docs.ultralytics.com/ar/sitemap.xml
+Sitemap: https://docs.ultralytics.com/de/sitemap.xml
+Sitemap: https://docs.ultralytics.com/es/sitemap.xml
+Sitemap: https://docs.ultralytics.com/fr/sitemap.xml
+Sitemap: https://docs.ultralytics.com/hi/sitemap.xml
+Sitemap: https://docs.ultralytics.com/ja/sitemap.xml
+Sitemap: https://docs.ultralytics.com/ko/sitemap.xml
+Sitemap: https://docs.ultralytics.com/pt/sitemap.xml
+Sitemap: https://docs.ultralytics.com/ru/sitemap.xml
+Sitemap: https://docs.ultralytics.com/zh/sitemap.xml
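The robots.txt hunk above rewrites every `Sitemap:` directive to HTTPS. A minimal sanity check for that property can be sketched in plain Python (an illustrative helper, not part of the repo):

```python
def https_only_sitemaps(robots_txt: str) -> bool:
    """Return True if every Sitemap: directive in a robots.txt body uses https."""
    for line in robots_txt.splitlines():
        if line.lower().startswith("sitemap:"):
            # Everything after the first colon is the URL.
            url = line.split(":", 1)[1].strip()
            if not url.startswith("https://"):
                return False
    return True
```

Running this against the post-commit docs/en/robots.txt should return True, since all eleven sitemap URLs now use HTTPS.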

@@ -41,7 +41,7 @@ YOLOv8 pretrained Detect models are shown here. Detect, Segment and Pose models
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset. <br>Reproduce by `yolo val detect data=coco.yaml device=0`
+- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset. <br>Reproduce by `yolo val detect data=coco.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. <br>Reproduce by `yolo val detect data=coco128.yaml batch=1 device=0|cpu`
## Train

@@ -42,7 +42,7 @@ YOLOv8 pretrained Pose models are shown here. Detect, Segment and Pose models ar
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO Keypoints val2017](http://cocodataset.org) dataset. <br>Reproduce by `yolo val pose data=coco-pose.yaml device=0`
+- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO Keypoints val2017](https://cocodataset.org) dataset. <br>Reproduce by `yolo val pose data=coco-pose.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. <br>Reproduce by `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`
## Train

@@ -41,7 +41,7 @@ YOLOv8 pretrained Segment models are shown here. Detect, Segment and Pose models
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](http://cocodataset.org) dataset. <br>Reproduce by `yolo val segment data=coco.yaml device=0`
+- **mAP<sup>val</sup>** values are for single-model single-scale on [COCO val2017](https://cocodataset.org) dataset. <br>Reproduce by `yolo val segment data=coco.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance. <br>Reproduce by `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`
## Train

@@ -60,6 +60,6 @@ Before modifying anything, **first train with default settings to establish a pe
## Further Reading
-If you'd like to know more, a good place to start is Karpathy's 'Recipe for Training Neural Networks', which has great ideas for training that apply broadly across all ML domains: [http://karpathy.github.io/2019/04/25/recipe/](http://karpathy.github.io/2019/04/25/recipe/)
+If you'd like to know more, a good place to start is Karpathy's 'Recipe for Training Neural Networks', which has great ideas for training that apply broadly across all ML domains: [https://karpathy.github.io/2019/04/25/recipe/](https://karpathy.github.io/2019/04/25/recipe/)
Good luck 🍀 and let us know if you have any other questions!

@@ -77,7 +77,7 @@ Export in `YOLOv5 Pytorch` format, then copy the snippet into your training scri
### 2.1 Create `dataset.yaml`
-[COCO128](https://www.kaggle.com/ultralytics/coco128) is an example small tutorial dataset composed of the first 128 images in [COCO](http://cocodataset.org/#home) train2017. These same 128 images are used for both training and validation to verify our training pipeline is capable of overfitting. [data/coco128.yaml](https://github.com/ultralytics/yolov5/blob/master/data/coco128.yaml), shown below, is the dataset config file that defines 1) the dataset root directory `path` and relative paths to `train` / `val` / `test` image directories (or *.txt files with image paths) and 2) a class `names` dictionary:
+[COCO128](https://www.kaggle.com/ultralytics/coco128) is an example small tutorial dataset composed of the first 128 images in [COCO](https://cocodataset.org/) train2017. These same 128 images are used for both training and validation to verify our training pipeline is capable of overfitting. [data/coco128.yaml](https://github.com/ultralytics/yolov5/blob/master/data/coco128.yaml), shown below, is the dataset config file that defines 1) the dataset root directory `path` and relative paths to `train` / `val` / `test` image directories (or *.txt files with image paths) and 2) a class `names` dictionary:
```yaml
# Train/val/test sets as 1) dir: path/to/imgs, 2) file: path/to/imgs.txt, or 3) list: [path/to/imgs1, path/to/imgs2, ..]

@@ -41,7 +41,7 @@ Los [modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/c
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- Los valores de **mAP<sup>val</sup>** son para un solo modelo a una sola escala en el conjunto de datos [COCO val2017](http://cocodataset.org).
+- Los valores de **mAP<sup>val</sup>** son para un solo modelo a una sola escala en el conjunto de datos [COCO val2017](https://cocodataset.org).
<br>Reproduce utilizando `yolo val detect data=coco.yaml device=0`
- La **Velocidad** es el promedio sobre las imágenes de COCO val utilizando una instancia [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
<br>Reproduce utilizando `yolo val detect data=coco128.yaml batch=1 device=0|cpu`

@@ -42,7 +42,7 @@ Los [modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/c
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- Los valores de **mAP<sup>val</sup>** son para un solo modelo a una sola escala en el conjunto de datos [COCO Keypoints val2017](http://cocodataset.org).
+- Los valores de **mAP<sup>val</sup>** son para un solo modelo a una sola escala en el conjunto de datos [COCO Keypoints val2017](https://cocodataset.org).
<br>Reproducir con `yolo val pose data=coco-pose.yaml device=0`
- **Velocidad** promediada sobre imágenes COCO val usando una instancia [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
<br>Reproducir con `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ Los [Modelos](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/c
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- Los valores **mAP<sup>val</sup>** son para un único modelo a una única escala en el conjunto de datos [COCO val2017](http://cocodataset.org).
+- Los valores **mAP<sup>val</sup>** son para un único modelo a una única escala en el conjunto de datos [COCO val2017](https://cocodataset.org).
<br>Reproducir utilizando `yolo val segment data=coco.yaml device=0`
- La **Velocidad** promediada sobre imágenes de COCO val utilizando una instancia de [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
<br>Reproducir utilizando `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ Les modèles pré-entraînés Detect YOLOv8 sont présentés ici. Les modèles D
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- Les valeurs de **mAP<sup>val</sup>** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](http://cocodataset.org).
+- Les valeurs de **mAP<sup>val</sup>** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](https://cocodataset.org).
<br>Reproductible avec `yolo val detect data=coco.yaml device=0`
- La **Vitesse** est moyennée sur les images COCO val en utilisant une instance [Amazon EC2 P4d](https://aws.amazon.com/fr/ec2/instance-types/p4/).
<br>Reproductible avec `yolo val detect data=coco128.yaml batch=1 device=0|cpu`

@@ -33,7 +33,7 @@ Les [Modèles](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- Les valeurs de **mAP<sup>val</sup>** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO Keypoints val2017](http://cocodataset.org).
+- Les valeurs de **mAP<sup>val</sup>** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO Keypoints val2017](https://cocodataset.org).
<br>Reproduire avec `yolo val pose data=coco-pose.yaml device=0`
- La **vitesse** moyenne sur les images de validation COCO en utilisant une instance [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
<br>Reproduire avec `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ Les [modèles](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- Les valeurs **mAP<sup>val</sup>** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](http://cocodataset.org).
+- Les valeurs **mAP<sup>val</sup>** sont pour un seul modèle à une seule échelle sur le jeu de données [COCO val2017](https://cocodataset.org).
<br>Pour reproduire, utilisez `yolo val segment data=coco.yaml device=0`
- **Vitesse** moyennée sur les images COCO val en utilisant une instance [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/).
<br>Pour reproduire, utilisez `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`

@@ -42,7 +42,7 @@ YOLOv8 पव परशिित Detect मडल यह
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- **mAP<sup>val</sup>**न क [COCO val2017](http://cocodataset.org) डट पर सिगल-मल सिगल-सल किए ह
+- **mAP<sup>val</sup>**न क [COCO val2017](https://cocodataset.org) डट पर सिगल-मल सिगल-सल किए ह
<br>`yolo` द उतपनन कर `किस कर yolo val data=coco.yaml device=0`
- **Speed** [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)
स क उपयग करक COCO val छवि पर औसत लि

@@ -42,7 +42,7 @@ YOLOv8 पित पज मडलस यह
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
-- **mAP<sup>val</sup>**न एकल मडल एकल सल पर [COCO कट val2017](http://cocodataset.org) डट पर ह
+- **mAP<sup>val</sup>**न एकल मडल एकल सल पर [COCO कट val2017](https://cocodataset.org) डट पर ह
<br>`yolo val pose data=coco-pose.yaml device=0` कनरित कर
- **Speed** [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) इनस क उपयग करतए COCO val छवि पर औसतित गणन
<br>`yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu` कनरचन कर

@@ -39,7 +39,7 @@ YOLOv8 पव परशिित Segment मडल य
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
-- **mAP<sup>val</sup>**न एकल मडल एकल सल किए [COCO val2017](http://cocodataset.org) डट पर ह
+- **mAP<sup>val</sup>**न एकल मडल एकल सल किए [COCO val2017](https://cocodataset.org) डट पर ह
<br>`yolo val segment data=coco.yaml device=0` कनरित किए ज
- **सड** एक [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) इस क उपयग करतए COCO val छविच औसतन।
<br>`yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu` कनरित किए ज सकत

@@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, 物体検出, 事前訓練済みモデル, トレ
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
-- **mAP<sup>val</sup>** の値は[COCO val2017](http://cocodataset.org)データセットにおいて、単一モデル単一スケールでのものです。
+- **mAP<sup>val</sup>** の値は[COCO val2017](https://cocodataset.org)データセットにおいて、単一モデル単一スケールでのものです。
<br>再現方法: `yolo val detect data=coco.yaml device=0`
- **速度** は[Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/)インスタンスを使用してCOCO val画像に対して平均化されたものです。
<br>再現方法: `yolo val detect data=coco128.yaml batch=1 device=0|cpu`

@@ -42,7 +42,7 @@ Pretrained YOLOv8 Pose models are shown here. Detect
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](https://cocodataset.org) dataset.
<br>Reproduce with: `yolo val pose data=coco-pose.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with: `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ keywords: yolov8, instance segmentation, Ultralytics, COCO
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with: `yolo val segment data=coco.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with: `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, object detection, pretrained models, training,
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with the [COCO](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco.yaml) data and `yolo val detect data=coco.yaml device=0`.
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with the [COCO128](https://github.com/ultralytics/ultralytics/blob/main/ultralytics/cfg/datasets/coco128.yaml) data and `yolo val detect data=coco128.yaml batch=1 device=0|cpu`.

@@ -42,7 +42,7 @@ keywords: Ultralytics, YOLO, YOLOv8, pose estimation, keypoint detection, object
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val pose data=coco-pose.yaml device=0`.
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`.

@@ -41,7 +41,7 @@ keywords: yolov8, instance segmentation, Ultralytics, COCO dataset
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val segment data=coco.yaml device=0`.
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`.

@@ -41,7 +41,7 @@ The [Models](https://github.com/ultralytics/ultralytics/tree/main/ultralytics/cf
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val detect data=coco.yaml device=0`
- **Speed** averaged over COCO val images on an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.

@@ -42,7 +42,7 @@ Pretrained YOLOv8 Pose models are shown here. Detect, Segment
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val pose data=coco-pose.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.

@@ -41,7 +41,7 @@ Pretrained YOLOv8 Segment models are shown here. Detect
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val segment data=coco.yaml device=0`
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, object detection, pretrained
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val detect data=coco.yaml device=0`
- **Speed** averaged over COCO val images on an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with `yolo val detect data=coco128.yaml batch=1 device=0|cpu`

@@ -32,7 +32,7 @@ description: Learn how to use Ultralytics YOLOv8
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](https://cocodataset.org) dataset.
<br>Reproduce with: `yolo val pose data=coco-pose.yaml device=0`
- **Speed** averaged over COCO val images on an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with: `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`

@@ -41,7 +41,7 @@ keywords: yolov8, object segmentation, Ultralytics, dataset
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val segment data=coco.yaml device=0`
- **Speed** averaged over COCO val images on an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.

@@ -41,7 +41,7 @@ keywords: YOLOv8, Ultralytics, object detection, pretrained models, training, validation,
| [YOLOv8l](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l.pt) | 640 | 52.9 | 375.2 | 2.39 | 43.7 | 165.2 |
| [YOLOv8x](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x.pt) | 640 | 53.9 | 479.1 | 3.53 | 68.2 | 257.8 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val detect data=coco.yaml device=0`.
- **Speed** averaged over COCO val images using an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) cloud instance.
<br>Reproduce with `yolo val detect data=coco128.yaml batch=1 device=0|cpu`.

@@ -42,7 +42,7 @@ keywords: Ultralytics, YOLO, YOLOv8, pose estimation, keypoint detection, object detection
| [YOLOv8x-pose](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose.pt) | 640 | 69.2 | 90.2 | 1607.1 | 3.73 | 69.4 | 263.2 |
| [YOLOv8x-pose-p6](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-pose-p6.pt) | 1280 | 71.6 | 91.2 | 4088.7 | 10.04 | 99.1 | 1066.4 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO Keypoints val2017](https://cocodataset.org) dataset.
<br>Reproduce by running `yolo val pose data=coco-pose.yaml device=0`.
- **Speed** averaged over COCO val images on an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce by running `yolo val pose data=coco8-pose.yaml batch=1 device=0|cpu`.

@@ -41,7 +41,7 @@ keywords: yolov8, instance segmentation, Ultralytics, COCO dataset, image segmentation, object
| [YOLOv8l-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8l-seg.pt) | 640 | 52.3 | 42.6 | 572.4 | 2.79 | 46.0 | 220.5 |
| [YOLOv8x-seg](https://github.com/ultralytics/assets/releases/download/v8.1.0/yolov8x-seg.pt) | 640 | 53.4 | 43.4 | 712.1 | 4.02 | 71.8 | 344.1 |
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](http://cocodataset.org) dataset.
- **mAP<sup>val</sup>** values are for single-model single-scale on the [COCO val2017](https://cocodataset.org) dataset.
<br>Reproduce with `yolo val segment data=coco.yaml device=0`.
- **Speed** averaged over COCO val images on an [Amazon EC2 P4d](https://aws.amazon.com/ec2/instance-types/p4/) instance.
<br>Reproduce with `yolo val segment data=coco128-seg.yaml batch=1 device=0|cpu`.

@@ -520,7 +520,7 @@ def test_hub():
export_fmts_hub()
logout()
smart_request('GET', 'http://github.com', progress=True)
smart_request('GET', 'https://github.com', progress=True)
@pytest.fixture
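The `smart_request` change above is one instance of the commit's sweep from `http://` to `https://`. A minimal sketch of how such a sweep could be automated (a hypothetical helper, not part of this repo):

```python
import re

def upgrade_http(text: str) -> str:
    """Rewrite plain http:// URLs to https:// (hypothetical helper, not from this repo)."""
    # \b keeps already-secure https:// URLs untouched, since "https" does not
    # contain a standalone "http" token followed by "://".
    return re.sub(r"\bhttp://", "https://", text)

print(upgrade_http("smart_request('GET', 'http://github.com', progress=True)"))
# → smart_request('GET', 'https://github.com', progress=True)
```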

@@ -1,5 +1,5 @@
# Ultralytics YOLO 🚀, AGPL-3.0 license
# Argoverse-HD dataset (ring-front-center camera) http://www.cs.cmu.edu/~mengtial/proj/streaming/ by Argo AI
# Argoverse-HD dataset (ring-front-center camera) https://www.cs.cmu.edu/~mengtial/proj/streaming/ by Argo AI
# Documentation: https://docs.ultralytics.com/datasets/detect/argoverse/
# Example usage: yolo train data=Argoverse.yaml
# parent

@@ -1,5 +1,5 @@
# Ultralytics YOLO 🚀, AGPL-3.0 license
# Global Wheat 2020 dataset http://www.global-wheat.com/ by University of Saskatchewan
# Global Wheat 2020 dataset https://www.global-wheat.com/ by University of Saskatchewan
# Documentation: https://docs.ultralytics.com/datasets/detect/globalwheat2020/
# Example usage: yolo train data=GlobalWheat2020.yaml
# parent

@@ -1,5 +1,5 @@
# Ultralytics YOLO 🚀, AGPL-3.0 license
# COCO 2017 dataset http://cocodataset.org by Microsoft
# COCO 2017 dataset https://cocodataset.org by Microsoft
# Documentation: https://docs.ultralytics.com/datasets/pose/coco/
# Example usage: yolo train data=coco-pose.yaml
# parent
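The dataset YAMLs touched in this commit (Argoverse, GlobalWheat2020, COCO) share a common Ultralytics shape: a header comment with the dataset URL, then paths and class names. A minimal hypothetical sketch (the path and class list below are placeholders, not taken from any of these files):

```yaml
# Hypothetical minimal dataset YAML in the Ultralytics format (illustrative only)
path: ../datasets/my-dataset  # dataset root directory (placeholder)
train: images/train           # train images, relative to path
val: images/val               # val images, relative to path

names:
  0: person
  1: bicycle
```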

@@ -1,5 +1,5 @@
# Ultralytics YOLO 🚀, AGPL-3.0 license
# COCO 2017 dataset http://cocodataset.org by Microsoft
# COCO 2017 dataset https://cocodataset.org by Microsoft
# Documentation: https://docs.ultralytics.com/datasets/detect/coco/
# Example usage: yolo train data=coco.yaml
# parent

@@ -1,6 +1,6 @@
#!/bin/bash
# Ultralytics YOLO 🚀, AGPL-3.0 license
# Download COCO 2017 dataset http://cocodataset.org
# Download COCO 2017 dataset https://cocodataset.org
# Example usage: bash data/scripts/get_coco.sh
# parent
# ├── ultralytics
