---
comments: true
description: Learn how to validate your YOLO11 model with precise metrics, easy-to-use tools, and custom settings for optimal performance.
keywords: Ultralytics, YOLO11, model validation, machine learning, object detection, mAP metrics, Python API, CLI
---
# Model Validation with Ultralytics YOLO
## Introduction
Validation is a critical step in the machine learning pipeline, allowing you to assess the quality of your trained models. Val mode in Ultralytics YOLO11 provides a robust suite of tools and metrics for evaluating the performance of your object detection models. This guide serves as a complete resource for understanding how to effectively use the Val mode to ensure that your models are both accurate and reliable.
**Watch:** Ultralytics Modes Tutorial: Validation
## Why Validate with Ultralytics YOLO?
Here's why using YOLO11's Val mode is advantageous:
- Precision: Get accurate metrics like mAP50, mAP75, and mAP50-95 to comprehensively evaluate your model.
- Convenience: Utilize built-in features that remember training settings, simplifying the validation process.
- Flexibility: Validate your model with the same or different datasets and image sizes.
- Hyperparameter Tuning: Use validation metrics to fine-tune your model for better performance (see the sketch after this list).
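As a minimal sketch of that last point, the snippet below compares validation mAP across a few NMS IoU thresholds. The specific thresholds and the use of `coco8.yaml` are illustrative assumptions rather than a prescribed workflow.

```python
from ultralytics import YOLO

# Illustrative sweep: validate the same model at several NMS IoU thresholds
# and compare mAP50-95 to pick a setting (dataset and thresholds are examples).
model = YOLO("yolo11n.pt")

for iou in (0.5, 0.6, 0.7):
    metrics = model.val(data="coco8.yaml", iou=iou)
    print(f"iou={iou}: mAP50-95={metrics.box.map:.3f}")
```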
## Key Features of Val Mode
These are the notable functionalities offered by YOLO11's Val mode:
- Automated Settings: Models remember their training configurations for straightforward validation.
- Multi-Metric Support: Evaluate your model based on a range of accuracy metrics.
- CLI and Python API: Choose from command-line interface or Python API based on your preference for validation.
- Data Compatibility: Works seamlessly with datasets used during the training phase as well as custom datasets.
!!! tip

    YOLO11 models automatically remember their training settings, so you can validate a model at the same image size and on the original dataset easily with just `yolo val model=yolo11n.pt` or `YOLO("yolo11n.pt").val()`.
## Usage Examples
Validate trained YOLO11n model accuracy on the COCO8 dataset. No arguments are needed, as the model retains its training data and arguments as model attributes. See the Arguments section below for a full list of validation arguments.
!!! example

    === "Python"

        ```python
        from ultralytics import YOLO

        # Load a model
        model = YOLO("yolo11n.pt")  # load an official model
        model = YOLO("path/to/best.pt")  # load a custom model

        # Validate the model
        metrics = model.val()  # no arguments needed, dataset and settings remembered
        metrics.box.map  # mAP50-95
        metrics.box.map50  # mAP50
        metrics.box.map75  # mAP75
        metrics.box.maps  # a list containing mAP50-95 for each category
        ```

    === "CLI"

        ```bash
        yolo detect val model=yolo11n.pt  # val official model
        yolo detect val model=path/to/best.pt  # val custom model
        ```
## Arguments for YOLO Model Validation
When validating YOLO models, several arguments can be fine-tuned to optimize the evaluation process. These arguments control aspects such as input image size, batch processing, and performance thresholds. Below is a detailed breakdown of each argument to help you customize your validation settings effectively.
{% include "macros/validation-args.md" %}
Each of these settings plays a vital role in the validation process, allowing for a customizable and efficient evaluation of YOLO models. Adjusting these parameters according to your specific needs and resources can help achieve the best balance between accuracy and performance.
## Example Validation with Arguments
The examples below showcase YOLO model validation with custom arguments in Python and the CLI.
!!! example

    === "Python"

        ```python
        from ultralytics import YOLO

        # Load a model
        model = YOLO("yolo11n.pt")

        # Customize validation settings
        validation_results = model.val(data="coco8.yaml", imgsz=640, batch=16, conf=0.25, iou=0.6, device="0")
        ```

    === "CLI"

        ```bash
        yolo val model=yolo11n.pt data=coco8.yaml imgsz=640 batch=16 conf=0.25 iou=0.6 device=0
        ```
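If you want to dig into the object returned by `model.val()`, here is a short sketch. It relies on the `metrics.box.map*` attributes already shown above and pairs the per-class values with `model.names`, which is assumed to map class indices to class names.

```python
from ultralytics import YOLO

# Sketch: inspect per-class results after a custom validation run
# (assumes model.names maps class indices to class names).
model = YOLO("yolo11n.pt")
metrics = model.val(data="coco8.yaml", imgsz=640, conf=0.25, iou=0.6)

print(f"mAP50-95: {metrics.box.map:.3f}")
for name, class_map in zip(model.names.values(), metrics.box.maps):
    print(f"{name}: {class_map:.3f}")
```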
## FAQ
### How do I validate my YOLO11 model with Ultralytics?
To validate your YOLO11 model, you can use the Val mode provided by Ultralytics. For example, using the Python API, you can load a model and run validation with:
```python
from ultralytics import YOLO

# Load a model
model = YOLO("yolo11n.pt")

# Validate the model
metrics = model.val()
print(metrics.box.map)  # mAP50-95
```
Alternatively, you can use the command-line interface (CLI):
```bash
yolo val model=yolo11n.pt
```
For further customization, you can adjust various arguments like `imgsz`, `batch`, and `conf` in both Python and CLI modes. Check the Arguments for YOLO Model Validation section for the full list of parameters.
### What metrics can I get from YOLO11 model validation?
YOLO11 model validation provides several key metrics to assess model performance. These include:
- mAP50 (mean Average Precision at IoU threshold 0.5)
- mAP75 (mean Average Precision at IoU threshold 0.75)
- mAP50-95 (mean Average Precision across multiple IoU thresholds from 0.5 to 0.95; see the formula below)
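These metrics relate in the standard COCO-style way: mAP at a single IoU threshold averages per-class Average Precision, and mAP50-95 averages that quantity over ten thresholds from 0.50 to 0.95 in steps of 0.05. A compact statement of this (general background, not specific to this page) is:

$$
\text{mAP}_t = \frac{1}{N_c} \sum_{c=1}^{N_c} \text{AP}_c(t), \qquad
\text{mAP}_{50\text{-}95} = \frac{1}{10} \sum_{t \in \{0.50,\,0.55,\,\dots,\,0.95\}} \text{mAP}_t
$$

where $N_c$ is the number of classes and $\text{AP}_c(t)$ is the Average Precision of class $c$ at IoU threshold $t$.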
Using the Python API, you can access these metrics as follows:
```python
metrics = model.val()  # assumes `model` has been loaded
print(metrics.box.map)  # mAP50-95
print(metrics.box.map50)  # mAP50
print(metrics.box.map75)  # mAP75
print(metrics.box.maps)  # list of mAP50-95 for each category
```
For a complete performance evaluation, it's crucial to review all these metrics. For more details, refer to the Key Features of Val Mode section.
### What are the advantages of using Ultralytics YOLO for validation?
Using Ultralytics YOLO for validation provides several advantages:
- Precision: YOLO11 offers accurate performance metrics including mAP50, mAP75, and mAP50-95.
- Convenience: The models remember their training settings, making validation straightforward.
- Flexibility: You can validate against the same or different datasets and image sizes.
- Hyperparameter Tuning: Validation metrics help in fine-tuning models for better performance.
These benefits ensure that your models are evaluated thoroughly and can be optimized for superior results. Learn more about these advantages in the Why Validate with Ultralytics YOLO section.
### Can I validate my YOLO11 model using a custom dataset?
Yes, you can validate your YOLO11 model using a custom dataset. Specify the `data` argument with the path to your dataset configuration file. This file should include paths to the validation data, class names, and other relevant details.
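A minimal sketch of such a configuration file is shown below; the directory layout and class names are placeholders, so adapt them to your own dataset.

```yaml
# Hypothetical custom_dataset.yaml (paths and class names are placeholders)
path: /datasets/my_dataset  # dataset root directory
train: images/train         # training images, relative to path
val: images/val             # validation images, relative to path

names:
  0: cat
  1: dog
```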
Example in Python:
```python
from ultralytics import YOLO

# Load a model
model = YOLO("yolo11n.pt")

# Validate with a custom dataset
metrics = model.val(data="path/to/your/custom_dataset.yaml")
print(metrics.box.map)  # mAP50-95
```
Example using CLI:
```bash
yolo val model=yolo11n.pt data=path/to/your/custom_dataset.yaml
```
For more customizable options during validation, see the Example Validation with Arguments section.
### How do I save validation results to a JSON file in YOLO11?
To save the validation results to a JSON file, you can set the `save_json` argument to `True` when running validation. This can be done in both the Python API and CLI.
Example in Python:
```python
from ultralytics import YOLO

# Load a model
model = YOLO("yolo11n.pt")

# Save validation results to JSON
metrics = model.val(save_json=True)
```
Example using CLI:
```bash
yolo val model=yolo11n.pt save_json=True
```
This functionality is particularly useful for further analysis or integration with other tools. Check the Arguments for YOLO Model Validation for more details.
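As a brief illustration of that follow-up analysis, the snippet below loads a saved results file with Python's standard `json` module. The exact filename and output directory depend on your run, so the path shown here is a placeholder.

```python
import json
from pathlib import Path

# Placeholder path: point this at the JSON file written by your validation run.
results_path = Path("runs/detect/val/predictions.json")

with results_path.open() as f:
    predictions = json.load(f)

# Quick sanity check on what was saved.
print(f"Loaded {len(predictions)} prediction records")
print(predictions[0] if predictions else "No predictions found")
```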