## Quick Hardware Setup of reCamera
Please follow the [reCamera Quick Start Guide](https://wiki.seeedstudio.com/recamera_getting_started) for initial onboarding of the device, such as connecting it to a WiFi network and accessing the [Node-RED](https://nodered.org) web UI for a quick preview of detection results.
## Inference Using Pre-installed YOLO11 Models
reCamera comes pre-installed with four Ultralytics YOLO11 models, and you can simply choose your desired model within the Node-RED dashboard.
Step 1: If you have connected reCamera to a network, enter the IP address of the reCamera in a web browser to open the Node-RED dashboard. If you have connected the reCamera to a PC via USB, you can enter `192.168.42.1` instead. Here you will see that the YOLO11n detection model is loaded by default.
## Export to cvimodel: Converting Your YOLO11 Model
If you want to use a [custom-trained YOLO11 model](../modes/train.md) with reCamera, please follow the steps below.
Here we will first convert the `PyTorch` model to `ONNX`, then convert it to the `MLIR` format. Finally, the `MLIR` model will be converted to `cvimodel` for on-device inference.
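For the first step of this pipeline, the `PyTorch` checkpoint can be exported to `ONNX` with the Ultralytics Python API. The snippet below is a minimal sketch: the model path `yolo11n_custom.pt` is illustrative, and the `opset` and `imgsz` values shown here are common defaults that should be matched to the requirements of the conversion toolchain used for the later `MLIR` and `cvimodel` steps.

```python
from ultralytics import YOLO

# Load your custom-trained YOLO11 model (path is illustrative)
model = YOLO("yolo11n_custom.pt")

# Export to ONNX; adjust opset and image size to match the
# expectations of the MLIR/cvimodel conversion toolchain
model.export(format="onnx", opset=14, imgsz=640)
```

The exported `.onnx` file is then used as the input to the subsequent `ONNX` to `MLIR` and `MLIR` to `cvimodel` conversion steps described below.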
<p align="center">