mirror of https://github.com/opencv/opencv.git
1 commit (7bcb51eded04cb0acb10e7eb92eb0f2c3a0329ca)
Author | SHA1 | Message | Date
---|---|---|---
Dmitry Matveev | a110ede0a2 | Merge pull request #18716 from dmatveev:dm/upstream_onnx. G-API: Introduce ONNX backend for Inference: basic operations are implemented (Infer, -ROI, -List, -List2); automatic preprocessing for ONNX models is implemented; the test suite is extended with the `OPENCV_GAPI_ONNX_MODEL_PATH` env var for test data (test data is an ONNX Model Zoo repo snapshot); kernel lookup logic in core G-API is fixed (NN kernels are looked up not in the default package but in the associated backend's aux package, so two NN backends can work in the same graph); an Infer SSD demo and a combined ONNX/IE demo are added. G-API/ONNX: fix some of the CMake issues. Co-authored-by: Pashchenkov, Maxim <maxim.pashchenkov@intel.com> | 4 years ago
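
The merge above introduces G-API's ONNX inference backend. The following is a minimal sketch of how such a backend is typically wired into a G-API graph; the network tag `SampleNet`, the `model.onnx` path, and the input dimensions are illustrative placeholders, not part of the commit.

```cpp
// Sketch: running a single-input, single-output ONNX model through G-API.
// The model path and network tag below are hypothetical.
#include <opencv2/core.hpp>
#include <opencv2/gapi.hpp>
#include <opencv2/gapi/infer.hpp>
#include <opencv2/gapi/infer/onnx.hpp>

// Declare the network interface: one GMat in, one GMat out.
G_API_NET(SampleNet, <cv::GMat(cv::GMat)>, "sample-net");

int main() {
    // Build a graph that runs the network on its input.
    cv::GMat in;
    cv::GMat out = cv::gapi::infer<SampleNet>(in);
    cv::GComputation graph(cv::GIn(in), cv::GOut(out));

    // Bind the network tag to an ONNX model file via the ONNX backend params.
    auto net = cv::gapi::onnx::Params<SampleNet>{"model.onnx"};

    // Run the graph; per the commit message, the ONNX backend applies
    // automatic preprocessing to adapt the input to the model.
    cv::Mat input = cv::Mat::zeros(720, 1280, CV_8UC3), output;
    graph.apply(cv::gin(input), cv::gout(output),
                cv::compile_args(cv::gapi::networks(net)));
    return 0;
}
```

The kernel-lookup fix mentioned in the message is what lets a graph like this mix the ONNX backend with another NN backend (e.g. IE), since each `infer<>` call resolves its kernels in its own backend's aux package rather than in a shared default package.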