Merge pull request #24715 from AleksandrPanov:update_android_opencl_sample

Update Android OpenCL sample #24715

Update Android OpenCL sample and tutorial text.

### Pull Request Readiness Checklist

See details at https://github.com/opencv/opencv/wiki/How_to_contribute#making-a-good-pull-request

- [x] I agree to contribute to the project under Apache 2 License.
- [x] To the best of my knowledge, the proposed patch is not based on a code under GPL or another license that is incompatible with OpenCV
- [x] The PR is proposed to the proper branch
- [x] There is a reference to the original bug report and related work
- [ ] There is accuracy test, performance test and test data in opencv_extra repository, if applicable
      Patch to opencv_extra has the same branch name.
- [ ] The feature is well documented and sample code can be built with the project CMake
Alexander Panov 11 months ago committed by GitHub
parent 3d9cb5329c
commit 9434c89ba0
  1. doc/tutorials/introduction/android_binary_package/android_dnn_intro.markdown (4 changed lines)
  2. doc/tutorials/introduction/android_binary_package/android_ocl_intro.markdown (249 changed lines)
  3. samples/android/tutorial-4-opencl/build.gradle.in (6 changed lines)
  4. samples/android/tutorial-4-opencl/jni/CLprocessor.cpp (12 changed lines)
  5. samples/android/tutorial-4-opencl/jni/CMakeLists.txt (15 changed lines)
  6. samples/android/tutorial-4-opencl/src/org/opencv/samples/tutorial4/MyGLSurfaceView.java (2 changed lines)
  7. samples/android/tutorial-4-opencl/src/org/opencv/samples/tutorial4/NativePart.java (2 changed lines)

@ -15,14 +15,14 @@
## Introduction
In this tutorial you'll learn how to run deep learning networks on an Android device
using the OpenCV deep learning module.
Tutorial was written for Android Studio Android Studio 2022.2.1.
Tutorial was written for Android Studio 2022.2.1.
## Requirements
- Download and install Android Studio from https://developer.android.com/studio.
- Get the latest pre-built OpenCV for Android release from https://github.com/opencv/opencv/releases
and unpack it (for example, `opencv-4.X.Y-android-sdk.zip`).
and unpack it (for example, `opencv-4.X.Y-android-sdk.zip`, minimum version 4.9 is required).
- Download MobileNet object detection model from https://github.com/chuanqi305/MobileNet-SSD.
Configuration file `MobileNetSSD_deploy.prototxt` and model weights `MobileNetSSD_deploy.caffemodel`

@ -6,21 +6,19 @@ Use OpenCL in Android camera preview based CV application {#tutorial_android_ocl
| | |
| -: | :- |
| Original author | Andrey Pavlenko |
| Compatibility | OpenCV >= 3.0 |
@warning
This tutorial is deprecated.
| Original author | Andrey Pavlenko, Alexander Panov |
| Compatibility | OpenCV >= 4.9 |
This guide was designed to help you use [OpenCL ™](https://www.khronos.org/opencl/) in an Android camera preview based CV application.
It was written for [Eclipse-based ADT tools](http://developer.android.com/tools/help/adt.html)
(deprecated by Google now), but it easily can be reproduced with [Android Studio](http://developer.android.com/tools/studio/index.html).
Tutorial was written for [Android Studio](http://developer.android.com/tools/studio/index.html) 2022.2.1. It was tested with Ubuntu 22.04.
This tutorial assumes you have the following installed and configured:
- JDK
- Android SDK and NDK
- Eclipse IDE with ADT and CDT plugins
- Android Studio (2022.2.1+)
- JDK 17
- Android SDK
- Android NDK (25.2.9519653+)
- OpenCV source code downloaded from [github](git@github.com:opencv/opencv.git) or from [releases](https://opencv.org/releases/) and built following the [instruction on the wiki](https://github.com/opencv/opencv/wiki/Custom-OpenCV-Android-SDK-and-AAR-package-build).
It also assumes that you are familiar with Android Java and JNI programming basics.
If you need help with anything of the above, you may refer to our @ref tutorial_android_dev_intro guide.
@ -30,6 +28,56 @@ This tutorial also assumes you have an Android operated device with OpenCL enabl
The related source code is located within OpenCV samples at
[opencv/samples/android/tutorial-4-opencl](https://github.com/opencv/opencv/tree/4.x/samples/android/tutorial-4-opencl/) directory.
How to build custom OpenCV Android SDK with OpenCL
--------------------------------------------------
1. __Assemble and configure Android OpenCL SDK.__
The JNI part of the sample depends on the standard Khronos OpenCL headers, the OpenCL C++ wrapper and libOpenCL.so.
The standard OpenCL headers may be copied from the 3rdparty directory in the OpenCV repository or from your Linux distribution package.
The C++ wrapper is available in the [official Khronos repository on GitHub](https://github.com/KhronosGroup/OpenCL-CLHPP).
Copy the header files to a dedicated directory in the following way:
@code{.bash}
cd your_path/ && mkdir ANDROID_OPENCL_SDK && mkdir ANDROID_OPENCL_SDK/include && cd ANDROID_OPENCL_SDK/include
cp -r path_to_opencv/opencv/3rdparty/include/opencl/1.2/CL . && cd CL
wget https://github.com/KhronosGroup/OpenCL-CLHPP/raw/main/include/CL/opencl.hpp
wget https://github.com/KhronosGroup/OpenCL-CLHPP/raw/main/include/CL/cl2.hpp
@endcode
libOpenCL.so may be provided with the BSP or simply downloaded from any OpenCL-capable Android device with the relevant architecture.
@code{.bash}
cd your_path/ANDROID_OPENCL_SDK && mkdir lib && cd lib
adb pull /system/vendor/lib64/libOpenCL.so
@endcode
The system version of libOpenCL.so may have a lot of platform-specific dependencies. The `-Wl,--allow-shlib-undefined` flag allows
ignoring third-party symbols if they are not used during the build.
The following CMake line links the JNI part against standard OpenCL, but does not bundle the library into the
application package; the system OpenCL API is used at run time.
@code
target_link_libraries(${target} -lOpenCL)
@endcode
2. __Build custom OpenCV Android SDK with OpenCL.__
OpenCL support (T-API) is disabled in OpenCV builds for Android OS by default,
but it's possible to rebuild OpenCV for Android locally with OpenCL/T-API enabled: use the `-DWITH_OPENCL=ON` option for CMake.
You also need to specify the path to the Android OpenCL SDK: use the `-DANDROID_OPENCL_SDK=path_to_your_Android_OpenCL_SDK` option for CMake.
If you are building OpenCV using `build_sdk.py`, please follow the [instruction on the wiki](https://github.com/opencv/opencv/wiki/Custom-OpenCV-Android-SDK-and-AAR-package-build).
Set these CMake parameters in your `.config.py`, e.g. `ndk-18-api-level-21.config.py`:
@code{.py}
ABI("3", "arm64-v8a", None, 21, cmake_vars=dict('WITH_OPENCL': 'ON', 'ANDROID_OPENCL_SDK': 'path_to_your_Android_OpenCL_SDK'))
@endcode
If you are building OpenCV using cmake/ninja, use this bash script (replace NDK_VERSION and the example paths with your own):
@code{.bash}
cd path_to_opencv && mkdir build && cd build
export NDK_VERSION=25.2.9519653
export ANDROID_SDK=/home/user/Android/Sdk/
export ANDROID_OPENCL_SDK=/path_to_ANDROID_OPENCL_SDK/
export ANDROID_HOME=$ANDROID_SDK
export ANDROID_NDK_HOME=$ANDROID_SDK/ndk/$NDK_VERSION/
cmake -GNinja -DCMAKE_TOOLCHAIN_FILE=$ANDROID_NDK_HOME/build/cmake/android.toolchain.cmake -DANDROID_STL=c++_shared -DANDROID_NATIVE_API_LEVEL=24 \
-DANDROID_SDK=$ANDROID_SDK -DANDROID_NDK=$ANDROID_NDK_HOME -DBUILD_JAVA=ON -DANDROID_HOME=$ANDROID_SDK -DBUILD_ANDROID_EXAMPLES=ON \
-DINSTALL_ANDROID_EXAMPLES=ON -DANDROID_ABI=arm64-v8a -DWITH_OPENCL=ON -DANDROID_OPENCL_SDK=$ANDROID_OPENCL_SDK ..
@endcode
Preface
-------
@ -97,74 +145,16 @@ public class Tutorial4Activity extends Activity {
And a minimal `View` class respectively:
@code{.java}
public class MyGLSurfaceView extends GLSurfaceView {
MyGLRendererBase mRenderer;
public MyGLSurfaceView(Context context) {
super(context);
if(android.os.Build.VERSION.SDK_INT >= 21)
mRenderer = new Camera2Renderer(this);
else
mRenderer = new CameraRenderer(this);
setEGLContextClientVersion(2);
setRenderer(mRenderer);
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
}
@Override
public void surfaceCreated(SurfaceHolder holder) {
super.surfaceCreated(holder);
}
@Override
public void surfaceDestroyed(SurfaceHolder holder) {
super.surfaceDestroyed(holder);
}
@Override
public void surfaceChanged(SurfaceHolder holder, int format, int w, int h) {
super.surfaceChanged(holder, format, w, h);
}
@Override
public void onResume() {
super.onResume();
mRenderer.onResume();
}
@Override
public void onPause() {
mRenderer.onPause();
super.onPause();
}
}
@endcode
@snippet samples/android/tutorial-4-opencl/src/org/opencv/samples/tutorial4/MyGLSurfaceView.java minimal_surface_view
__Note__: we use two renderer classes: one for legacy [Camera](http://developer.android.com/reference/android/hardware/Camera.html) API
@note we use two renderer classes: one for legacy [Camera](http://developer.android.com/reference/android/hardware/Camera.html) API
and another for modern [Camera2](http://developer.android.com/reference/android/hardware/camera2/package-summary.html).
A minimal `Renderer` class can be implemented in Java (OpenGL ES 2.0 [available](http://developer.android.com/reference/android/opengl/GLES20.html) in Java),
but since we are going to modify the preview texture with OpenCL let's move OpenGL stuff to JNI.
Here is a simple Java wrapper for our JNI stuff:
@code{.java}
public class NativeGLRenderer {
static
{
System.loadLibrary("opencv_java4"); // comment this when using OpenCV Manager
System.loadLibrary("JNIrender");
}
public static native int initGL();
public static native void closeGL();
public static native void drawFrame();
public static native void changeSize(int width, int height);
}
@endcode
@snippet samples/android/tutorial-4-opencl/src/org/opencv/samples/tutorial4/NativePart.java native_part
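On the native side each of these `native` methods must be backed by a JNI export whose symbol name is derived from the package, class and method names. The following is only a rough sketch of how such exports might look (the exact signatures, return values and the `processFrame()` helper are assumptions, not the sample's actual code):
@code{.cpp}
#include <jni.h>

// Sketch only: JNI exports matching the native methods declared in NativePart.
// Symbol names follow the standard Java_<package>_<Class>_<method> convention.
extern "C" {

JNIEXPORT jint JNICALL
Java_org_opencv_samples_tutorial4_NativePart_initCL(JNIEnv* env, jclass clazz)
{
    return initCL(); // forwards to the C++ initCL() shown later in CLprocessor.cpp
}

JNIEXPORT void JNICALL
Java_org_opencv_samples_tutorial4_NativePart_processFrame(JNIEnv* env, jclass clazz,
        jint texIn, jint texOut, jint w, jint h, jint mode)
{
    processFrame(texIn, texOut, w, h, mode); // assumed C++ helper that dispatches by 'mode'
}

} // extern "C"
@endcode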
Since `Camera` and `Camera2` APIs differ significantly in camera setup and control, let's create a base class for the two corresponding renderers:
@ -275,126 +265,21 @@ After that we can read (_copy_) pixel data from C/C++ via `glReadPixels()` and w
Also, that `GL_TEXTURE_2D` texture can be shared with OpenCL without copying, but we have to create the OpenCL context in a special way for that:
@code{.cpp}
void initCL()
{
EGLDisplay mEglDisplay = eglGetCurrentDisplay();
if (mEglDisplay == EGL_NO_DISPLAY)
LOGE("initCL: eglGetCurrentDisplay() returned 'EGL_NO_DISPLAY', error = %x", eglGetError());
EGLContext mEglContext = eglGetCurrentContext();
if (mEglContext == EGL_NO_CONTEXT)
LOGE("initCL: eglGetCurrentContext() returned 'EGL_NO_CONTEXT', error = %x", eglGetError());
cl_context_properties props[] =
{ CL_GL_CONTEXT_KHR, (cl_context_properties) mEglContext,
CL_EGL_DISPLAY_KHR, (cl_context_properties) mEglDisplay,
CL_CONTEXT_PLATFORM, 0,
0 };
try
{
cl::Platform p = cl::Platform::getDefault();
std::string ext = p.getInfo<CL_PLATFORM_EXTENSIONS>();
if(ext.find("cl_khr_gl_sharing") == std::string::npos)
LOGE("Warning: CL-GL sharing isn't supported by PLATFORM");
props[5] = (cl_context_properties) p();
theContext = cl::Context(CL_DEVICE_TYPE_GPU, props);
std::vector<cl::Device> devs = theContext.getInfo<CL_CONTEXT_DEVICES>();
LOGD("Context returned %d devices, taking the 1st one", devs.size());
ext = devs[0].getInfo<CL_DEVICE_EXTENSIONS>();
if(ext.find("cl_khr_gl_sharing") == std::string::npos)
LOGE("Warning: CL-GL sharing isn't supported by DEVICE");
theQueue = cl::CommandQueue(theContext, devs[0]);
// ...
}
catch(cl::Error& e)
{
LOGE("cl::Error: %s (%d)", e.what(), e.err());
}
catch(std::exception& e)
{
LOGE("std::exception: %s", e.what());
}
catch(...)
{
LOGE( "OpenCL info: unknown error while initializing OpenCL stuff" );
}
LOGD("initCL completed");
}
@endcode
@note To build this JNI code you need __OpenCL 1.2__ headers from [Khronos web site](https://www.khronos.org/registry/cl/api/1.2/) and
the __libOpenCL.so__ downloaded from the device you'll run the application.
@snippet samples/android/tutorial-4-opencl/jni/CLprocessor.cpp init_opencl
Then the texture can be wrapped by a `cl::ImageGL` object and processed via OpenCL calls:
@code{.cpp}
cl::ImageGL imgIn (theContext, CL_MEM_READ_ONLY, GL_TEXTURE_2D, 0, texIn);
cl::ImageGL imgOut(theContext, CL_MEM_WRITE_ONLY, GL_TEXTURE_2D, 0, texOut);
std::vector < cl::Memory > images;
images.push_back(imgIn);
images.push_back(imgOut);
theQueue.enqueueAcquireGLObjects(&images);
theQueue.finish();
cl::Kernel Laplacian = ...
Laplacian.setArg(0, imgIn);
Laplacian.setArg(1, imgOut);
theQueue.finish();
theQueue.enqueueNDRangeKernel(Laplacian, cl::NullRange, cl::NDRange(w, h), cl::NullRange);
theQueue.finish();
theQueue.enqueueReleaseGLObjects(&images);
theQueue.finish();
@endcode
@snippet samples/android/tutorial-4-opencl/jni/CLprocessor.cpp process_pure_opencl
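The creation of the kernel itself is elided above. For illustration only, building such a kernel from OpenCL C source with the C++ wrapper could look roughly like this (the kernel body is a pass-through placeholder, not the Laplacian filter actually used by the sample):
@code{.cpp}
// Illustration only: compile an OpenCL C program and get a kernel object from it.
const char src[] =
    "__kernel void Laplacian(__read_only image2d_t in, __write_only image2d_t out) {\n"
    "    int2 p = (int2)(get_global_id(0), get_global_id(1));\n"
    "    write_imagef(out, p, read_imagef(in, p)); // placeholder: just copies the pixel\n"
    "}\n";
cl::Program program(theContext, std::string(src));
if (program.build() != CL_SUCCESS) // throws instead, if OpenCL C++ exceptions are enabled
    LOGE("Failed to build the placeholder kernel");
cl::Kernel Laplacian(program, "Laplacian");
@endcode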
### OpenCV T-API
But instead of writing OpenCL code yourself you may want to use __OpenCV T-API__ that calls OpenCL implicitly.
All you need is to pass the created OpenCL context to OpenCV (via `cv::ocl::attachContext()`, see the short sketch after the code below) and somehow wrap the OpenGL texture with `cv::UMat`.
Unfortunately `UMat` keeps an OpenCL _buffer_ internally, which can't be wrapped over either an OpenGL _texture_ or an OpenCL _image_ - so we have to copy image data here:
@code{.cpp}
cl::ImageGL imgIn (theContext, CL_MEM_READ_ONLY, GL_TEXTURE_2D, 0, tex);
std::vector < cl::Memory > images(1, imgIn);
theQueue.enqueueAcquireGLObjects(&images);
theQueue.finish();
cv::UMat uIn, uOut, uTmp;
cv::ocl::convertFromImage(imgIn(), uIn);
theQueue.enqueueReleaseGLObjects(&images);
cv::Laplacian(uIn, uTmp, CV_8U);
cv:multiply(uTmp, 10, uOut);
cv::ocl::finish();
cl::ImageGL imgOut(theContext, CL_MEM_WRITE_ONLY, GL_TEXTURE_2D, 0, tex);
images.clear();
images.push_back(imgOut);
theQueue.enqueueAcquireGLObjects(&images);
cl_mem clBuffer = (cl_mem)uOut.handle(cv::ACCESS_READ);
cl_command_queue q = (cl_command_queue)cv::ocl::Queue::getDefault().ptr();
size_t offset = 0;
size_t origin[3] = { 0, 0, 0 };
size_t region[3] = { w, h, 1 };
CV_Assert(clEnqueueCopyBufferToImage (q, clBuffer, imgOut(), offset, origin, region, 0, NULL, NULL) == CL_SUCCESS);
theQueue.enqueueReleaseGLObjects(&images);
cv::ocl::finish();
@endcode
- @note We have to make one more image data copy when placing back the modified image to the original OpenGL texture via OpenCL image wrapper.
- @note By default the OpenCL support (T-API) is disabled in OpenCV builds for Android OS (so it's absent in official packages as of version 3.0),
but it's possible to rebuild locally OpenCV for Android with OpenCL/T-API enabled: use `-DWITH_OPENCL=YES` option for CMake.
@code{.cmd}
cd opencv-build-android
path/to/cmake.exe -GNinja -DCMAKE_MAKE_PROGRAM="path/to/ninja.exe" -DCMAKE_TOOLCHAIN_FILE=path/to/opencv/platforms/android/android.toolchain.cmake -DANDROID_ABI="armeabi-v7a with NEON" -DCMAKE_BUILD_WITH_INSTALL_RPATH=ON path/to/opencv
path/to/ninja.exe install/strip
@endcode
To use your own modified `libopencv_java4.so` you have to keep inside your APK, not to use OpenCV Manager and load it manually via `System.loadLibrary("opencv_java4")`.
@snippet samples/android/tutorial-4-opencl/jni/CLprocessor.cpp process_tapi
@note We have to make one more image data copy when placing back the modified image to the original OpenGL texture via OpenCL image wrapper.
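For reference, handing the OpenCL context created in `initCL()` over to OpenCV is a single call. A minimal sketch, reusing the platform, context and device objects from the initialization code above, might look like this:
@code{.cpp}
// Sketch: attach the already created OpenCL context to OpenCV's T-API so that
// subsequent cv::UMat operations run on the same context and device.
cl::Platform p = cl::Platform::getDefault();
std::vector<cl::Device> devs = theContext.getInfo<CL_CONTEXT_DEVICES>();
cv::ocl::attachContext(p.getInfo<CL_PLATFORM_NAME>(), p(), theContext(), devs[0]());
@endcode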
Performance notes
-----------------

@ -14,11 +14,13 @@ android {
cmake {
if (gradle.opencv_source == "sdk_path") {
arguments "-DOpenCV_DIR=" + project(':opencv').projectDir + "/@ANDROID_PROJECT_JNI_PATH@",
"-DOPENCV_FROM_SDK=TRUE"@OPENCV_ANDROID_CMAKE_EXTRA_ARGS@
"-DOPENCV_FROM_SDK=TRUE",
"-DANDROID_OPENCL_SDK=@ANDROID_OPENCL_SDK@" @OPENCV_ANDROID_CMAKE_EXTRA_ARGS@
} else {
arguments "-DOPENCV_VERSION_MAJOR=@OPENCV_VERSION_MAJOR@",
"-DOPENCV_FROM_SDK=FALSE"@OPENCV_ANDROID_CMAKE_EXTRA_ARGS@
"-DOPENCV_FROM_SDK=FALSE",
"-DANDROID_OPENCL_SDK=@ANDROID_OPENCL_SDK@" @OPENCV_ANDROID_CMAKE_EXTRA_ARGS@
}
targets "JNIpart"
}

@ -1,7 +1,7 @@
#ifdef OPENCL_FOUND
#define __CL_ENABLE_EXCEPTIONS
#define CL_USE_DEPRECATED_OPENCL_1_1_APIS /*let's give a chance for OpenCL 1.1 devices*/
#include <CL/cl.hpp>
#include <CL/opencl.hpp>
#endif
#include <GLES2/gl2.h>
@ -87,10 +87,11 @@ cl::CommandQueue theQueue;
cl::Program theProgB2B, theProgI2B, theProgI2I;
bool haveOpenCL = false;
//![init_opencl]
int initCL()
{
dumpCLinfo();
LOGE("initCL: start initCL");
EGLDisplay mEglDisplay = eglGetCurrentDisplay();
if (mEglDisplay == EGL_NO_DISPLAY)
LOGE("initCL: eglGetCurrentDisplay() returned 'EGL_NO_DISPLAY', error = %x", eglGetError());
@ -156,6 +157,7 @@ int initCL()
else
return 4;
}
//![init_opencl]
#define GL_TEXTURE_2D 0x0DE1
void procOCL_I2I(int texIn, int texOut, int w, int h)
@ -168,6 +170,7 @@ void procOCL_I2I(int texIn, int texOut, int w, int h)
}
LOGD("procOCL_I2I(%d, %d, %d, %d)", texIn, texOut, w, h);
//![process_pure_opencl]
cl::ImageGL imgIn (theContext, CL_MEM_READ_ONLY, GL_TEXTURE_2D, 0, texIn);
cl::ImageGL imgOut(theContext, CL_MEM_WRITE_ONLY, GL_TEXTURE_2D, 0, texOut);
std::vector < cl::Memory > images;
@ -195,6 +198,7 @@ void procOCL_I2I(int texIn, int texOut, int w, int h)
theQueue.enqueueReleaseGLObjects(&images);
theQueue.finish();
LOGD("enqueueReleaseGLObjects() costs %d ms", getTimeInterval(t));
//![process_pure_opencl]
}
void procOCL_OCV(int texIn, int texOut, int w, int h)
@ -206,6 +210,7 @@ void procOCL_OCV(int texIn, int texOut, int w, int h)
return;
}
//![process_tapi]
int64_t t = getTimeMs();
cl::ImageGL imgIn (theContext, CL_MEM_READ_ONLY, GL_TEXTURE_2D, 0, texIn);
std::vector < cl::Memory > images(1, imgIn);
@ -232,11 +237,12 @@ void procOCL_OCV(int texIn, int texOut, int w, int h)
cl_command_queue q = (cl_command_queue)cv::ocl::Queue::getDefault().ptr();
size_t offset = 0;
size_t origin[3] = { 0, 0, 0 };
size_t region[3] = { w, h, 1 };
size_t region[3] = { (size_t)w, (size_t)h, 1 };
CV_Assert(clEnqueueCopyBufferToImage (q, clBuffer, imgOut(), offset, origin, region, 0, NULL, NULL) == CL_SUCCESS);
theQueue.enqueueReleaseGLObjects(&images);
cv::ocl::finish();
LOGD("uploading results to texture costs %d ms", getTimeInterval(t));
//![process_tapi]
}
#else
int initCL()

@ -25,11 +25,18 @@ target_link_libraries(${target} ${ANDROID_OPENCV_COMPONENTS} -lGLESv2 -lEGL -llo
if(OpenCL_FOUND)
include_directories(${OpenCL_INCLUDE_DIRS})
target_link_libraries(${OpenCL_LIBRARIES})
target_link_libraries(${target} ${OpenCL_LIBRARIES})
add_definitions("-DOPENCL_FOUND")
elseif(DEFINED ANDROID_OPENCL_SDK)
elseif(NOT ("${ANDROID_OPENCL_SDK}" STREQUAL ""))
include_directories(${ANDROID_OPENCL_SDK}/include)
link_directories(${ANDROID_OPENCL_SDK}/lib/${ANDROID_NDK_ABI_NAME})
target_link_libraries(-lOpenCL)
link_directories(${ANDROID_OPENCL_SDK}/lib)
target_link_directories(${target} PRIVATE ${ANDROID_OPENCL_SDK}/lib)
set_target_properties(${target} PROPERTIES LINK_FLAGS "-Wl,--allow-shlib-undefined")
target_link_libraries(${target} -lOpenCL)
add_definitions("-DOPENCL_FOUND")
add_definitions("-DCL_HPP_MINIMUM_OPENCL_VERSION=120")
add_definitions("-DCL_HPP_TARGET_OPENCL_VERSION=120")
add_definitions("-DCL_HPP_ENABLE_PROGRAM_CONSTRUCTION_FROM_ARRAY_COMPATIBILITY")
endif()

@ -13,6 +13,7 @@ import android.view.SurfaceHolder;
import android.widget.TextView;
import android.widget.Toast;
//![minimal_surface_view]
public class MyGLSurfaceView extends CameraGLSurfaceView implements CameraGLSurfaceView.CameraTextureListener {
static final String LOGTAG = "MyGLSurfaceView";
@ -111,3 +112,4 @@ public class MyGLSurfaceView extends CameraGLSurfaceView implements CameraGLSurf
return true;
}
}
//![minimal_surface_view]

@ -1,5 +1,6 @@
package org.opencv.samples.tutorial4;
//![native_part]
public class NativePart {
static
{
@ -17,3 +18,4 @@ public class NativePart {
public static native void closeCL();
public static native void processFrame(int tex1, int tex2, int w, int h, int mode);
}
//![native_part]
