-# Then if we have an image with a white background, it is good to transform it to black. This will help us discriminate the foreground objects more easily when we apply the Distance Transform:
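A rough sketch of how this can be done, assuming the loaded image is in a `cv::Mat` called `src` (the variable name and the masking approach are illustrative, not necessarily the tutorial's own snippet):
@code{.cpp}
// Build a mask of the pure-white pixels and paint them black
cv::Mat mask;
cv::inRange(src, cv::Scalar(255, 255, 255), cv::Scalar(255, 255, 255), mask);
src.setTo(cv::Scalar(0, 0, 0), mask);
@endcode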
-# Now we transform our new sharpened source image to a grayscale and a binary one, respectively:
@snippet samples/cpp/tutorial_code/ImgTrans/imageSegmentation.cpp bin
![](images/bin.jpeg)
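The referenced `bin` snippet is not reproduced here; a minimal sketch of the same idea, assuming the sharpened result sits in `src` (the names and the Otsu-based threshold are illustrative):
@code{.cpp}
cv::Mat gray, bw;
cv::cvtColor(src, gray, cv::COLOR_BGR2GRAY);        // grayscale copy of the sharpened image
cv::threshold(gray, bw, 40, 255,
              cv::THRESH_BINARY | cv::THRESH_OTSU); // Otsu computes the threshold; the 40 is ignored in that case
@endcode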
-# We are ready now to apply the Distance Transform on the binary image. Moreover, we normalize the output image in order to be able to visualize and threshold the result:
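As a hedged sketch, assuming the binary image from the previous step is called `bw` (the `cv::DIST_L2` metric and 3x3 mask size are typical choices, shown here for illustration):
@code{.cpp}
cv::Mat dist;
cv::distanceTransform(bw, dist, cv::DIST_L2, 3);      // distance of every foreground pixel to the nearest zero pixel
cv::normalize(dist, dist, 0.0, 1.0, cv::NORM_MINMAX); // rescale to [0, 1] so it can be displayed and thresholded
@endcode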
All specialized `ocl` implementations have been hidden behind a general C++ algorithm interface. Now the function execution path can be selected dynamically at runtime: CPU or OpenCL; this mechanism is also called "Transparent API".
The new class cv::UMat is intended to hide data exchange with the OpenCL device in a convenient way.
The following example illustrates the API modifications (from [OpenCV site](http://opencv.org/platforms/opencl.html)):
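The original listing from that page is not reproduced here; the sketch below conveys the idea for a simple edge-detection pipeline (file names are placeholders). The only source change needed to opt into OpenCL execution is declaring the images as cv::UMat instead of cv::Mat:
@code{.cpp}
#include <opencv2/core.hpp>
#include <opencv2/imgcodecs.hpp>
#include <opencv2/imgproc.hpp>

int main()
{
    cv::UMat img, gray;                                  // UMat instead of Mat: data may live on the OpenCL device
    cv::imread("image.jpg", cv::IMREAD_COLOR).copyTo(img);

    cv::cvtColor(img, gray, cv::COLOR_BGR2GRAY);         // these calls transparently use OpenCL when available
    cv::GaussianBlur(gray, gray, cv::Size(7, 7), 1.5);
    cv::Canny(gray, gray, 0, 50);

    cv::imwrite("edges.png", gray);                      // implicit download back to host memory
    return 0;
}
@endcode
With cv::Mat in place of cv::UMat the identical calls run on the CPU, which is the essence of the Transparent API.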
### Important notes
- If you come across any tutorial mentioning the old opencv_haartraining tool <i>(which is deprecated and still using the OpenCV1.x interface)</i>, then please ignore that tutorial and stick to the opencv_traincascade tool. This tool is a newer version, written in C++ in accordance with the OpenCV 2.x and OpenCV 3.x API. The opencv_traincascade tool supports both HAAR-like wavelet features @cite Viola01 and LBP (Local Binary Patterns) @cite Liao2007 features. LBP features yield integer precision, in contrast to HAAR features, which yield floating point precision, so both training and detection with LBP are several times faster than with HAAR features. Regarding the LBP and HAAR detection quality, it mainly depends on the training data used and the training parameters selected. It's possible to train an LBP-based classifier that will provide almost the same quality as a HAAR-based one, in a fraction of the training time.
- The newer cascade classifier detection interface from OpenCV 2.x and OpenCV 3.x (@ref cv::CascadeClassifier) supports working with both old and new model formats. opencv_traincascade can even save (export) a trained cascade in the older format if for some reason you are stuck using the old interface. At least training the model could then be done in the most stable interface; a minimal usage sketch of this interface is shown below.
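A minimal C++ sketch of using that detection interface with a trained model (the file names and parameters here are placeholders, not outputs of this tutorial):
@code{.cpp}
#include <opencv2/objdetect.hpp>
#include <opencv2/imgcodecs.hpp>
#include <iostream>
#include <vector>

int main()
{
    cv::CascadeClassifier cascade;
    if (!cascade.load("cascade.xml"))                  // accepts both old- and new-format models
        return 1;

    cv::Mat gray = cv::imread("scene.jpg", cv::IMREAD_GRAYSCALE);
    std::vector<cv::Rect> objects;
    cascade.detectMultiScale(gray, objects, 1.1, 3);   // scale factor 1.1, at least 3 neighbours per detection

    std::cout << "Detections: " << objects.size() << std::endl;
    return 0;
}
@endcode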
Command line arguments:
- `-num <number_of_samples>` : Number of positive samples to generate.
- `-bgcolor <background_color>` : Background color (currently grayscale images are assumed); the background color denotes the transparent color. Since there might be compression artifacts, the amount of color tolerance can be specified by -bgthresh. All pixels within the bgcolor-bgthresh and bgcolor+bgthresh range are interpreted as transparent.
- `-bgthresh <background_color_threshold>`
- `-inv` : If specified, colors will be inverted.
Training is finished and you can test your cascade classifier!
Visualising Cascade Classifiers
-------------------------------
From time to time it can be useful to visualise the trained cascade, to see which features it selected and how complex its stages are. For this OpenCV supplies an opencv_visualisation application. This application has the following commands:
- `--image`<b>(required)</b> : path to a reference image for your object model. This should be an annotation with dimensions [`-w`,`-h`] as passed to both opencv_createsamples and opencv_traincascade application.
- `--model`<b>(required)</b> : path to the trained model, which should be in the folder supplied to the `-data` parameter of the opencv_traincascade application.
parser.add_argument('--no_ccache',action="store_true",help="Do not use ccache during library build")
parser.add_argument('--extra_pack',action='append',help="provide extra OpenCV libraries for Manager apk in form <version>:<path-to-native-libs>, for example '2.4.11:unpacked/sdk/native/libs'")
Generally all that is required is the standard Maven command.
One of the first things the build will do is check the required native dependencies. The Maven build indicates the status of the required dependencies and will fail at this point if any are missing. Install using the package manager e.g. aptitude or apt-get, and restart the build with the above command.
Once the build successfully completes, the OSGi compatible artifacts are available as described above in 'Build Directory'.
#### 3.4 - ARM 32-bit Architecture - Raspbian Distribution
Similar to the x86 architecture, the native dependencies are first checked, so install any that are missing; however, at the time of writing there are no official `libtbb2` and `libtbb-dev` packages in Raspbian. Version 4.4.3 of Intel's Thread Building Blocks library is available [here](http://www.javatechnics.com/thread-building-blocks-tbb-4-4-3-for-raspbian) as a Raspbian-compatible Debian package.