diff --git a/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.markdown b/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.markdown
index ca7853d96a..d8ba8f856d 100644
--- a/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.markdown
+++ b/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.markdown
@@ -38,13 +38,13 @@ best matches. There is also **cv.drawMatchesKnn** which draws all the k best mat
 will draw two match-lines for each keypoint. So we have to pass a mask if we want to selectively
 draw it.

-Let's see one example for each of SURF and ORB (Both use different distance measurements).
+Let's see one example for each of SIFT and ORB (Both use different distance measurements).

 ### Brute-Force Matching with ORB Descriptors

 Here, we will see a simple example on how to match features between two images. In this case, I
 have a queryImage and a trainImage. We will try to find the queryImage in trainImage using feature
-matching. ( The images are /samples/c/box.png and /samples/c/box_in_scene.png)
+matching. ( The images are /samples/data/box.png and /samples/data/box_in_scene.png)

 We are using ORB descriptors to match features. So let's start with loading images, finding
 descriptors etc.
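
The section being patched goes on to brute-force match the ORB descriptors of the two sample images. As context for the path and name changes above, here is a minimal sketch of that workflow, assuming OpenCV's Python bindings (`cv2`) are installed and that the updated sample paths are available relative to the working directory:

```python
# Minimal sketch of brute-force matching with ORB descriptors, as described in the
# patched tutorial text. Image paths assume a local copy of OpenCV's samples/data.
import cv2 as cv

# Load the query and train images in grayscale.
img1 = cv.imread('samples/data/box.png', cv.IMREAD_GRAYSCALE)           # queryImage
img2 = cv.imread('samples/data/box_in_scene.png', cv.IMREAD_GRAYSCALE)  # trainImage

# Detect keypoints and compute binary ORB descriptors for both images.
orb = cv.ORB_create()
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Brute-force matcher with Hamming distance, suitable for ORB's binary descriptors.
bf = cv.BFMatcher(cv.NORM_HAMMING, crossCheck=True)
matches = sorted(bf.match(des1, des2), key=lambda m: m.distance)

# Draw the 10 best matches and save the visualization.
out = cv.drawMatches(img1, kp1, img2, kp2, matches[:10], None,
                     flags=cv.DrawMatchesFlags_NOT_DRAW_SINGLE_POINTS)
cv.imwrite('orb_bf_matches.png', out)
```

This also illustrates the "different distance measurements" remark in the changed line: ORB's binary descriptors are compared with Hamming distance (`cv.NORM_HAMMING`), whereas SIFT's floating-point descriptors would use Euclidean distance (`cv.NORM_L2`).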