Feature Matching


http://docs.opencv.org/3.0-beta/doc/py_tutorials/py_feature2d/py_matcher/py_matcher.html
Goal
In this chapter
We will see how to match features in one image with others. We will use the Brute-Force matcher and FLANN matcher in OpenCV.
Basics of Brute-Force Matcher
Brute-Force matcher is simple. It takes the descriptor of one feature in the first set and matches it with all other features in the second set using some distance calculation. The closest one is returned.
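The idea can be sketched in a few lines of NumPy (illustrative only; in practice you would use cv2.BFMatcher, which does this internally):

```python
import numpy as np

def brute_force_match(des1, des2):
    """For each descriptor in des1, return (query index, index of the
    closest descriptor in des2, L2 distance) - the closest one wins."""
    # pairwise L2 distances, shape (len(des1), len(des2))
    d = np.linalg.norm(des1[:, None, :] - des2[None, :, :], axis=2)
    best = d.argmin(axis=1)
    return [(i, int(j), float(d[i, j])) for i, j in enumerate(best)]

# two made-up 2-D descriptor sets
des1 = np.array([[0.0, 0.0], [5.0, 5.0]])
des2 = np.array([[5.0, 4.0], [1.0, 0.0]])
print(brute_force_match(des1, des2))  # [(0, 1, 1.0), (1, 0, 1.0)]
```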
For BF matcher, first we have to create the BFMatcher object using cv2.BFMatcher(). It takes two optional params. First one is normType. It specifies the distance measurement to be used. By default, it is cv2.NORM_L2. It is good for SIFT, SURF etc. (cv2.NORM_L1 is also there). For binary string based descriptors like ORB, BRIEF, BRISK etc., cv2.NORM_HAMMING should be used, which uses Hamming distance as measurement. If ORB is using WTA_K == 3 or 4, cv2.NORM_HAMMING2 should be used.
Second param is a boolean variable, crossCheck, which is false by default. If it is true, the matcher returns only those matches (i, j) such that the i-th descriptor in set A has the j-th descriptor in set B as the best match and vice versa. That is, the two features in both sets should match each other. It provides consistent results, and is a good alternative to the ratio test proposed by D. Lowe in the SIFT paper.
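The crossCheck condition can be sketched the same way with NumPy (an illustration of the mutual-best-match rule, not OpenCV's actual implementation):

```python
import numpy as np

def mutual_matches(des1, des2):
    """Keep only pairs (i, j) where des1[i] and des2[j] are each other's
    nearest neighbour - the condition crossCheck=True enforces."""
    d = np.linalg.norm(des1[:, None, :] - des2[None, :, :], axis=2)
    best12 = d.argmin(axis=1)  # nearest des2 index for each des1 row
    best21 = d.argmin(axis=0)  # nearest des1 index for each des2 row
    return [(i, int(j)) for i, j in enumerate(best12) if best21[j] == i]

des1 = np.array([[0.0, 0.0], [0.0, 1.0], [9.0, 9.0]])
des2 = np.array([[0.0, 0.0], [9.0, 9.0]])
# des1[1] is dropped: its best match des2[0] prefers des1[0]
print(mutual_matches(des1, des2))  # [(0, 0), (2, 1)]
```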
Once it is created, two important methods are BFMatcher.match() and BFMatcher.knnMatch(). The first one returns the best match. The second method returns the k best matches, where k is specified by the user. It may be useful when we need to do additional work on them.
Like we used cv2.drawKeypoints() to draw keypoints, cv2.drawMatches() helps us to draw the matches. It stacks two images horizontally and draws lines from the first image to the second showing the best matches. There is also cv2.drawMatchesKnn which draws all the k best matches. If k=2, it will draw two match-lines for each keypoint. So we have to pass a mask if we want to draw them selectively.
Let's see one example for each of ORB and SIFT (both use different distance measurements).
Brute-Force Matching with ORB Descriptors
Here, we will see a simple example on how to match features between two images. In this case, I have a queryImage and a trainImage. We will try to find the queryImage in the trainImage using feature matching. (The images are /samples/c/box.png and /samples/c/box_in_scene.png in the OpenCV samples.)
We are using ORB descriptors to match features. So let's start with loading images, finding descriptors etc.
import numpy as np
import cv2
from matplotlib import pyplot as plt

img1 = cv2.imread('box.png',0)          # queryImage
img2 = cv2.imread('box_in_scene.png',0) # trainImage

# Initiate ORB detector
orb = cv2.ORB_create()

# find the keypoints and descriptors with ORB
kp1, des1 = orb.detectAndCompute(img1,None)
kp2, des2 = orb.detectAndCompute(img2,None)
Next we create a BFMatcher object with distance measurement cv2.NORM_HAMMING (since we are using ORB) and crossCheck switched on for better results. Then we use the Matcher.match() method to get the best matches between the two images. We sort them in ascending order of their distance so that the best matches (with low distance) come to the front.
# create BFMatcher object
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)

# Match descriptors.
matches = bf.match(des1,des2)

# Sort them in the order of their distance.
matches = sorted(matches, key = lambda x:x.distance)

# Draw first 10 matches.
img3 = cv2.drawMatches(img1,kp1,img2,kp2,matches[:10],None,flags=2)

plt.imshow(img3),plt.show()
Below is the result I got:
(image: result of ORB descriptor matching)
What is this Matcher Object?
The result of the matches = bf.match(des1,des2) line is a list of DMatch objects. This DMatch object has the following attributes:
  • DMatch.distance - Distance between descriptors. The lower, the better it is.
  • DMatch.trainIdx - Index of the descriptor in train descriptors
  • DMatch.queryIdx - Index of the descriptor in query descriptors
  • DMatch.imgIdx - Index of the train image
Brute-Force Matching with SIFT Descriptors and Ratio Test
This time, we will use BFMatcher.knnMatch() to get the k best matches. In this example, we will take k=2 so that we can apply the ratio test explained by D. Lowe in his paper.
import numpy as np
import cv2
from matplotlib import pyplot as plt

img1 = cv2.imread('box.png',0)          # queryImage
img2 = cv2.imread('box_in_scene.png',0) # trainImage

# Initiate SIFT detector
sift = cv2.SIFT_create()  # cv2.xfeatures2d.SIFT_create() on OpenCV 3.x contrib builds

# find the keypoints and descriptors with SIFT
kp1, des1 = sift.detectAndCompute(img1,None)
kp2, des2 = sift.detectAndCompute(img2,None)

# BFMatcher with default params
bf = cv2.BFMatcher()
matches = bf.knnMatch(des1,des2, k=2)

# Apply ratio test
good = []
for m,n in matches:
    if m.distance < 0.75*n.distance:
        good.append([m])

# cv2.drawMatchesKnn expects list of lists as matches.
img3 = cv2.drawMatchesKnn(img1,kp1,img2,kp2,good,None,flags=2)

plt.imshow(img3),plt.show()
See the result below:
(image: result of SIFT matching with ratio test)
FLANN based Matcher
FLANN stands for Fast Library for Approximate Nearest Neighbors. It contains a collection of algorithms optimized for fast nearest neighbor search in large datasets and for high dimensional features. It works faster than BFMatcher for large datasets. We will see the second example with a FLANN based matcher.
For the FLANN based matcher, we need to pass two dictionaries which specify the algorithm to be used, its related parameters etc. First one is IndexParams. For various algorithms, the information to be passed is explained in the FLANN docs. As a summary, for algorithms like SIFT, SURF etc. you can pass the following:
index_params = dict(algorithm = FLANN_INDEX_KDTREE, trees = 5)
While using ORB, you can pass the following. The commented values are recommended as per the docs, but they didn't provide the required results in some cases. Other values worked fine:
index_params= dict(algorithm = FLANN_INDEX_LSH,
                   table_number = 6, # 12
                   key_size = 12,     # 20
                   multi_probe_level = 1) #2
Second dictionary is the SearchParams. It specifies the number of times the trees in the index should be recursively traversed. Higher values give better precision, but also take more time. If you want to change the value, pass search_params = dict(checks=100).
With this information, we are good to go.
import numpy as np
import cv2
from matplotlib import pyplot as plt

img1 = cv2.imread('box.png',0)          # queryImage
img2 = cv2.imread('box_in_scene.png',0) # trainImage

# Initiate SIFT detector
sift = cv2.SIFT_create()  # cv2.xfeatures2d.SIFT_create() on OpenCV 3.x contrib builds

# find the keypoints and descriptors with SIFT
kp1, des1 = sift.detectAndCompute(img1,None)
kp2, des2 = sift.detectAndCompute(img2,None)

# FLANN parameters
FLANN_INDEX_KDTREE = 0
index_params = dict(algorithm = FLANN_INDEX_KDTREE, trees = 5)
search_params = dict(checks=50)   # or pass empty dictionary

flann = cv2.FlannBasedMatcher(index_params,search_params)

matches = flann.knnMatch(des1,des2,k=2)

# Need to draw only good matches, so create a mask
matchesMask = [[0,0] for i in range(len(matches))]

# ratio test as per Lowe's paper
for i,(m,n) in enumerate(matches):
    if m.distance < 0.7*n.distance:
        matchesMask[i]=[1,0]

draw_params = dict(matchColor = (0,255,0),
                   singlePointColor = (255,0,0),
                   matchesMask = matchesMask,
                   flags = 0)

img3 = cv2.drawMatchesKnn(img1,kp1,img2,kp2,matches,None,**draw_params)

plt.imshow(img3),plt.show()
See the result below:
(image: result of FLANN based matching)
Additional Resources
Exercises