@@ -2495,10 +2495,10 @@ be floating-point (single or double precision).
 @param points2 Array of the second image points of the same size and format as points1.
 @param cameraMatrix Camera intrinsic matrix \f$\cameramatrix{A}\f$.
 Note that this function assumes that points1 and points2 are feature points from cameras with the
-same camera intrinsic matrix. If this assumption does not hold for your use case, use
-#undistortPoints with `P = cv::NoArray()` for both cameras to transform image points
-to normalized image coordinates, which are valid for the identity camera intrinsic matrix. When
-passing these coordinates, pass the identity matrix for this parameter.
+same camera intrinsic matrix. If this assumption does not hold for your use case, use another
+function overload or #undistortPoints with `P = cv::NoArray()` for both cameras to transform image
+points to normalized image coordinates, which are valid for the identity camera intrinsic matrix.
+When passing these coordinates, pass the identity matrix for this parameter.
 @param method Method for computing an essential matrix.
 -   @ref RANSAC for the RANSAC algorithm.
 -   @ref LMEDS for the LMedS algorithm.
@@ -2591,22 +2591,12 @@ Mat findEssentialMat(
 @param points1 Array of N (N \>= 5) 2D points from the first image. The point coordinates should
 be floating-point (single or double precision).
 @param points2 Array of the second image points of the same size and format as points1.
-@param cameraMatrix1 Camera matrix \f$K = \vecthreethree{f_x}{0}{c_x}{0}{f_y}{c_y}{0}{0}{1}\f$.
-Note that this function assumes that points1 and points2 are feature points from cameras with the
-same camera matrix. If this assumption does not hold for your use case, use
-#undistortPoints with `P = cv::NoArray()` for both cameras to transform image points
-to normalized image coordinates, which are valid for the identity camera matrix. When
-passing these coordinates, pass the identity matrix for this parameter.
-@param cameraMatrix2 Camera matrix \f$K = \vecthreethree{f_x}{0}{c_x}{0}{f_y}{c_y}{0}{0}{1}\f$.
-Note that this function assumes that points1 and points2 are feature points from cameras with the
-same camera matrix. If this assumption does not hold for your use case, use
-#undistortPoints with `P = cv::NoArray()` for both cameras to transform image points
-to normalized image coordinates, which are valid for the identity camera matrix. When
-passing these coordinates, pass the identity matrix for this parameter.
-@param distCoeffs1 Input vector of distortion coefficients
+@param cameraMatrix1 Camera matrix for the first camera \f$K = \vecthreethree{f_x}{0}{c_x}{0}{f_y}{c_y}{0}{0}{1}\f$.
+@param cameraMatrix2 Camera matrix for the second camera \f$K = \vecthreethree{f_x}{0}{c_x}{0}{f_y}{c_y}{0}{0}{1}\f$.
+@param distCoeffs1 Input vector of distortion coefficients for the first camera
 \f$(k_1, k_2, p_1, p_2[, k_3[, k_4, k_5, k_6[, s_1, s_2, s_3, s_4[, \tau_x, \tau_y]]]])\f$
 of 4, 5, 8, 12 or 14 elements. If the vector is NULL/empty, the zero distortion coefficients are assumed.
-@param distCoeffs2 Input vector of distortion coefficients
+@param distCoeffs2 Input vector of distortion coefficients for the second camera
 \f$(k_1, k_2, p_1, p_2[, k_3[, k_4, k_5, k_6[, s_1, s_2, s_3, s_4[, \tau_x, \tau_y]]]])\f$
 of 4, 5, 8, 12 or 14 elements. If the vector is NULL/empty, the zero distortion coefficients are assumed.
 @param method Method for computing an essential matrix.
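
For reference, a minimal C++ sketch of the workflow these notes describe, assuming two cameras with different intrinsics: normalize each camera's points with undistortPoints while leaving P empty (cv::noArray() in the C++ API), then call findEssentialMat with the identity matrix as the camera intrinsic matrix. The helper name essentialFromTwoCameras and the RANSAC threshold value are illustrative only and not part of this patch.

    #include <opencv2/calib3d.hpp>
    #include <vector>

    // Hypothetical helper (not part of OpenCV): estimate E when the two views come from
    // cameras with different intrinsics K1/K2 and distortion coefficients D1/D2.
    cv::Mat essentialFromTwoCameras(const std::vector<cv::Point2f>& pts1,
                                    const std::vector<cv::Point2f>& pts2,
                                    const cv::Mat& K1, const cv::Mat& D1,
                                    const cv::Mat& K2, const cv::Mat& D2)
    {
        // Map pixel coordinates to normalized image coordinates. Leaving R and P empty
        // (cv::noArray()) keeps the output in normalized coordinates instead of re-projecting.
        std::vector<cv::Point2f> n1, n2;
        cv::undistortPoints(pts1, n1, K1, D1, cv::noArray(), cv::noArray());
        cv::undistortPoints(pts2, n2, K2, D2, cv::noArray(), cv::noArray());

        // Normalized coordinates are valid for the identity intrinsic matrix, so pass I here.
        // The RANSAC threshold is expressed in the same (normalized) units; 1e-3 is an
        // illustrative value, not a recommendation from the patch.
        cv::Mat identityK = cv::Mat::eye(3, 3, CV_64F);
        return cv::findEssentialMat(n1, n2, identityK, cv::RANSAC, 0.999, 1e-3);
    }

The same point sets could instead be passed to the two-camera overload documented by this patch; the sketch above only exercises the single-cameraMatrix variant described in the notes.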