Problem with undistortPoints() function in pose estimation of image

I have written about my task here. I have a set of images with known poses that were used for scene reconstruction, and a query image from the same space whose pose is unknown. I need to estimate the pose of the query image. I solved this problem using the essential matrix. Here is the code:

Mat E = findEssentialMat(pts1, pts2, focal, pp, FM_RANSAC, F_DIST, F_CONF, mask);
// Recover the relative rotation and translation between the two views
Mat R, t;
recoverPose(E, pts1, pts2, R, t, focal, pp, mask);

The only problem is that the OpenCV documentation states that findEssentialMat() assumes points1 and points2 are feature points from cameras with the same intrinsic matrix. That's not the case for us: the scene images and the query image can be captured by cameras with different intrinsics. I suppose I should use the undistortPoints() function. According to the documentation, undistortPoints() takes two important parameters, distCoeffs and cameraMatrix. Both the scene images and the query image have calibration parameters associated with them (fx, fy, cx, cy). I build the cameraMatrix parameter this way:

Mat K_v = (Mat_<double>(3, 3) <<
    fx, 0, cx,
    0, fy, cy,
    0, 0, 1); // Mat_<double> already implies CV_64F; passing it as an extra element is a bug

Is this correct? Moreover, I still need distCoeffs from somewhere. How can I obtain the distortion coefficients for the scene images and the query image? Or should I solve this another way?

Copyright OpenCV foundation, 2012-2018. Content on this site is licensed under a Creative Commons Attribution Share Alike 3.0 license.