Testing the Direct Linear Transformation on a real image (MATLAB source)


I tested the Direct Linear Transformation (DLT) for triangulation, but the method does not work well on a real image.


I tested the following process:
1. Take the left-image and right-image coordinates of matched points from a real stereo pair.
2. Get R, T between the left and right cameras, and the calibration matrix, from the calibration files.
3. Make the projection matrices P1 and P2.
4. Recover the world 3D coordinates using DLT.
5. Re-project the world 3D coordinates back to image coordinates.
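If P1 and P2 are consistent (expressed in the same world frame), steps 3–5 form an exact round trip: projecting, triangulating, and re-projecting should reproduce the input pixels up to numerical precision. A minimal numpy sketch with synthetic data (the K, R, t values below are hypothetical, chosen only for this check):

```python
import numpy as np

# synthetic calibration (hypothetical values, only for the round-trip check)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = np.eye(3)                       # right camera: no rotation, pure baseline
t = np.array([[-0.1], [0.0], [0.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])   # reference camera: P1 = K[I|0]
P2 = K @ np.hstack([R, t])

X = np.array([0.2, -0.1, 4.0, 1.0])        # a world point (homogeneous)
x1 = P1 @ X; x1 = x1[:2] / x1[2]           # project into the left view
x2 = P2 @ X; x2 = x2[:2] / x2[2]           # project into the right view

# DLT: each view contributes two rows of A; the null vector of A is X
A = np.array([x1[0]*P1[2] - P1[0], x1[1]*P1[2] - P1[1],
              x2[0]*P2[2] - P2[0], x2[1]*P2[2] - P2[1]])
Xh = np.linalg.svd(A)[2][-1]               # right singular vector, smallest sigma
Xh /= Xh[3]                                # dehomogenise

# re-project: should match the inputs to numerical precision
r1 = P1 @ Xh; r1 = r1[:2] / r1[2]
r2 = P2 @ Xh; r2 = r2[:2] / r2[2]
assert np.allclose(r1, x1) and np.allclose(r2, x2)
```

So when the round trip fails on real data, the triangulation math is usually fine and the inconsistency is in how P1 and P2 were built.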


The problem: the recovered image points do not match the input image points.

Below is the MATLAB code.
You can also download the entire source code here: < Entire Source Code >

--------------------------------------------------------------------------
%%
clc;
clear all;
close all;

%% Data Loading
load rotation_matrices.txt
load translation_vectors.txt
load intrinsic_matrix.txt
load distortion_coeffs.txt

R = rotation_matrices;

%R1 is the rotation matrix of the left camera relative to the pattern board.
%The pattern board is the world origin (0,0,0).
R1 = reshape(R(19,:),3,3);

%R2 is the right camera rotation matrix
R2 = reshape(R(20,:),3,3);
%T1 is the left camera translation vector
T1 = translation_vectors(19,:)';
%T2 is the right camera translation vector
T2 = translation_vectors(20,:)';
K = intrinsic_matrix;
%Load the matched point coordinates
load pattern19.txt;
load pattern20.txt;
m1 = pattern19';
m2 = pattern20';




%% Make the real R,T (the relation between the left and right cameras)
% R,T are built from R1, R2 and T1, T2
RealT = T2-T1; %just subtract the translations
RealR = R2*inv(R1);
RealA = rodrigues(RealR)*180/pi; %rotation angle in degrees
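One detail worth double-checking at this step: subtracting the translations directly is only valid when both rotations are the identity. Under the common convention x_cam = R*X_world + T (an assumption — it depends on how the calibration toolbox stores R and T), the relative pose comes out as RealR = R2*inv(R1) but RealT = T2 - RealR*T1, not T2 - T1. A small numpy sketch of that derivation:

```python
import numpy as np

def rot_z(a):
    """Rotation about the z axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def relative_pose(R1, T1, R2, T2):
    """Pose mapping left-camera coords to right-camera coords,
    assuming the convention x_cam = R @ X_world + T."""
    R_rel = R2 @ R1.T                 # R1.T == inv(R1) for a rotation
    T_rel = T2 - R_rel @ T1
    return R_rel, T_rel

# check on a synthetic point (all values hypothetical)
R1, T1 = rot_z(0.3), np.array([0.1, 0.2, 0.3])
R2, T2 = rot_z(-0.2), np.array([0.5, 0.0, 0.1])
X = np.array([1.0, 2.0, 5.0])
xc1 = R1 @ X + T1                     # point in left-camera coords
xc2 = R2 @ X + T2                     # point in right-camera coords
R_rel, T_rel = relative_pose(R1, T1, R2, T2)
assert np.allclose(R_rel @ xc1 + T_rel, xc2)   # correct relative pose
assert not np.allclose(T2 - T1, T_rel)         # plain subtraction differs
```

With nonzero pattern-board rotations, the two translation formulas give different P2 matrices, which would explain a reprojection mismatch.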




%% Make the projection matrices P1 and P2
P1 = K*[eye(3) zeros(3,1)]; %P1 is the reference camera, so P1 = K[I|0]
P2 = K*[RealR RealT];




%% Make the 3D points
%Recover the 3D points using P1 and P2
W=[];




%Make 3D coordinates using the Direct Linear Transformation
%W holds the recovered world coordinates (homogeneous)
for i=1:5
    A = [m1(1,i)*P1(3,:) - P1(1,:);
         m1(2,i)*P1(3,:) - P1(2,:);
         m2(1,i)*P2(3,:) - P2(1,:);
         m2(2,i)*P2(3,:) - P2(2,:)];
    %normalize each row of A
    for r=1:4
        A(r,:) = A(r,:)/norm(A(r,:));
    end
    [u d v] = svd(A);
    W = [W v(:,4)/v(4,4)]; %right null vector, dehomogenized
end
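The triangulation loop can also be sanity-checked outside MATLAB; here is a numpy equivalent of the same homogeneous DLT (the K and baseline in the self-check are hypothetical):

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one matched pair (u, v) via the homogeneous DLT."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    A /= np.linalg.norm(A, axis=1, keepdims=True)  # row normalization, as above
    X = np.linalg.svd(A)[2][-1]                    # right null vector of A
    return X / X[3]                                # dehomogenize

# quick self-check with a consistent camera pair (hypothetical K, baseline)
K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
X_true = np.array([0.3, 0.1, 3.0, 1.0])
x1 = P1 @ X_true; x1 = x1[:2] / x1[2]
x2 = P2 @ X_true; x2 = x2[:2] / x2[2]
X_est = triangulate_dlt(P1, P2, x1, x2)
assert np.allclose(X_est, X_true)
```

If this recovers the point exactly with synthetic projections but fails with the real P1, P2 and measured pixels, the inputs (pose, intrinsics, or uncorrected lens distortion) are the likelier cause than the DLT itself.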




%% Re-project the 3D points back to pixel coordinates
% Now make image coordinates from the W matrix using P1, P2
reip1 = P1*W;
%reip1 is the recovered image coordinates
reip1 = [reip1(1,:)./reip1(3,:); reip1(2,:)./reip1(3,:)] %image coordinates recovered from 3D
%m1 is the original image coordinates
m1(:,1:5)
reip2 = P2*W;
%reip2 is the recovered image coordinates
reip2 = [reip2(1,:)./reip2(3,:); reip2(2,:)./reip2(3,:)] %image coordinates recovered from 3D
%m2 is the original image coordinates
m2(:,1:5)
-------------------------------------------------------------------------



But why is reip1 != m1 and reip2 != m2?
I can't figure out the reason.
Please discuss the reason with me~ ^^


(Please excuse my poor English. If you point out my mistakes, I will gladly correct them. Thank you!!)
