I tested the Direct Linear Transformation (DLT).
But the method does not work well on real images.
I tested the following process:
1. Take the left and right image coordinates from a real image pair. These are matched points.
2. Get R, T between the left and right cameras, and the calibration matrix, from file.
3. Make the projection matrices (P1, P2).
4. Recover the world 3D coordinates using DLT (a minimal sketch of this step follows the list).
5. Reproject the world 3D coordinates back to image coordinates.
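For reference, here is a minimal sketch of step 4 for a single matched point pair, written as a standalone MATLAB function. P1 and P2 are assumed to be 3x4 projection matrices and x1, x2 the matched pixel coordinates (2x1 vectors); the function name is just for illustration.

function X = triangulate_dlt(P1, P2, x1, x2)
% Triangulate one 3D point from two views with the Direct Linear Transformation.
% P1, P2 : 3x4 projection matrices; x1, x2 : 2x1 matched pixel coordinates.
A = [ x1(1)*P1(3,:) - P1(1,:);
      x1(2)*P1(3,:) - P1(2,:);
      x2(1)*P2(3,:) - P2(1,:);
      x2(2)*P2(3,:) - P2(2,:) ];
[U, D, V] = svd(A);     % the solution is the right singular vector of the smallest singular value
X = V(:,4) / V(4,4);    % homogeneous 4x1 point, scaled so that X(4) = 1
end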
The problem: the recovered image points do not match the input image points.
Below is the MATLAB code.
You can also download it here: < Entire Source Code >
--------------------------------------------------------------------------
clc; clear all; close all;

%% Data Loading
load rotation_matrices.txt
load translation_vectors.txt
load intrinsic_matrix.txt
load distortion_coeffs.txt

R = rotation_matrices;
% R1 is the Rotation Matrix of the Left Camera relative to the pattern board.
% The pattern board has the origin coordinate (0,0,0).
R1 = reshape(R(19,:),3,3);
% R2 is the Right Camera Rotation Matrix.
R2 = reshape(R(20,:),3,3);
% T1 is the Left Camera Translation Vector.
T1 = translation_vectors(19,:)';
% T2 is the Right Camera Translation Vector.
T2 = translation_vectors(20,:)';
K = intrinsic_matrix;

% Load the matched coordinates.
load pattern19.txt;
load pattern20.txt;
m1 = pattern19';
m2 = pattern20';

%% Make the real R, T that relate the Left and Right cameras
% R, T are made from R1, R2 and T1, T2.
RealT = T2 - T1;                    % T can just be obtained by subtraction.
RealR = R2*inv(R1);
RealA = rodrigues(RealR)*180/pi;    % This is the angle in degrees.

%% Make the projection matrices P1, P2
P1 = K*[eye(3) zeros(3,1)];   % P1 is the reference camera, so P1 = K[I | 0].
P2 = K*[RealR RealT];

%% Make the 3D points
% Reconstruct 3D points from P1, P2 using the Direct Linear Transformation.
% W holds the world coordinates.
W = [];
for i = 1:5
    A = [ m1(1,i)*P1(3,:) - P1(1,:);
          m1(2,i)*P1(3,:) - P1(2,:);
          m2(1,i)*P2(3,:) - P2(1,:);
          m2(2,i)*P2(3,:) - P2(2,:) ];
    A(1,:) = A(1,:)/norm(A(1,:));
    A(2,:) = A(2,:)/norm(A(2,:));
    A(3,:) = A(3,:)/norm(A(3,:));
    A(4,:) = A(4,:)/norm(A(4,:));
    [u d v] = svd(A);
    W = [W v(:,4)/v(4,4)];
end

%% Make pixel points again from the 3D points
% Now make image coordinates from the W matrix using P1, P2.
reip1 = P1*W;
% reip1 is the image coordinate recovered from the 3D points.
reip1 = [reip1(1,:)./reip1(3,:); reip1(2,:)./reip1(3,:)]
% m1 is the original image coordinate.
m1(:,1:5)

reip2 = P2*W;
% reip2 is the image coordinate recovered from the 3D points.
reip2 = [reip2(1,:)./reip2(3,:); reip2(2,:)./reip2(3,:)]
% m2 is the original image coordinate.
m2(:,1:5)
--------------------------------------------------------------------------
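For reference, the mismatch can be quantified as a per-point reprojection error in pixels. This is a minimal sketch using the reip1, reip2, m1, m2 names from the code above, assuming they are 2xN arrays:

% Per-point reprojection error in pixels (distance between recovered and
% original coordinates for each of the 5 points).
err1 = sqrt(sum((reip1 - m1(:,1:5)).^2, 1))   % 1x5 vector for the left view
err2 = sqrt(sum((reip2 - m2(:,1:5)).^2, 1))   % 1x5 vector for the right view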
But why is reip1 != m1 and reip2 != m2?
I cannot figure out the reason.
Please discuss the reason with me~ ^^
(Please excuse my poor English. If you point out my mistakes, I will gladly correct them. Thank you!!)