Showing posts with label Unsolved Algorithm ( I need your thought ). Show all posts

2/10/2016

I need your help: "cuda::meanShiftSegmentation error !!", the error message is "The function/feature is not implemented..."

Someone asked me about this problem, but I have run into it as well and don't know the solution.
I need your help.


The source code is below (OpenCV version 3.0).
...
#include "opencv2/opencv.hpp"
#include "opencv2/core/cuda.hpp"
#include "opencv2/cudaimgproc.hpp"

using namespace cv;

int main()
{


 Mat pictureBGR;
 Mat pictureSegment;
 pictureBGR = imread("2.png"); //load the image to search

 cuda::GpuMat bgr;
 cuda::GpuMat convertedBgr;
 cuda::GpuMat dstBgr;

   
 bgr.upload(pictureBGR);
 cuda::cvtColor(bgr, convertedBgr, COLOR_BGR2BGRA);

 TermCriteria criteria(TermCriteria::EPS, 0, 0.08);
 //note: the OpenCV 3.x docs say meanShiftSegmentation's dst resides in host
 //memory, so a cv::Mat may be expected here instead of a GpuMat
 cuda::meanShiftSegmentation(convertedBgr, dstBgr, 20, 20, 3, criteria);
 dstBgr.download(pictureSegment);


 namedWindow("Display Image", CV_WINDOW_AUTOSIZE);
 imshow("Display Image", pictureSegment);
 waitKey(0);
 return 0;
}
...

The error message is:

OpenCV Error: The function/feature is not implemented (You should explicitly call download method for cuda::GpuMat object) in getMat_, file /home/zburazin/opencv-3.0.0/modules/core/src/matrix.cpp, line 1211
terminate called after throwing an instance of 'cv::Exception'
  what():  /home/zburazin/opencv-3.0.0/modules/core/src/matrix.cpp:1211: error: (-213) You should explicitly call download method for cuda::GpuMat object in function getMat_



In the past I used the mean-shift function as shown here:
http://study.marearts.com/2014/12/opencv-meanshiftfiltering-example.html
But the old code fails in the same way under this version.

Does anyone know the cause?

Thank you.
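For reference while debugging, the procedure behind mean-shift segmentation (repeatedly shifting each sample to the mean of its neighbours until it settles on a mode) can be sketched in plain Python. This is a minimal 1-D flat-kernel sketch of the algorithm, not OpenCV's implementation; the data and the radius are made up:

```python
def mean_shift(points, radius=2.0, max_iter=50, eps=1e-3):
    """Shift every point to the mean of its neighbours until it converges on a mode."""
    modes = []
    for p in points:
        x = float(p)
        for _ in range(max_iter):
            neighbours = [q for q in points if abs(q - x) <= radius]
            new_x = sum(neighbours) / len(neighbours)
            if abs(new_x - x) < eps:
                break
            x = new_x
        modes.append(x)
    return modes

# two well-separated clusters collapse onto two modes
data = [1.0, 1.2, 0.8, 10.0, 10.3, 9.7]
modes = mean_shift(data)
```

Segmentation then amounts to grouping pixels whose converged modes coincide.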



3/18/2014

Python Study, Threading example source code

#Thread.start() //thread start
#Thread.run()   //thread main function run
#Thread.join([timeout]) //wait until thread end


from threading import Thread, Lock
import time

count = 10
lock = Lock()



class developer(Thread):

    def __init__(self, name):   #initialize
        Thread.__init__(self)
        self.name = name
        self.fixed = 0

    def run(self):              #thread main function


        global count
        while 1:
            #lock.acquire()      #lock -> uncommenting the lock lines raises an error
            if count > 0:
                count -= 1
                #lock.release()  #unlock -> the original read 'lock.rlease()', a typo that raises AttributeError once uncommented
                self.fixed += 1
                time.sleep(0.1)
            else:
                #lock.release()  #unlock
                break




dev_list = []
for name in ['Mare1', 'Mare2', 'Mare3']:
    dev = developer(name)       #create thread
    dev_list.append(dev)
    dev.start()                 #thread start


for dev in dev_list:
    dev.join()                  #wait
    print(dev.name, 'fixed', dev.fixed)


#Mare1 fixed 3
#Mare2 fixed 4
#Mare3 fixed 3
#(the split between threads varies from run to run; the total is always 10)
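The same counter can also be written with the lock held as a context manager, which makes the acquire/release pairing impossible to get wrong. A minimal sketch; the names `worker` and `fixed` are my own, not from the code above:

```python
from threading import Thread, Lock

count = 10
lock = Lock()
fixed = {}   # per-thread tally of decrements

def worker(name):
    global count
    done = 0
    while True:
        with lock:            # acquired here, released on every exit path
            if count <= 0:
                break
            count -= 1
        done += 1
    fixed[name] = done

threads = [Thread(target=worker, args=(n,)) for n in ['Mare1', 'Mare2', 'Mare3']]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(fixed, count)   # every decrement is counted exactly once; count ends at 0
```

Because `with lock:` releases on `break` as well as on normal exit, there is no path that leaves the lock held.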

9/11/2011

When generating a C++ project for Sparse Bundle Adjustment (v1.6) with CMake (v2.8.4), do you know why warning messages occur?

First, I downloaded SBA (Sparse Bundle Adjustment v1.6) from <here>,
and I prepared the CLAPACK library from <here>.
My CMake version is 2.8.4.

Step 1. Directory setting


Step 2. Configure, then Finish (in the popup window, after selecting the generator; in my case, VS 2008)


Step 3. After Step 2, I corrected the wrong properties in the list (marked in red)



Step 4. Re-Configure


Step 5. Some warnings appeared, but I proceeded with Generate anyway.


Step 6. CMake generated the project files in my target directory, but I cannot be sure whether they were generated correctly.


Can anyone help me? Why did these warning messages appear, and how do I fix them? Please give me your thoughts. Thank you!!

8/17/2011

Test of the Direct Linear Transformation on a real image (matlab source)


I tested the Direct Linear Transformation,
but the method does not work well on a real image.


I tested the following process:
1. Take matched left and right image coordinates from a real image pair.
2. Get R, T between the left and right cameras, and the calibration matrix, from file.
3. Build the projection matrices P1 and P2.
4. Get the world 3D coordinates using DLT.
5. Project the world 3D coordinates back to image coordinates.


The problem:
the recovered image points and the input image points do not match.

Below is the matlab code,
and you can download the entire source code here: < Entire Source Code >

--------------------------------------------------------------------------
%%
clc;
clear all;
close all;

%% Data Loading
load rotation_matrices.txt
load translation_vectors.txt
load intrinsic_matrix.txt
load distortion_coeffs.txt

R = rotation_matrices;

%R1 is the rotation matrix of the left camera relative to the pattern board;
%the pattern board carries the origin coordinate (0,0,0)
R1 = reshape(R(19,:),3,3);

%R2 is the right camera rotation matrix.
R2 = reshape(R(20,:),3,3);
%T1 is the left camera translation vector
T1 = translation_vectors(19,:)';
%T2 is the right camera translation vector
T2 = translation_vectors(20,:)';
K = intrinsic_matrix;
%Load Matched Coordinate
load pattern19.txt;
load pattern20.txt;
m1 = pattern19';
m2 = pattern20';




%% Make the real R,T relating the left and right cameras
% R,T are built from R1, R2 and T1, T2
RealT = T2-T1; %note: since X_c2 = RealR*X_c1 + (T2 - RealR*T1), RealT = T2 - T1
               %only holds when R1 == R2; this line may be the source of the mismatch
RealR = R2*inv(R1);
RealA=rodrigues(RealR)*180/pi; %the rotation expressed as angles in degrees




%% Make the projection matrices P1 and P2
P1 = K*[eye(3) zeros(3,1)]; %P1 is reference Camera so P1=K[I:O]
P2 = K*[RealR RealT];




%% Make 3D points
%Recover 3D points using P1 and P2
W=[];




%Make the 3D coordinates using the Direct Linear Transformation
%W holds the world coordinates.
for i=1:5
    A=[ m1(1,i)*P1(3,:) - P1(1,:);
        m1(2,i)*P1(3,:) - P1(2,:);
        m2(1,i)*P2(3,:) - P2(1,:);
        m2(2,i)*P2(3,:) - P2(2,:)];
    A(1,:) = A(1,:)/norm(A(1,:));
    A(2,:) = A(2,:)/norm(A(2,:));
    A(3,:) = A(3,:)/norm(A(3,:));
    A(4,:) = A(4,:)/norm(A(4,:));



    [u d v] = svd(A);
    W=[W v(:,4)/v(4,4)];
end




%% Make image points from the 3D points again
% Now, make image coordinates using P1, P2 from the W matrix.
reip1 = P1*W;
% reip1 is the recovered image coordinates
reip1 = [reip1(1,:)./reip1(3,:); reip1(2,:)./reip1(3,:)] %image coordinates recovered from 3D
% m1 is the original image coordinates
m1(:,1:5) %original image coordinates
reip2 = P2*W;
%reip2 is the recovered image coordinates
reip2 = [reip2(1,:)./reip2(3,:); reip2(2,:)./reip2(3,:)] %image coordinates recovered from 3D
%m2 is the original image coordinates
m2(:,1:5) %original image coordinates
-------------------------------------------------------------------------



But why is reip1 != m1 and reip2 != m2?
I cannot figure out the reason.
Please discuss it with me~ ^^
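To isolate step 4, the DLT triangulation itself can be checked on synthetic data: if it recovers a known point exactly, the mismatch more likely comes from the pose composition than from the DLT. This is my own Python/numpy sketch with a made-up K, camera pair, and 3D point, not the posted matlab:

```python
import numpy as np

def triangulate_dlt(P1, P2, m1, m2):
    """Recover a 3D point from two projections via the SVD of the DLT system."""
    A = np.vstack([
        m1[0] * P1[2] - P1[0],
        m1[1] * P1[2] - P1[1],
        m2[0] * P2[2] - P2[0],
        m2[1] * P2[2] - P2[1],
    ])
    A /= np.linalg.norm(A, axis=1, keepdims=True)  # row normalisation, as in the post
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                    # null vector of A
    return X / X[3]               # homogeneous -> (X, Y, Z, 1)

# synthetic setup: identity left camera, right camera translated along x
K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.], [0.], [0.]])])

X_true = np.array([0.3, -0.2, 5.0, 1.0])
m1 = P1 @ X_true; m1 = m1[:2] / m1[2]    # project to the left image
m2 = P2 @ X_true; m2 = m2[:2] / m2[2]    # project to the right image

X_rec = triangulate_dlt(P1, P2, m1, m2)  # should recover X_true exactly
```

With noiseless synthetic projections the recovery is exact, so any residual mismatch on real data points to the inputs (R, T composition, or undistortion) rather than the triangulation.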


(Please excuse my poor English. If you point out my mistakes, I will gladly correct them. Thank you!!)