On a Mac, Python comes preinstalled. In the terminal, type 'python -V' and the installed version will be shown.
1/16/2014
1/15/2014
(OpenCV Stitching) matchesGraphAsString function
"matchesGraphAsString " is useful function to see the stitching relationship graph.
The function returns the result as a string (the match graph in DOT format, so the graph.txt file written below can also be rendered with Graphviz).
As the figure shows, m6-m7-m8-m9 are linked, while S1 and S6 are not correlated with that group.
You have to obtain pairwise_matches before using the function. The matching method is introduced here:
CPU version -> http://feelmare.blogspot.kr/2013/12/finding-largest-subset-images-that-is.html
GPU version -> http://feelmare.blogspot.kr/search?updated-max=2014-01-14T00:52:00-08:00&max-results=2&start=2&by-date=false
After getting pairwise_matches, you can get the stitching grouping result with this example source code.
...
float conf_thresh = 1.0;

vector< cv::String > img_names;
img_names.push_back("m7.jpg");
img_names.push_back("S1.jpg");
img_names.push_back("m9.jpg");
img_names.push_back("m6.jpg");
img_names.push_back("S6.jpg");
img_names.push_back("m8.jpg");

ofstream f("graph.txt");
f << detail::matchesGraphAsString(img_names, pairwise_matches, conf_thresh);
---
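For completeness, here is a minimal CPU-side sketch of how pairwise_matches could be produced before calling matchesGraphAsString. This is only an illustration, assuming OpenCV 2.4.x with the opencv_stitching and opencv_nonfree modules linked (the image names are the ones used in this post); see the linked CPU/GPU posts above for the full versions.

#include <opencv2/opencv.hpp>
#include <opencv2/stitching/stitcher.hpp>
#include <fstream>
#include <string>
#include <vector>

using namespace cv;
using namespace std;

int main()
{
    vector<string> img_names;
    img_names.push_back("m7.jpg");
    img_names.push_back("S1.jpg");
    img_names.push_back("m9.jpg");
    img_names.push_back("m6.jpg");
    img_names.push_back("S6.jpg");
    img_names.push_back("m8.jpg");

    // find SURF features in every image
    vector<detail::ImageFeatures> features(img_names.size());
    detail::SurfFeaturesFinder finder;
    for (size_t i = 0; i < img_names.size(); ++i)
    {
        Mat img = imread(img_names[i]);
        finder(img, features[i]);
        features[i].img_idx = (int)i;
    }
    finder.collectGarbage();

    // match every image pair on the CPU
    vector<detail::MatchesInfo> pairwise_matches;
    detail::BestOf2NearestMatcher matcher(false);
    matcher(features, pairwise_matches);
    matcher.collectGarbage();

    // write the match graph (DOT format) to a file
    float conf_thresh = 1.0f;
    ofstream f("graph.txt");
    f << detail::matchesGraphAsString(img_names, pairwise_matches, conf_thresh);
    return 0;
}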
(Math) Mathematical symbols, let's remember~ 0 ≦ t < 1 --> t ∈ [0,1)
0 ≦ t < 1
-> t ∈ [0,1)
don't forget~!!
1/14/2014
(Arduino Study) LED on/off using a piezo speaker as a knock sensor.
source code
...
const int ledPin = 13;        // built-in LED
const int knockSensor = A0;   // piezo connected to analog pin 0
const int threshold = 100;    // sensor value that counts as a knock

int sensorReading = 0;
int ledState = LOW;

void setup(){
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);
}

void loop(){
  sensorReading = analogRead(knockSensor);
  Serial.println(sensorReading);

  if(sensorReading >= threshold)
  {
    // toggle the LED on every knock
    ledState = !ledState;
    digitalWrite(ledPin, ledState);
    Serial.println("Knock!");
  }
  delay(100);
}
---
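For reference, the circuit here is the usual one from the standard Arduino "Knock" tutorial: the piezo element sits between A0 and GND with a resistor in parallel (the tutorial uses 1 MΩ) to bleed off the voltage spike, and the threshold value usually needs tuning for your particular piezo.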
(OpenCV, MatchesInfo) MatchesInfo includes correlation information between matched images.
After running the BestOf2NearestMatcher function, we can see the correlation (confidence) value between matched images.
MatchesInfo has the following elements:
struct CV_EXPORTS MatchesInfo
{
    MatchesInfo();
    MatchesInfo(const MatchesInfo &other);
    const MatchesInfo& operator =(const MatchesInfo &other);

    int src_img_idx, dst_img_idx;       // Images indices (optional)
    std::vector<DMatch> matches;
    std::vector<uchar> inliers_mask;    // Geometrically consistent matches mask
    int num_inliers;                    // Number of geometrically consistent matches
    Mat H;                              // Estimated homography
    double confidence;                  // Confidence two images are from the same panorama
};
http://feelmare.blogspot.kr/2013/12/finding-largest-subset-images-that-is.html -> here you can see a feature finding and matching example source.
We can see the correlation value with the source code below.
...
printf("pairwise_matches %d \n", pairwise_matches.size() ); for(int i=0; i < pairwise_matches.size(); ++i) { printf("%d \n", i ); printf("%d -> %d \n", pairwise_matches[i].src_img_idx, pairwise_matches[i].dst_img_idx ); printf("num inliers = %d\n", pairwise_matches[i].num_inliers); cout << "H " << pairwise_matches[i].H << endl; printf("confidence = %lf \n", pairwise_matches[i].confidence ); printf("---\n"); }---
Here, the confidence value is calculated by:
...
// These coeffs are from paper M. Brown and D. Lowe. "Automatic Panoramic Image Stitching
// using Invariant Features"
matches_info.confidence = matches_info.num_inliers / (8 + 0.3 * matches_info.matches.size());

// Set zero confidence to remove matches between too close images, as they don't provide
// additional information anyway. The threshold was set experimentally.
matches_info.confidence = matches_info.confidence > 3. ? 0. : matches_info.confidence;
---
If the confidence value is lower than 1, we consider the images to be unrelated (no overlapping area).
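As a quick sanity check of the formula (the numbers below are made up, not from a real image pair): with 100 raw matches and 60 geometrically consistent inliers, the confidence is 60 / (8 + 0.3 * 100) = 60 / 38 ≈ 1.58, so that pair would be treated as overlapping.

#include <cstdio>

int main()
{
    // hypothetical counts, just to evaluate the confidence formula above
    int num_inliers = 60;
    int num_matches = 100;
    double confidence = num_inliers / (8 + 0.3 * num_matches);
    printf("confidence = %lf \n", confidence);   // about 1.58, i.e. > 1, so "related"
    return 0;
}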
1/13/2014
(OpenCV GPU) Finding the largest subset of images that are only adjacent (subsequent) images (OpenCV SurfFeaturesFinder, BestOf2NearestMatcher, leaveBiggestComponent functions example source code)
This is the GPU version of this page -> http://feelmare.blogspot.kr/2013/12/finding-largest-subset-images-that-is.html
Please refer to that page for the detailed description.
The GPU mode ran about 10 times faster than the CPU version.
I think GPU programming is essential in computer vision, unless you are constrained by the performance of your equipment.
...
#include <stdio.h>
#include <opencv2\opencv.hpp>
#include <opencv2\features2d\features2d.hpp>
#include <opencv2\nonfree\features2d.hpp>
#include <opencv2\stitching\detail\matchers.hpp>
#include <opencv2\stitching\stitcher.hpp>

#ifdef _DEBUG
#pragma comment(lib, "opencv_core247d.lib")
//#pragma comment(lib, "opencv_imgproc247d.lib") //MAT processing
//#pragma comment(lib, "opencv_objdetect247d.lib")
#pragma comment(lib, "opencv_gpu247d.lib")
#pragma comment(lib, "opencv_features2d247d.lib")
#pragma comment(lib, "opencv_highgui247d.lib")
//#pragma comment(lib, "opencv_ml247d.lib")
#pragma comment(lib, "opencv_stitching247d.lib")
#pragma comment(lib, "opencv_nonfree247d.lib")
#else
#pragma comment(lib, "opencv_core247.lib")
//#pragma comment(lib, "opencv_imgproc247.lib")
//#pragma comment(lib, "opencv_objdetect247.lib")
#pragma comment(lib, "opencv_gpu247.lib")
#pragma comment(lib, "opencv_features2d247.lib")
#pragma comment(lib, "opencv_highgui247.lib")
//#pragma comment(lib, "opencv_ml247.lib")
#pragma comment(lib, "opencv_stitching247.lib")
#pragma comment(lib, "opencv_nonfree247.lib")
#endif

using namespace cv;
using namespace std;

void main()
{
    // processing time measurement
    unsigned long AAtime = 0, BBtime = 0;
    AAtime = getTickCount();

    vector<Mat> vImg;
    Mat rImg;
    vImg.push_back(imread("./stitching_img/m7.jpg", 0));
    vImg.push_back(imread("./stitching_img/S1.jpg", 0));
    vImg.push_back(imread("./stitching_img/m9.jpg", 0));
    vImg.push_back(imread("./stitching_img/m6.jpg", 0));
    vImg.push_back(imread("./stitching_img/S6.jpg", 0));
    vImg.push_back(imread("./stitching_img/m8.jpg", 0));

    // feature extraction (SURF on the GPU)
    gpu::SURF_GPU FeatureFinder_gpu(400);
    gpu::GpuMat inImg_g;
    gpu::GpuMat src_keypoints_gpu, src_descriptors_gpu;
    vector<cv::KeyPoint> src_keypoints;
    vector<float> src_descriptors;
    vector<detail::ImageFeatures> features;

    for(int i = 0; i < vImg.size(); ++i)
    {
        detail::ImageFeatures F;

        inImg_g.upload(vImg[i]);
        FeatureFinder_gpu(inImg_g, gpu::GpuMat(), src_keypoints_gpu, src_descriptors_gpu, false);

        // download keypoints and descriptors from the GPU
        FeatureFinder_gpu.downloadKeypoints(src_keypoints_gpu, src_keypoints);
        FeatureFinder_gpu.downloadDescriptors(src_descriptors_gpu, src_descriptors);

        // make ImageFeatures
        F.img_idx = i;
        F.img_size = Size(vImg[i].cols, vImg[i].rows);
        F.keypoints = src_keypoints;
        Mat M = Mat(src_descriptors.size()/64.0, 64, CV_32FC1);
        F.descriptors = M;
        memcpy(M.data, src_descriptors.data(), src_descriptors.size()*sizeof(float));

        // add to the feature list
        features.push_back(F);

        // confirm the data
        printf("%d - key:%d \n", features[i].img_idx, features[i].keypoints.size());
        printf(" des:cols:%d, rows:%d \n", features[i].descriptors.cols, features[i].descriptors.rows);
    }

    // match
    vector<int> indices_;
    double conf_thresh_ = 1.0;
    Mat matching_mask;
    vector<detail::MatchesInfo> pairwise_matches;
    detail::BestOf2NearestMatcher Matcher(true);
    Matcher(features, pairwise_matches, matching_mask);
    Matcher.collectGarbage();

    // grouping
    indices_ = detail::leaveBiggestComponent(features, pairwise_matches, (float)conf_thresh_);
    Matcher.collectGarbage();

    for (size_t i = 0; i < indices_.size(); ++i)
    {
        printf("%d \n", indices_[i]);
    }

    // processing time measurement
    BBtime = getTickCount();
    printf("Processing time = %.2lf(sec) \n", (BBtime - AAtime)/getTickFrequency());
}
(Arduino Study) temperature sensing
Temperature sensing using a thermistor
A thermistor's resistance changes depending on the temperature.
...
#include <math.h>

void setup(void)
{
  Serial.begin(9600);
}

// convert the raw ADC reading to degrees Celsius (Steinhart-Hart approximation)
double Thermister(int RawADC)
{
  double Temp;
  Temp = log(((10240000/RawADC) - 10000));
  Temp = 1 / (0.001129148 + (0.000234125 * Temp) + (0.0000000876741 * Temp * Temp * Temp));
  Temp = Temp - 273.15;   // Kelvin to Celsius
  return Temp;
}

void printTemp(void)
{
  double temp = Thermister(analogRead(0));   // read sensor value
  Serial.println("Temperature is:");
  Serial.println(temp);
}

void loop(void)
{
  printTemp();
  delay(1000);
}
---
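For reference, the constants in Thermister() can be read as follows (this is my interpretation, assuming a 10-bit ADC and a 10 kΩ fixed resistor with the thermistor on the Vcc side of the divider): the voltage divider gives R_thermistor = 10000 * (1024/ADC - 1) = 10240000/ADC - 10000, which is exactly the expression inside log(); the three coefficients are the usual Steinhart-Hart values for a 10 kΩ NTC thermistor, so 1/T[K] = A + B*ln(R) + C*(ln R)^3, and subtracting 273.15 converts Kelvin to Celsius.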
(OpenCV, data type change, copy) vector to Mat, Mat to vector
This post is about how to copy Mat data to a vector and vector data to a Mat.
Refer to this example source code.
printf("/////////////////////////////////////////////////////////////\n"); printf("//vector to Mat\n"); int r=3; int c=4; vector< float> Vf; //insert value int cnt=0; for(int i=0; i< c; ++i) for(int j=0; j< r; ++j) Vf.push_back(cnt++); //create Mat Mat M=Mat(r,c,CV_32FC1); //copy vector to mat memcpy(M.data,Vf.data(),Vf.size()*sizeof(float)); //print Mat cout < < M < < endl; printf("/////////////////////////////////////////////////////////////\n"); printf("//Mat to vector\n"); vector< float> Vf2; //copy mat to vector Vf2.assign((float*)M.datastart, (float*)M.dataend); //confirm cnt=0; for(int i=0; i< c; ++i) { for(int j=0; j< r; ++j) printf("%lf ", Vf2[cnt++]); printf("\n"); }
--
If you want to copy an image buffer to a Mat, see the example source code on this page -> http://feelmare.blogspot.kr/2014/01/opencv-mat-class-image-bufferpoint-copy.html
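For a rough idea of what that page covers, here is a minimal sketch (the buffer, width, and height are made up, standing in for data coming from some camera or decoder API):

#include <opencv2/opencv.hpp>
#include <cstring>
#include <vector>

using namespace cv;

int main()
{
    // pretend this is an 8-bit BGR image buffer delivered by another API
    int width = 320, height = 240;
    std::vector<unsigned char> buffer(width * height * 3, 128);

    // copy the buffer into a Mat that owns its own memory
    Mat img(height, width, CV_8UC3);
    memcpy(img.data, &buffer[0], buffer.size());

    // or wrap the buffer without copying (the Mat does not own the memory)
    Mat view(height, width, CV_8UC3, &buffer[0]);

    imshow("copied buffer", img);
    waitKey(0);
    return 0;
}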