Deep learning study - cross entropy #5


The overall flow of deep learning is as described in the previous posts.
In the final step, cross entropy compares the classification result (the softmax output) with the label values.



S and L are vectors: S is the softmax output and L is the label, the final result determined by the person.
The D function is the distance between them.
A smaller distance means the classification result is closer to being correct.
Thus, as the W and b parameters are adjusted during training, the D value should become smaller.
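For reference, the cross entropy distance used in the tests below is

D(S, L) = -sum_i( L_i * log(S_i) )

which is exactly what -sum( L.*log(S) ) computes in MATLAB.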

Here is a MATLAB test.

case 1.
S = [0.7 0.2 0.1];
L = [1.0 0 0];
- sum( L.*log(S) )

=>  0.3567

case 2.
S = [0.7 0.2 0.1];
L = [0 0 1.0];
- sum( L.*log(S) )

=>  2.3026



In case 1, the softmax output is close to the label, so the distance is smaller than in case 2. In case 2 the label points to the class with the smallest probability, so D = -log(0.1) = 2.3026.
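To show the whole final step in one piece, here is a minimal MATLAB sketch that first computes the softmax output S from raw scores and then the same cross entropy distance. The score values are made up for illustration only; they are not from the cases above.

scores = [2.0 1.0 0.1];            % raw network outputs (logits) for 3 classes, illustrative values
S = exp(scores) ./ sum(exp(scores));   % softmax turns the scores into the probability vector S
L = [1.0 0 0];                     % one-hot label determined by the person
D = -sum( L .* log(S) )            % cross entropy distance D(S, L)

Training adjusts W and b so that the scores move toward the labeled class, which makes D smaller.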


