How to calculate the energy and entropy of an image

12 comments, last by sjhalayka 6 years, 4 months ago

The code to calculate the entropy of a greyscale image:


#include <opencv2/opencv.hpp>
using namespace cv;
#pragma comment(lib, "opencv_world331.lib")

#include <iostream>
#include <vector>
#include <map>
#include <cmath>
using namespace std;

int main(void)
{
    Mat frame = imread("puppets.png", IMREAD_GRAYSCALE);

    if (frame.empty())
    {
        cout << "Error loading image file" << endl;
        return -1;
    }

    // Histogram: count how many pixels have each grey level.
    map<unsigned char, size_t> pixel_map;

    for (int j = 0; j < frame.rows; j++)
        for (int i = 0; i < frame.cols; i++)
            pixel_map[frame.at<unsigned char>(j, i)]++;

    //cout << frame.rows*frame.cols << " " << pixel_map.size() << endl;

    // Shannon entropy of the grey-level distribution, in nats (natural log).
    double entropy = 0;

    for (map<unsigned char, size_t>::const_iterator ci = pixel_map.begin(); ci != pixel_map.end(); ci++)
    {
        double probability = ci->second / static_cast<double>(frame.rows*frame.cols);
        entropy += probability * log(probability);
    }

    entropy = -entropy;

    cout << entropy << endl;

    return 0;
}


I see. So you think of the image as providing you with a probability distribution over colors: pick a random pixel in the image and check its color. Then the entropy of that probability distribution is well defined.
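As a quick sanity check (just an illustration): an image that is exactly half black and half white has p = 0.5 for each of the two grey levels, so its entropy is -(0.5 ln 0.5 + 0.5 ln 0.5) = ln 2 ≈ 0.693 nats, which is exactly 1 bit.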

 

I think we're on the same page, yes. Not sure how my code got mangled so badly, LOL.

I found this one link that shows the relationship between intensity and amplitude squared: http://muchomas.lassp.cornell.edu/p214/Notes/Interference/node6.html
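For the energy half of the question, here's a minimal sketch of what I have in mind, assuming each grey level is read directly as an intensity (i.e. amplitude squared, per that link) -- this is just my reading of it, not code anyone posted in the thread:

#include <opencv2/opencv.hpp>
#include <iostream>
using namespace cv;
using namespace std;

int main(void)
{
    Mat frame = imread("puppets.png", IMREAD_GRAYSCALE);

    if (frame.empty())
        return -1;

    // Sketch only: treat each grey level as an intensity (amplitude squared),
    // so the total "energy" is simply the sum of the pixel values.
    double energy = 0;

    for (int j = 0; j < frame.rows; j++)
        for (int i = 0; i < frame.cols; i++)
            energy += frame.at<unsigned char>(j, i);

    cout << energy << endl;

    return 0;
}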

I use the entropy to get the number of bits needed to classify n equiprobable messages -- ceil(ln(n)/ln(2)), like with a neural network.
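In case it helps anyone reading later, here's a tiny stand-alone example of that arithmetic; the entropy value and n below are placeholder numbers, not values from an actual image:

#include <cmath>
#include <iostream>
using namespace std;

int main(void)
{
    // The entropy computed earlier is in nats (natural log);
    // dividing by ln(2) converts it to bits.
    double entropy_nats = 5.0; // placeholder value
    double entropy_bits = entropy_nats / log(2.0);

    // Bits needed to label n equiprobable messages: ceil(ln(n)/ln(2)).
    size_t n = 10; // placeholder value
    size_t bits_needed = static_cast<size_t>(ceil(log(static_cast<double>(n)) / log(2.0)));

    cout << entropy_bits << " " << bits_needed << endl; // prints 7.21348 4

    return 0;
}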

