# How to calculate the energy and entropy of an image

## Recommended Posts

How does one go about calculating the energy and entropy of an image?

##### Share on other sites

This is not how you use a forum. You can type the question into Google and see what comes up. If that doesn't give you the results you want, then try this forum. For us to give you better answers than Google, you need to give us the appropriate context:

• What are you trying to do?
• What have you tried?
• What problems did you encounter?

I bet we can give you decent answers if you pose the question properly.

##### Share on other sites

Thanks for the guidance alvaro.

I'm trying to get the energy and entropy measurements for an image. They fascinate me.

So far, I've found on google that the per-pixel energy can be considered to be related to the x and y gradients, like: E = \sqrt{g_x^2 + g_y^2}. It reminds me of the potential energy due to gravity.

I assume that just adding the energy of all pixels together gives you the per-image energy.

As for entropy, it is per-image, and I will use a std::map to count the number of pixels there are for each distinct colour, like you would when obtaining a histogram. I have some code to calculate the entropy of a string: https://github.com/sjhalayka/entropy-calculation

I'm just looking for a second opinion on these interpretations of energy and entropy. In fact, I'm asking on behalf of a person who asked this question on http://answers.opencv.org/question/180503/energy-computation-of-dct-of-image/ and who is looking to measure the energy of an image using the DCT.


##### Share on other sites

Energy is usually defined as the integral of the squared amplitude. Parseval's theorem says that you can compute this on the Fourier transform of the signal and you'll get the same answer.

Entropy critically depends on having a probability distribution over images, and then it's simply defined as -log(probability).

##### Share on other sites

Thanks for your comments. Is energy proportional to frequency, like for light?

It's not clear to me how you obtained the value S = -log(probability). I am familiar with the equation S_binary = -sum(p_i ln(p_i)) / ln(2), which simplifies to S_binary = ln(n)/ln(2) when there are n equiprobable states.

The sum of probabilities must equal 1, right?


##### Share on other sites

No, I don't think energy is proportional to frequency. If you want a physical model to think about, think of the power dissipated by a resistor when alternating current is applied to it. You'll see that the power is proportional to the squared amplitude (I^2*R or V^2/R) and independent of the frequency.

It is true that "entropy" is not quite the right word for the quantity -log(probability). The correct term is something like "information content", while "entropy" could be defined as "expected information content". But the notion of entropy only makes sense when applied to a probability distribution, not an individual image. Again, I would need to know where you picked up this term or what you want to use it for in order to give you a better answer.

##### Share on other sites

Thank you for your clarification. How does one calculate the amplitude of a pixel?

I am applying the entropy function to the pixel distribution. To keep count of all the distinct pixel values, I am using a std::map<pixel, size_t>, where pixel is the pixel colour (3-channel BGR or 1-channel greyscale) and size_t is the count.


##### Share on other sites
3 hours ago, alvaro said:

Energy is usually defined as the integral of the squared amplitude. Parseval's theorem says that you can compute this on the Fourier transform of the signal and you'll get the same answer.

While I agree for signals that can be negative (e.g. audio), I'm not sure that's correct for images, since they represent the physical light intensity (which is proportional to energy and can't be negative). I think the energy in an image would be just the sum of the pixel values at each wavelength, since the integral of the light wave's squared amplitude has already been done by the image sensor.


##### Share on other sites

@Aressera -- You make an interesting point, and it's intuitive. The three intensities (photon count per second * photon energy (red)... etc.) are added up to get the total intensity (energy per unit area per unit time). Right?

@alvaro -- Is the amplitude squared the pixel intensity?


##### Share on other sites

Yes, I found on google that intensity is proportional to amplitude squared. It's helpful to know that the intensity (energy per unit area per unit time) is power per unit area.


##### Share on other sites

The code to calculate the entropy of a greyscale image:

```cpp
#include <opencv2/opencv.hpp>
using namespace cv;

#pragma comment(lib, "opencv_world331.lib")

#include <iostream>
#include <map>
using namespace std;

int main(void)
{
	// Load the image as 8-bit greyscale (the filename is just an example).
	Mat frame = imread("image.png", IMREAD_GRAYSCALE);

	if (frame.empty())
	{
		cout << "Could not load image" << endl;
		return -1;
	}

	// Count how many pixels there are of each distinct grey level.
	map<unsigned char, size_t> pixel_map;

	for (int j = 0; j < frame.rows; j++)
		for (int i = 0; i < frame.cols; i++)
			pixel_map[frame.at<unsigned char>(j, i)]++;

	// S = -sum(p_i * ln(p_i)), where p_i is the relative frequency
	// of grey level i in the image.
	double entropy = 0;

	for (map<unsigned char, size_t>::const_iterator ci = pixel_map.begin(); ci != pixel_map.end(); ci++)
	{
		double probability = ci->second / static_cast<double>(frame.rows*frame.cols);
		entropy += probability * log(probability);
	}

	entropy = -entropy;

	cout << entropy << endl;

	return 0;
}
```



##### Share on other sites

I see. So you think of the image as providing you with a probability distribution over colors: pick a random pixel in the image and check its color. Then the entropy of that probability distribution is well defined.

##### Share on other sites

I think we're on the same page, yes. Not sure how my code got mangled so badly LOL.

I found this one link that shows the relationship between intensity and amplitude squared: http://muchomas.lassp.cornell.edu/p214/Notes/Interference/node6.html

I use the entropy to get the number of bits needed to classify n equiprobable messages -- ceil(ln(n)/ln(2)), like with a neural network.

