
# How to calculate the energy and entropy of an image

## Recommended Posts

How does one go about calculating the energy and entropy of an image?


This is not how to use a forum. You can enter the question into Google and see what comes up. If that doesn't give you the results you want, then try this forum. For us to give you better answers than Google can, you need to provide the appropriate context:

• What are you trying to do?
• What have you tried?
• What problems did you encounter?
• Where did you read about these concepts?

I bet we can give you decent answers if you pose the question properly.


Thanks for the guidance, alvaro.

I'm trying to get the energy and entropy measurements for an image. They fascinate me.

So far, I've found on Google that the per-pixel energy can be considered to be the gradient magnitude, E = sqrt(g_x^2 + g_y^2), where g_x and g_y are the image gradients along x and y. It reminds me of the potential energy due to gravity.

I assume that just adding the energy of all pixels together gives you the per-image energy.
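Here's a minimal sketch of what I mean, assuming the image is greyscale and stored row-major as floats (central differences for the gradients, border pixels skipped for simplicity):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Per-pixel energy E = sqrt(g_x^2 + g_y^2) from central-difference
// gradients, summed over all interior pixels to give a per-image total.
double image_energy(const std::vector<float> &img, size_t w, size_t h)
{
	double total = 0.0;

	for (size_t y = 1; y + 1 < h; y++)
	{
		for (size_t x = 1; x + 1 < w; x++)
		{
			const double gx = 0.5 * (img[y*w + x + 1] - img[y*w + x - 1]);
			const double gy = 0.5 * (img[(y + 1)*w + x] - img[(y - 1)*w + x]);

			total += sqrt(gx*gx + gy*gy);
		}
	}

	return total;
}
```

A flat image gives zero energy, and a ramp of slope 1 gives one unit of energy per interior pixel, which matches my intuition.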

As for entropy, it is per-image, and I will use a std::map to count the number of pixels for each distinct colour, like you would when obtaining a histogram. I have some code that calculates the entropy of a string: https://github.com/sjhalayka/entropy-calculation
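For instance, a sketch for 8-bit greyscale pixels, using the same histogram-then-entropy approach (entropy in bits):

```cpp
#include <cmath>
#include <cstddef>
#include <map>
#include <vector>

// Shannon entropy in bits: S = -sum(p_i * log2(p_i)), where p_i is the
// fraction of pixels that have the i-th distinct value.
double image_entropy(const std::vector<unsigned char> &pixels)
{
	std::map<unsigned char, size_t> counts;

	for (unsigned char p : pixels)
		counts[p]++;

	const double n = static_cast<double>(pixels.size());
	double entropy = 0.0;

	for (const auto &kv : counts)
	{
		const double p_i = kv.second / n;
		entropy -= p_i * log2(p_i);
	}

	return entropy;
}
```

A single-colour image gives 0 bits, and an image split evenly between two colours gives 1 bit.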

I'm just looking for a second opinion on these interpretations of energy and entropy. In fact, I'm asking on behalf of someone who asked this question on http://answers.opencv.org/question/180503/energy-computation-of-dct-of-image/ and who is looking to measure the energy of an image using the DCT.

Edited by sjhalayka


Energy is usually defined as the integral of the squared amplitude. Parseval's theorem says that you can compute this on the Fourier transform of the signal and you'll get the same answer.

Entropy critically depends on having a probability distribution over images, and then it's simply defined as -log(probability).
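As a quick numerical sanity check of Parseval's theorem, here is a sketch using a naive O(N^2) DFT (just a demonstration, not production code): the time-domain energy sum(x[n]^2) should match (1/N) * sum(|X[k]|^2).

```cpp
#include <cmath>
#include <complex>
#include <cstddef>
#include <vector>

// Energy computed directly in the signal domain.
double signal_energy(const std::vector<double> &x)
{
	double e = 0.0;

	for (double v : x)
		e += v*v;

	return e;
}

// Energy computed from the DFT; by Parseval's theorem it must agree.
double spectral_energy(const std::vector<double> &x)
{
	const size_t N = x.size();
	const double pi = acos(-1.0);
	double e = 0.0;

	for (size_t k = 0; k < N; k++)
	{
		std::complex<double> X(0.0, 0.0);

		// Naive DFT: X[k] = sum_n x[n] * exp(-2*pi*i*k*n/N)
		for (size_t n = 0; n < N; n++)
		{
			const double angle = -2.0 * pi * double(k) * double(n) / double(N);
			X += x[n] * std::complex<double>(cos(angle), sin(angle));
		}

		e += std::norm(X); // std::norm gives |X[k]|^2
	}

	return e / double(N);
}
```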


Thanks for your comments. Is energy proportional to frequency, like for light?

It's not clear to me how you obtained the value S = -log(probability). I am familiar with the equation S_binary = -sum(p_i ln(p_i)) / ln(2), which simplifies to S_binary = ln(n)/ln(2) when all n states are equiprobable.

The sum of probabilities must equal 1, right?
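The reduction to log2(n) is easy to check numerically, e.g. for n = 8 equiprobable states (a sketch):

```cpp
#include <cmath>
#include <vector>

// Binary entropy S = -sum(p_i * log2(p_i)); for n equiprobable states
// (p_i = 1/n) this reduces to log2(n) = ln(n)/ln(2).
double binary_entropy(const std::vector<double> &p)
{
	double s = 0.0;

	for (double p_i : p)
		if (p_i > 0.0)
			s -= p_i * log2(p_i);

	return s;
}
```

With n = 8 and p_i = 1/8 everywhere, this gives 3 bits, i.e. log2(8).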

Edited by sjhalayka


No, I don't think energy is proportional to frequency. If you want a physical model to think about, think of the power dissipated by a resistor when alternating current is applied to it. You'll see that the power is proportional to the squared amplitude (I^2*R or V^2/R) and independent of the frequency.
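This frequency independence is easy to check numerically. A sketch (assuming an integer frequency so whole periods fit in the unit sampling window):

```cpp
#include <cmath>

// Mean power of a sine of amplitude A sampled over whole periods is
// A^2 / 2, regardless of the frequency (the resistor analogy:
// power is proportional to the squared amplitude).
double mean_power(double amplitude, int freq_hz, int num_samples)
{
	const double pi = acos(-1.0);
	double sum = 0.0;

	for (int n = 0; n < num_samples; n++)
	{
		const double t = double(n) / double(num_samples); // t in [0, 1)
		const double v = amplitude * sin(2.0 * pi * freq_hz * t);

		sum += v*v;
	}

	return sum / double(num_samples);
}
```

With amplitude 2, both 3 Hz and 17 Hz give a mean power of 2 (= A^2/2).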

It is true that "entropy" is not quite the right word for the quantity -log(probability). The correct term is something like "information content", while "entropy" could be defined as "expected information content". But the notion of entropy only makes sense when applied to a probability distribution, not an individual image. Again, I would need to know where you picked up this term or what you want to use it for in order to give you a better answer.


Thank you for your clarification. How does one calculate the amplitude of a pixel?

I am applying the entropy function to the pixel distribution. To keep count of all the distinct pixel values, I am using a std::map<pixel, size_t>, where pixel is the pixel colour (be it 3-channel BGR or 1-channel greyscale) and size_t is the count.

Edited by sjhalayka

3 hours ago, alvaro said:

> Energy is usually defined as the integral of the squared amplitude. Parseval's theorem says that you can compute this on the Fourier transform of the signal and you'll get the same answer.

While I agree for signals that can be negative (e.g. audio), I'm not sure that's correct for images, since they represent the physical light intensity (which is proportional to energy and can't be negative). I think the energy in an image would be just the sum of the pixel values at each wavelength, since the integral of the light wave's squared amplitude has already been done by the image sensor.

Edited by Aressera


@Aressera -- You make an interesting point, and it's intuitive. The three intensities (photon count per second times photon energy, for each of red, green, and blue) are added up to get the total intensity (energy per unit area per unit time). Right?

@alvaro -- Is the amplitude squared the pixel intensity?

Edited by sjhalayka


Yes, I found on Google that intensity is proportional to amplitude squared. It's helpful to know that the intensity (energy per unit area per unit time) is power per unit area.

Edited by sjhalayka
