
## Recommended Posts

I am new to GLSL and have a simple question: how can I sum all pixel values in the fragment shader?

I want to measure a simple difference between two images in the fragment shader (both images are the output of some image-processing passes written in GLSL). I can compute the per-pixel difference with something like:

```glsl
gl_FragColor.r = abs(img1Color.r - img2Color.r);
gl_FragColor.g = abs(img1Color.g - img2Color.g);
gl_FragColor.b = abs(img1Color.b - img2Color.b);
```

I can then read this "difference" image back to CPU memory and sum all of its pixels on the CPU to get the total difference. But can I compute the summation in the fragment shader instead? I only need the total difference value, and transferring the whole image from GPU to CPU is far more expensive than transferring a single floating-point value.

Thank you, looking forward to hearing from you.
Mutated_Fantasy
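The baseline approach described above (per-pixel absolute difference, then a full sum on the CPU) can be sketched in plain Python. The image layout and function names here are illustrative, not from the post; images are modeled as nested lists of (r, g, b) tuples.

```python
# Hypothetical CPU-side baseline: per-pixel absolute difference, then a full sum.
def abs_diff_image(img1, img2):
    """Mirrors the fragment shader: out = abs(img1 - img2), per channel."""
    return [[tuple(abs(a - b) for a, b in zip(p1, p2))
             for p1, p2 in zip(row1, row2)]
            for row1, row2 in zip(img1, img2)]

def total_difference(diff):
    """The CPU-side summation the post wants to move onto the GPU."""
    return sum(c for row in diff for px in row for c in px)

img1 = [[(0.5, 0.2, 0.1), (0.3, 0.3, 0.3)]]
img2 = [[(0.4, 0.2, 0.3), (0.3, 0.1, 0.3)]]
diff = abs_diff_image(img1, img2)
print(total_difference(diff))  # total difference, approximately 0.5
```

This makes the cost concrete: the entire `diff` image crosses the GPU-to-CPU boundary just to produce one scalar.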

##### Share on other sites
You can. First, upload both images as textures. After that, you can read them in the fragment shader by sampling them; just follow the advice in the link on how to properly make your textures available there. The last step is to bind your shader program and render a quad with texture coordinates chosen so that the fragment shader covers the whole texture, something like:
```cpp
create_textures();
bind_textures();
bind_shader();
// if you've set a projection like glOrtho(-10, 10, -10, 10, -1, 1),
// you would probably render a quad of this size:
glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(-10, -10);
    glTexCoord2f(1, 0); glVertex2f(+10, -10);
    glTexCoord2f(1, 1); glVertex2f(+10, +10);
    glTexCoord2f(0, 1); glVertex2f(-10, +10);
glEnd();
```

Of course you should render the quad without glBegin/glEnd in real code; this is just to show its size.

##### Share on other sites
Thanks, fcoelho

Yes, that is part of the solution. But more importantly, I want to compute the summation of all pixels on the GPU. This seems to run counter to the GPU's usual parallel pipeline, so I hope someone can show a workaround, if one is possible.

Regards,
Mutated_Fantasy

##### Share on other sites
OK, let me offer a preliminary idea for this question, assuming the image size is a power of two.

1. Begin with an N by N image and generate an N/2 by N/2 image by summing each 2x2 block of neighbouring pixels (e.g. sum pixels 1, 2, 3, 4) into one new pixel P(i,j):

```
-------
|1 |2 |
-------
|3 |4 |
-------
```

2. If the N/2 by N/2 image has reached a size of 1 by 1, stop; otherwise repeat step 1.
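The two steps above can be sketched on the CPU in plain Python (a minimal sketch assuming a power-of-two square image of scalar pixels; names are illustrative):

```python
# One reduction pass: halve each dimension by summing 2x2 blocks, as in step 1.
def reduce_once(img):
    n = len(img)
    return [[img[2*i][2*j] + img[2*i][2*j+1] +
             img[2*i+1][2*j] + img[2*i+1][2*j+1]
             for j in range(n // 2)]
            for i in range(n // 2)]

# Step 2: repeat until the image is 1x1, then read out the single pixel.
def reduce_to_scalar(img):
    while len(img) > 1:
        img = reduce_once(img)
    return img[0][0]

img = [[1, 2, 5, 6],
       [3, 4, 7, 8],
       [9, 10, 13, 14],
       [11, 12, 15, 16]]
print(reduce_to_scalar(img))  # prints 136, the sum of 1..16
```

On the GPU, each `reduce_once` pass would be one render into a half-sized render target, with the fragment shader taking four samples from the previous texture.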

The problem with step 1 is: if the computation is parallel, we may simultaneously write to the same P(i,j). Then the value of P(i,j) may be undefined.

How can I get around this?

##### Share on other sites
The idea you propose is a correct solution to this (it's called a "reduction shader"; you can search for that term for more info).

Quote:
 The problem with step 1 is: if the computation is parallel, we may simultaneously write to the same P(i,j). Then the value of P(i,j) may be undefined.

I think you are mistaken here: you won't ever execute the fragment shader more than once on the same target pixel.

Say you start with a 4x4 texture. First you reduce it to 2x2, and then in the next round you reduce it to 1x1.

In the first stage, your render target is only 4 pixels, so the fragment shader will only run 4 times (not 16 times). Each time the fragment shader executes, it takes four samples from the previous image but writes only once to the target.
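This invocation count can be checked with a small CPU simulation of the same 2x2-block reduction (an illustrative sketch, not real shader code): each pass invokes the "fragment shader" once per target pixel, and each invocation reads four source texels but writes exactly one output, so no two invocations ever write the same pixel.

```python
# Count how many times the simulated fragment shader runs for a 4x4 input.
invocations = 0

def fragment_shader(src, i, j):
    """Illustrative stand-in for the reduction fragment shader."""
    global invocations
    invocations += 1
    return src[2*i][2*j] + src[2*i][2*j+1] + src[2*i+1][2*j] + src[2*i+1][2*j+1]

def render_pass(src):
    n = len(src) // 2  # the render target is half the size in each dimension
    return [[fragment_shader(src, i, j) for j in range(n)] for i in range(n)]

tex = [[1] * 4 for _ in range(4)]   # 4x4 texture of ones
tex = render_pass(tex)              # 2x2 target: 4 shader invocations
tex = render_pass(tex)              # 1x1 target: 1 more invocation
print(tex[0][0], invocations)      # prints "16 5": sum of 16 ones, 4 + 1 runs
```

The shader runs 5 times in total (4 + 1), never 16 + 4 times, and each target pixel is written by exactly one invocation, so there is no write conflict.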

##### Share on other sites
Quote:
 In the first stage, your render target is only 4 pixels, so the fragment shader will only run 4 times (not 16 times). Each time the fragment shader executes, it takes four samples from the previous image but writes only once to the target.

Thank you very much, karwosts. Yes: "4 times (not 16 times)"!

I was thinking of it the opposite way, with the shader running 16 times to render 4 pixels. So this is called a reduction shader; next time I'll refer to it by its proper name.

Best regards,
Mutated_Fantasy