SOLVED: Compute shader atomic shared variable problem


SOLUTION: Ugh. Don't do

minDepth = atomicMin(minDepth, depth);

Just do

atomicMin(minDepth, depth);

The assignment breaks the atomicity.

I was doing some experiments on tiled deferred shading and ended up with a very strange problem. I tried to compute the minimum and maximum depth of each tile using two shared uints.

shared uint minDepth = 0xFFFFFFFF;
shared uint maxDepth = 0;


The idea was to compute the maximum and minimum depth using atomicMin() and atomicMax() like this:

uint depth = ...;
minDepth = atomicMin(minDepth, depth); //DON'T DO THIS, SEE ABOVE
maxDepth = atomicMax(maxDepth, depth); //DON'T DO THIS, SEE ABOVE
barrier();


However, this is not working correctly. I seem to be getting synchronization problems, since the result flickers a lot even though the camera is not moving and the depth buffer is therefore identical every frame. barrier() has no effect at all on the result.

I fail to see how this could possibly happen. The atomic min and max functions do not seem to work correctly, with some pixels being randomly ignored. For testing, I decided to output the following:

if(minDepth > d){
result = vec3(5, 0, 0);
}else{
result = vec3(0);
}


In other words, if the depth of the current pixel is less than the calculated minimum depth of the tile, make it bright red. Here's the horrifying flickering result:

What am I doing wrong? How can this possibly happen when I'm using barrier()? Why isn't barrier() or memoryBarrierShared() working as it should?

SOLVED, see above.


Check the reference on atomicMin(): http://www.opengl.org/sdk/docs/man/

It states: "performs an atomic comparison of data to the contents of mem, writes the minimum value into mem and returns the original contents of mem from before the comparison occurred".

The second part is the interesting one in your case. You are passing minDepth as the 'mem' parameter, so AFTER the function returns, minDepth already holds the minimum. But immediately afterwards you assign minDepth the value it held BEFORE the function call. I guess this is what breaks the synchronized behaviour.

Why are you assigning the value anyway?

My guess is: leave out the assignment and it will work just fine. (At least my implementation does, and it looks pretty much the same as the parts you showed me here.)

Unfortunately I'm terrible at reading posts... at least I did some explaining of the error :)

Edited by Wh0p


SOLUTION: Ugh. Don't do
minDepth = atomicMin(minDepth, depth);
Just do
atomicMin(minDepth, depth);
The assignment breaks the atomicity.

I think the problem is that atomicMin() returns the original value of the variable. If depth is less than minDepth, it sets minDepth to depth, but then returns the original value of minDepth. The problem isn't the atomic operation itself but the assignment, which undoes the result of the comparison.


Yes, I missed the part in the specification that said that the variable you pass into the atomic functions (in my case minDepth and maxDepth) are inout, not just in. It all made sense once I got that. >_>

Edited by theagentd
