I have been working with geometry shaders and triangles with adjacency for a number of years.
I've only just hit the case where I want to detect boundaries (-1 in indices 1, 3, 5, given a triangle-with-adjacency primitive with indices 0, 1, 2, 3, 4, 5)...
I'm wondering what the input assembler does with invalid (value -1) indices, and what the best way to detect them would be.
I haven't found anything in the OpenGL or D3D11 docs about this, and I've tried a few tests, including:
- Epsilon comparison distance(input[index0], input[index0+1]) < 1e-6f (i.e., for edge 0, vertices 0 and 1)
- Epsilon comparison distance(input[index2], input[index0+1]) < 1e-6f (i.e., for edge 0, vertices 2 and 1)
- Epsilon comparison distance(input[index0+1], input[index2+1]) < 1e-6f (i.e., for edge 0, vertices 1 and 3)
- Testing the incoming positions at indices 1, 3, 5 for NaN, i.e. input[index0+1].worldPosition != input[index0+1].worldPosition, or isnan(input[index0+1].x)
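For concreteness, here is a CPU-side mirror of those checks in plain C++ (a sketch only; the `Vertex` struct, the `dist` helper, and the 6-element adjacency layout are my own naming, standing in for the HLSL):

```cpp
#include <array>
#include <cmath>

// Stand-in for the vertex data arriving at the geometry shader.
struct Vertex { float x, y, z; };

static float dist(const Vertex& a, const Vertex& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// In a triangle-with-adjacency primitive, input[0], input[2], input[4]
// are the triangle's corners; input[1], input[3], input[5] are the
// adjacent vertices. Returns true if edge 0 (corners 0 and 2,
// adjacency vertex 1) looks like a boundary under the tests above.
bool edge0LooksLikeBoundary(const std::array<Vertex, 6>& input) {
    const float eps = 1e-6f;
    bool coincidentWithCorner0 = dist(input[0], input[1]) < eps;
    bool coincidentWithCorner2 = dist(input[2], input[1]) < eps;
    bool adjacencyNaN = std::isnan(input[1].x);  // NaN != NaN also works
    return coincidentWithCorner0 || coincidentWithCorner2 || adjacencyNaN;
}
```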
None of these tests fires for what should be boundary edges.
Obviously, I can scan my adjacency buffer on the CPU side for -1 and set invalid adjacency indices to [index-1], but I'm now kinda curious what the input assembler is actually doing.
Does anyone know how to test for this?