Domain vs Geometry Shader

6 comments, last by Spazzarama 9 years, 7 months ago

Hi, first sorry for my poor English, but here it goes. I recently started with tessellation shaders, to be able to render a big terrain with easy culling and LOD implementation, but I'm facing some issues. Right now this is what I am doing:

1. Draw call of a flat mesh of X*X quads.

2. The vertex shader does nothing; it only passes the information on to the hull shader.

3. The hull shader picks a tessellation factor depending on distance, and it also does frustum culling by checking whether the vertex is inside the frustum.

4. The domain shader uses a displacement map (heightmap) to move the new vertices to their "y" position.
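Step 3 above can be sketched in HLSL roughly like this (an illustrative sketch only; all names here, such as CameraPosition, MaxDistance and MaxTessFactor, are hypothetical, not the poster's actual code):

```hlsl
// Sketch: distance-based tessellation factor in the hull shader's patch
// constant function. CameraPosition, MaxDistance and MaxTessFactor are
// hypothetical constant-buffer values.
struct PatchConstants
{
    float EdgeTess[4]   : SV_TessFactor;
    float InsideTess[2] : SV_InsideTessFactor;
};

PatchConstants ComputePatchConstants(InputPatch<HullInput, 4> patch)
{
    PatchConstants output;

    // Use the distance from the camera to the patch centre to drive LOD
    float3 centre = (patch[0].PositionW + patch[1].PositionW +
                     patch[2].PositionW + patch[3].PositionW) * 0.25;
    float dist = distance(centre, CameraPosition);

    // Nearer patches get more subdivision, clamped to [1, MaxTessFactor]
    float tess = clamp(MaxTessFactor * (1.0 - dist / MaxDistance),
                       1.0, MaxTessFactor);

    output.EdgeTess[0] = output.EdgeTess[1] =
    output.EdgeTess[2] = output.EdgeTess[3] = tess;
    output.InsideTess[0] = output.InsideTess[1] = tess;
    return output;
}
```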

I still need to calculate normals, tangents and binormals (or bitangents, whatever you call them) to be able to apply lighting in the pixel shader, and I also want to implement backface culling and maybe occlusion culling using normals.

I tried adding a geometry shader stage, which right now does nothing but pass the information to the pixel shader, no calculations, but this still causes a big drop in FPS. I don't want to think what will happen if I add the normal calculations there.

I'm not sure what I should do or what I'm doing wrong. Should I calculate normals in the domain shader or the geometry shader, or what would be a nice solution? Thank you.

Define 'big drop in fps'?

You should definitely try to calculate your normals etc. in the domain shader. This is the third and final stage of the optional tessellation stages and is specifically used to calculate the final vertex position and data of the subdivided point. Because you are using the tessellation pipeline, the domain shader is going to be called no matter what, whereas the geometry shader stage is still optional and will incur additional cost (even for an empty shader).

Depending on the domain (tri or quad) you may need to use barycentric, bilinear or bicubic interpolation to determine the correct values.

You will want to implement backface culling and/or dynamic LoD within the hull shader.
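One common approach for that culling is to return a tessellation factor of zero (or less) from the hull shader's patch constant function, which makes the tessellator discard the patch entirely. A minimal sketch, assuming hypothetical helper functions (IsOutsideFrustum, IsBackFacing) you would implement against your own camera data:

```hlsl
// Sketch: patch constant function that culls a patch by zeroing the
// tessellation factors. IsOutsideFrustum and IsBackFacing are hypothetical
// helpers implemented against your own camera/frustum data.
HS_QuadPatchConstant PatchConstantCulling(InputPatch<HullShaderInput, 4> patch)
{
    HS_QuadPatchConstant output = (HS_QuadPatchConstant)0;

    if (IsOutsideFrustum(patch) || IsBackFacing(patch))
    {
        // A tessellation factor <= 0 tells the fixed-function tessellator
        // to discard the patch entirely.
        output.EdgeTessFactor[0] = output.EdgeTessFactor[1] =
        output.EdgeTessFactor[2] = output.EdgeTessFactor[3] = 0;
        output.InsideTessFactor[0] = output.InsideTessFactor[1] = 0;
        return output;
    }

    // ...otherwise compute the distance-based factors as usual
    return output;
}
```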

Below is an example domain shader taken from Chapter 5, Applying Hardware Tessellation, of my book Direct3D Rendering Cookbook. It uses bilinear interpolation and a combination of patch and constant data for the inputs:


// This domain shader applies control point weighting with bilinear interpolation using the SV_DomainLocation
[domain("quad")]
PixelShaderInput DS_Quads(HS_QuadPatchConstant constantData,
                          const OutputPatch<DS_ControlPointInput, 4> patch,
                          float2 uv : SV_DomainLocation)
{
    PixelShaderInput result = (PixelShaderInput)0;

    // Interpolate using bilerp
    float4 c[4];
    float3 p[4];
    [unroll]
    for(uint i=0;i<4;i++) {
        p[i] = patch[i].Position;
        c[i] = patch[i].Diffuse;
    }
    float3 position = Bilerp(p, uv);
    float2 UV = Bilerp(constantData.TextureUV, uv);
    float4 diffuse = Bilerp(c, uv);
    float3 normal = Bilerp(constantData.NormalW, uv);

    // Prepare pixel shader input:
    // Transform world position to clip space
    result.PositionV = mul( float4(position,1), ViewProjection );
    result.Diffuse = diffuse;
    result.UV = UV;
    result.NormalW = normal;
    result.PositionW = position;
    
    return result;
}
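For the terrain case in the original question, the height displacement would also happen in this shader, right after the position is interpolated. A hedged sketch of that addition (HeightMap, LinearSampler and HeightScale are illustrative names, not from the book):

```hlsl
// Sketch: displace the interpolated vertex along Y using a height map.
// Note SampleLevel rather than Sample: the domain shader has no
// screen-space derivatives, so an explicit mip level is required.
float height = HeightMap.SampleLevel(LinearSampler, UV, 0).r;
position.y += height * HeightScale;
```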

And here are the bilinear interpolation overloads for float2, float3 and float4 values used by the simple quad domain:


//*********************************************************
// QUAD bilinear interpolation
float2 Bilerp(float2 v[4], float2 uv)
{
    // bilerp the float2 values
    float2 side1 = lerp( v[0], v[1], uv.x );
    float2 side2 = lerp( v[3], v[2], uv.x );
    float2 result = lerp( side1, side2, uv.y );
	
    return result;    
}

float3 Bilerp(float3 v[4], float2 uv)
{
    // bilerp the float3 values
    float3 side1 = lerp( v[0], v[1], uv.x );
    float3 side2 = lerp( v[3], v[2], uv.x );
    float3 result = lerp( side1, side2, uv.y );
	
    return result;    
}

float4 Bilerp(float4 v[4], float2 uv)
{
    // bilerp the float4 values
    float4 side1 = lerp( v[0], v[1], uv.x );
    float4 side2 = lerp( v[3], v[2], uv.x );
    float4 result = lerp( side1, side2, uv.y );
	
    return result;    
}


For tri domains you would use barycentric interpolation - something like the following:


//*********************************************************
// TRIANGLE interpolation (using Barycentric coordinates)
/*
    barycentric.xyz == uvw
    w=1-u-v
    P=w*A+u*B+v*C
  C ______________ B
    \.    w    . /
     \  .    .  / 
      \    P   /
       \u  . v/
        \  . /
         \ ./
          \/
          A
*/
float2 BarycentricInterpolate(float2 v0, float2 v1, float2 v2, float3 barycentric)
{
    return barycentric.z * v0 + barycentric.x * v1 + barycentric.y * v2;
}

float3 BarycentricInterpolate(float3 v0, float3 v1, float3 v2, float3 barycentric)
{
    return barycentric.z * v0 + barycentric.x * v1 + barycentric.y * v2;
}

float4 BarycentricInterpolate(float4 v0, float4 v1, float4 v2, float3 barycentric)
{
    return barycentric.z * v0 + barycentric.x * v1 + barycentric.y * v2;
}
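For completeness, a tri-domain shader would feed SV_DomainLocation straight into these overloads, something like the following sketch (HS_TriPatchConstant is assumed to mirror the quad version's constant struct):

```hlsl
[domain("tri")]
PixelShaderInput DS_Tris(HS_TriPatchConstant constantData,
                         const OutputPatch<DS_ControlPointInput, 3> patch,
                         float3 barycentric : SV_DomainLocation)
{
    PixelShaderInput result = (PixelShaderInput)0;

    // Interpolate position across the triangle using barycentric weights
    float3 position = BarycentricInterpolate(patch[0].Position,
                                             patch[1].Position,
                                             patch[2].Position,
                                             barycentric);

    result.PositionV = mul(float4(position, 1), ViewProjection);
    result.PositionW = position;
    return result;
}
```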

Good luck.

Justin Stenning | Blog | Book - Direct3D Rendering Cookbook (using C# and SharpDX)

Projects: Direct3D Hook, EasyHook, Shared Memory (IPC), SharpDisasm (x86/64 disassembler in C#)

@spazzarama

 

@phantom: What I tried to say is that each frame is taking longer to render.

@spazzarama: Thanks for your answer. It makes sense what you said about the domain shader always being called while the geometry shader is still optional, so I guess that when using tessellation the geometry shader isn't really needed at all, lol. Thanks for the code examples too.

@phantom: What I tried to say is that each frame is taking longer to render.


Well, yes, that is to be expected because you've activated an extra stage in the pipeline. My question was because FPS drops are non-linear, so you could be worrying over very little.


Well, yes, that is to be expected because you've activated an extra stage in the pipeline. My question was because FPS drops are non-linear, so you could be worrying over very little.

Well, right now, still with no normal calculations and that stuff, I'm at around 300 FPS, and adding the geometry stage doing almost nothing drops it to around 50 FPS. I know FPS is non-linear and all that, but it is still a lot, so I'm definitely considering removing that stage. Thank you.

Now, going back to spazzarama's answer: I will do the new normal calculations in the domain shader. But now that I've started thinking about how to do them, it isn't possible to compute smooth normals, since I don't have access to adjacent vertices. So I started looking for information on how to do it, and a lot of people tell me to use a "Sobel map", but I have no idea how to make or use one. Any ideas?

There's a GameDev post here about generating a normal map from a height map, which I guess is along the lines of what you are after. If the height map is static you could just precalculate this beforehand. Since you're in DX11, you could instead use a compute shader to perform the Sobel edge detection (there's an example of that in my book somewhere too if you get stuck); otherwise, something along the lines of the linked post should get you going.
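At its core, deriving a normal from a height map is just finite differences of neighbouring height samples; a full Sobel filter weights a 3x3 neighbourhood, but a central-differences sketch shows the idea (texelSize and heightScale are illustrative parameters, not from any particular source):

```hlsl
// Sketch: world-space normal from a height map via central differences.
// For a proper Sobel filter, sample the full 3x3 neighbourhood and apply
// the Sobel kernels to get the x/z gradients instead.
float3 NormalFromHeightMap(Texture2D heightMap, SamplerState samp,
                           float2 uv, float2 texelSize, float heightScale)
{
    float hL = heightMap.SampleLevel(samp, uv - float2(texelSize.x, 0), 0).r;
    float hR = heightMap.SampleLevel(samp, uv + float2(texelSize.x, 0), 0).r;
    float hD = heightMap.SampleLevel(samp, uv - float2(0, texelSize.y), 0).r;
    float hU = heightMap.SampleLevel(samp, uv + float2(0, texelSize.y), 0).r;

    // Gradients along x and z; the 2.0 accounts for the two-texel span
    return normalize(float3((hL - hR) * heightScale, 2.0,
                            (hD - hU) * heightScale));
}
```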



so I guess that when using tessellation the geometry shader isn't really needed at all

The geometry shader can still be useful for things like dual-paraboloid or cube environment mapping.


This topic is closed to new replies.
