Everything posted by KaiserJohan

  1. I have a large heightmap, split into fixed-size patches (e.g. 32x32), inserted into a quad-tree for view culling on the CPU. Everything is rendered as instanced draws of a simple 4-vertex quad, where each patch has its own world transform (with a uniform scale for all). The tessellation of each patch depends on its height variance and screen-space size. Fair enough, that works mostly fine, although there are two problematic cases: performance, when zoomed out far enough that nearly all patches are visible, and loss of detail, when not in a top-down view or when zoomed in very close. So I'm thinking of using non-uniformly sized patches, like in this image: A quad-tree like that differs from mine in that it's built with the camera at the root, so I'd need to change that. As for deciding when to merge smaller patches into a larger one, is there a smart metric to use? The simplest one I can think of is the absolute distance from the camera in world units, but I'd prefer something more adaptive, to handle both very large and very small terrains without depending on such static boundaries. Any ideas?
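One adaptive metric that avoids absolute distance thresholds is screen-space geometric error, as used in CDLOD-style schemes: each node stores its maximum world-space height deviation from its lower-detail parent, and you split while the projection of that error onto the screen exceeds a pixel tolerance. This scales with both terrain size and resolution. A minimal CPU-side sketch, assuming a perspective camera; the names (`ProjectedError`, `ShouldSplit`) are illustrative, not from the post:

```cpp
#include <cmath>
#include <cassert>

struct Camera {
    double verticalFovRadians;   // full vertical field of view
    double screenHeightPixels;
};

// How many pixels of screen-space error a world-space error of
// `worldError` causes when viewed at `distance` under perspective projection.
double ProjectedError(double worldError, double distance, const Camera& cam) {
    double k = cam.screenHeightPixels / (2.0 * std::tan(cam.verticalFovRadians * 0.5));
    return worldError * k / distance;
}

// Split the node while its geometric error is visible beyond the tolerance;
// merge (i.e. don't split) once it drops below it.
bool ShouldSplit(double nodeGeometricError, double distanceToCamera,
                 const Camera& cam, double pixelTolerance = 2.0) {
    return ProjectedError(nodeGeometricError, distanceToCamera, cam) > pixelTolerance;
}
```

The pixel tolerance is the only tunable, and it is resolution- and FOV-aware by construction, so the same value behaves consistently across terrain scales.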
  2. KaiserJohan

    Non-uniform patch terrain tessellation

    Interesting; do you have a GitHub link for reference? I see, it's like how I do it currently, then. Just to be sure: you build and cull nodes on the CPU, right? Also, how do you determine how deep the quadtree should be, i.e. where do you stop splitting nodes? I have it set to 32x32, but is there a reason to do anything else, given that there is quite a bit of CPU memory footprint for each height level in the quadtree? Not quite sure I understand this part. Which size limit do you refer to? And what is the external storage media? Isn't it enough to use a unit-sized quad and then provide a scaling/translation transform for each patch? Thanks a lot for the answers! 🙂
  3. I am doing terrain tessellation and I have two ways of approaching normals:

    1) Compute the normal in the domain shader using a Sobel filter.
    2) Precompute the normals in a compute shader with the same Sobel filter and then sample them in the domain shader. The texture format is R10G10B10A2_UNORM.

    This is the normals (in view space) from 1), which looks correct. This is the normals when sampled from the precomputed normal map, and this is what the computed normal map looks like.

    This is the Sobel filter I use in the compute shader:

```hlsl
float3 SobelFilter( int3 texCoord )
{
    float h00 = gHeightmap.Load( texCoord, int2( -1, -1 ) ).r;
    float h10 = gHeightmap.Load( texCoord, int2( 0, -1 ) ).r;
    float h20 = gHeightmap.Load( texCoord, int2( 1, -1 ) ).r;
    float h01 = gHeightmap.Load( texCoord, int2( -1, 0 ) ).r;
    float h21 = gHeightmap.Load( texCoord, int2( 1, 0 ) ).r;
    float h02 = gHeightmap.Load( texCoord, int2( -1, 1 ) ).r;
    float h12 = gHeightmap.Load( texCoord, int2( 0, 1 ) ).r;
    float h22 = gHeightmap.Load( texCoord, int2( 1, 1 ) ).r;

    float Gx = h00 - h20 + 2.0f * h01 - 2.0f * h21 + h02 - h22;
    float Gy = h00 + 2.0f * h10 + h20 - h02 - 2.0f * h12 - h22;
    // generate missing Z
    float Gz = 0.01f * sqrt( max( 0.0f, 1.0f - Gx * Gx - Gy * Gy ) );

    return normalize( float3( 2.0f * Gx, Gz, 2.0f * Gy ) );
}
```

    The simple compute shader itself:

```hlsl
[numthreads(TERRAIN_NORMAL_THREADS_AXIS, TERRAIN_NORMAL_THREADS_AXIS, 1)]
void cs_main(uint3 groupID : SV_GroupID, uint3 dispatchTID : SV_DispatchThreadID, uint3 groupTID : SV_GroupThreadID, uint groupIndex : SV_GroupIndex)
{
    float3 normal = SobelFilter( int3( dispatchTID.xy, 0 ) );
    normal += 1.0f;
    normal *= 0.5f;
    gNormalTexture[ dispatchTID.xy ] = normal;
}
```

    The snippet in the domain shader that samples the normal map:

```hlsl
const int2 offset = 0;
const int mipmap = 0;
ret.mNormal = gNormalMap.SampleLevel( gLinearSampler, midPointTexcoord, mipmap, offset ).r;
ret.mNormal *= 2.0;
ret.mNormal -= 1.0;
ret.mNormal = normalize( ret.mNormal );
ret.mNormal.y = -ret.mNormal.y;
ret.mNormal = mul( ( float3x3 )gFrameView, ( float3 )ret.mNormal );
ret.mNormal = normalize( ret.mNormal );
```

    Now, if I compute the normals directly in the domain shader instead, I use a different sampling method in the Sobel filter:

```hlsl
float3 SobelFilter( float2 uv )
{
    const int2 offset = 0;
    const int mipmap = 0;
    float h00 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( -1, -1 ) ).r;
    float h10 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( 0, -1 ) ).r;
    float h20 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( 1, -1 ) ).r;
    float h01 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( -1, 0 ) ).r;
    float h21 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( 1, 0 ) ).r;
    float h02 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( -1, 1 ) ).r;
    float h12 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( 0, 1 ) ).r;
    float h22 = gHeightmap.SampleLevel( gPointSampler, uv, mipmap, int2( 1, 1 ) ).r;

    float Gx = h00 - h20 + 2.0f * h01 - 2.0f * h21 + h02 - h22;
    float Gy = h00 + 2.0f * h10 + h20 - h02 - 2.0f * h12 - h22;
    // generate missing Z
    float Gz = 0.01f * sqrt( max( 0.0f, 1.0f - Gx * Gx - Gy * Gy ) );

    return normalize( float3( 2.0f * Gx, Gz, 2.0f * Gy ) );
}
```

    And then just computing it in the domain shader:

```hlsl
ret.mNormal = SobelFilter( midPointTexcoord );
ret.mNormal = mul( ( float3x3 )gFrameView, ( float3 )ret.mNormal );
ret.mNormal = normalize( ret.mNormal );
```

    I am sure there is a simple answer to this and I am missing something... but what? Whether I sample a precomputed value or compute it in the shader, shouldn't the result be the same?
  4. That's it! I sampled only the red channel by mistake. Good catch, it works now as expected. Big thanks! 🙂
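For reference, the mistake above is easy to reproduce on the CPU: sampling only `.r` broadcasts one scalar into all three components, so after the `* 2 - 1` decode the normal collapses to `(x, x, x)` and normalizes to roughly the same diagonal everywhere. A minimal sketch of the [-1,1] ↔ [0,1] encode/decode round trip, with illustrative helper names (not from the engine):

```cpp
#include <array>
#include <cassert>
#include <cmath>

// Encode a [-1,1] normal into [0,1] per channel, as the compute shader does
// before writing to the UNORM normal map.
std::array<double, 3> EncodeNormal(const std::array<double, 3>& n) {
    return { n[0] * 0.5 + 0.5, n[1] * 0.5 + 0.5, n[2] * 0.5 + 0.5 };
}

// Decode all three channels back to [-1,1], as the domain shader must do.
// Decoding only channel 0 and broadcasting it would lose the direction.
std::array<double, 3> DecodeNormal(const std::array<double, 3>& texel) {
    return { texel[0] * 2.0 - 1.0, texel[1] * 2.0 - 1.0, texel[2] * 2.0 - 1.0 };
}
```

The round trip is lossless in doubles; in R10G10B10A2_UNORM each channel is quantized to 10 bits, which is still plenty for terrain normals.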
  5. The transform is the same whether I compute the normals or sample them 🤔 Is there anything that could happen under the hood when sampling? The texture is bound correctly (as seen in RenderDoc). Could the sampler object be the problem?
  6. This is the full domain shader:

```hlsl
[domain("quad")]
DomainOut ds_main(PatchTess patchTess, float2 uv : SV_DomainLocation, const OutputPatch<HullOut, 4> quad)
{
    DomainOut ret;

    float2 topMidpointWorld = lerp( quad[ 0 ].mWorldPosition.xz, quad[ 1 ].mWorldPosition.xz, uv.x );
    float2 bottomMidpointWorld = lerp( quad[ 3 ].mWorldPosition.xz, quad[ 2 ].mWorldPosition.xz, uv.x );
    float2 midPointWorld = lerp( topMidpointWorld, bottomMidpointWorld, uv.y );

    float2 topMidpointTexcoord = lerp( quad[ 0 ].mTexcoord, quad[ 1 ].mTexcoord, uv.x );
    float2 bottomMidpointTexcoord = lerp( quad[ 3 ].mTexcoord, quad[ 2 ].mTexcoord, uv.x );
    float2 midPointTexcoord = lerp( topMidpointTexcoord, bottomMidpointTexcoord, uv.y );

    const int2 offset = 0;
    const int mipmap = 0;
    ret.mNormal = gNormalMap.SampleLevel( gLinearSampler, midPointTexcoord, mipmap, offset ).r;
    ret.mNormal *= 2.0;
    ret.mNormal -= 1.0;
    ret.mNormal = normalize( ret.mNormal );
    ret.mNormal.y = -ret.mNormal.y;
    ret.mNormal = mul( ( float3x3 )gFrameView, ( float3 )ret.mNormal );
    ret.mNormal = normalize( ret.mNormal );

    float y = quad[ 0 ].mWorldPosition.y + ( SampleHeightmap( midPointTexcoord ) * gHeightModifier );
    ret.mPosition = float4( midPointWorld.x, y, midPointWorld.y, 1.0 );
    ret.mPosition = mul( gFrameViewProj, ret.mPosition );

    ret.mTexcoord = midPointTexcoord;

    return ret;
}
```

    If I output the texcoord to an output texture in a pixel shader, I can see it does go from [0,1] across the whole terrain. The normal map is the same size as the heightmap. I do use RenderDoc (it's awesome) for debugging stuff like this 🙂 The vertex shader that computes the texture coordinates and vertex positions looks like this:

```hlsl
VertexOut vs_main(VertexIn input)
{
    VertexOut ret;

    const uint transformIndex = gTransformOffset + input.mInstanceID;
    // silly that we have to transpose this...
    const float4x4 worldTransform = transpose( gWorldTransforms.Load( transformIndex ) );

    ret.mWorldPosition = mul( worldTransform, float4( input.mPosition, 1 ) ).xyz;
    ret.mTexcoord = ( ret.mWorldPosition.xz - gWorldMin ) / ( gWorldMax - gWorldMin );
    ret.mTexcoord = clamp( ret.mTexcoord, 0.0f, 1.0f );

    return ret;
}
```
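The corner interpolation in that domain shader (two lerps in u along the top and bottom edges, then one lerp in v between them) is a standard bilinear patch evaluation, and it can be sanity-checked on the CPU with a small sketch; `BilerpQuad` and the corner ordering comment are illustrative, assuming the same quad[0..3] layout as the shader:

```cpp
#include <cassert>

struct Vec2 { double x, y; };

// Same semantics as the HLSL lerp intrinsic.
Vec2 Lerp(const Vec2& a, const Vec2& b, double t) {
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
}

// Mirrors the shader: quad[0]/quad[1] form the top edge, quad[3]/quad[2]
// the bottom edge; interpolate each edge in u, then blend the results in v.
Vec2 BilerpQuad(const Vec2 quad[4], double u, double v) {
    Vec2 top    = Lerp(quad[0], quad[1], u);
    Vec2 bottom = Lerp(quad[3], quad[2], u);
    return Lerp(top, bottom, v);
}
```

At the domain location (0.5, 0.5) this yields the patch center, and at the four corners it reproduces quad[0..3] exactly, which is a quick way to verify the corner ordering matches the hull shader's output.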