Happy SDE

G-Buffer and Render Target format for Normals


Recommended Posts

I’ve created a texture for the normal render target in my deferred renderer:

D3D11_TEXTURE2D_DESC dtd{
    width, height,
    1, 1,                           // MipLevels, ArraySize
    DXGI_FORMAT_R11G11B10_FLOAT,
    {1, 0},                         // SampleDesc: Count, Quality
    D3D11_USAGE_DEFAULT,
    D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE,
    0, 0};                          // CPUAccessFlags, MiscFlags

_check_hr(device->CreateTexture2D(&dtd, nullptr, &m_rtNormal));

It seems the normals have some artifacts:

[attachment=30093:Normals.png]

 

The other render targets currently use the 32-bit format DXGI_FORMAT_R8G8B8A8_UNORM.
 

I can only think of three ways to improve the normals:
1) Use 1.5 RTs for normals: DXGI_FORMAT_R32G32_FLOAT or DXGI_FORMAT_R16G16_FLOAT.
    X and Y go in one RT, and Z in the other.
 

2) Use 64-bit render targets. For normals, use 16-bit floats: DXGI_FORMAT_R16G16B16A16_FLOAT.

 

3) Use some math to reconstruct the Z value from X and Y. But I would like to avoid this approach.
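For what option 3 would look like: a minimal sketch, assuming the sign of Z is known by convention (e.g. view-space normals always facing the camera). The lost sign is exactly the complication that makes this approach unattractive.

```cpp
#include <cassert>
#include <cmath>

// Reconstruct a unit normal's Z component from X and Y, assuming Z's
// sign is known (positive here). In a real G-buffer the sign must come
// from a convention or a stored flag, since sqrt() discards it.
float ReconstructZ(float x, float y)
{
    float zsq = 1.0f - x * x - y * y;
    return std::sqrt(zsq > 0.0f ? zsq : 0.0f); // clamp for rounding error
}
```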

 

So here are the questions:
Are there other, better solutions for storing floating-point normals with more than the 10/11 bits per channel this format offers?

 

In cases 1 and 2 there will be an unused 16- or 32-bit float value in the second RT.
Is there a way to reinterpret it as a UINT for my own use? How can that be done?
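On the shader side, HLSL provides asuint() and asfloat() intrinsics for exactly this kind of bit-level reinterpretation. The CPU-side equivalent is a bit-cast, sketched below; note (untested assumption) that arbitrary integer bit patterns stored in a float channel can be mangled if the value ever passes through float-filtering hardware such as blending or texture filtering.

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>

// Bit-cast between float and uint32_t without changing the bit pattern
// (the CPU-side analogue of HLSL's asuint()/asfloat()).
uint32_t FloatBitsToUint(float f)
{
    uint32_t u;
    std::memcpy(&u, &f, sizeof(u));
    return u;
}

float UintBitsToFloat(uint32_t u)
{
    float f;
    std::memcpy(&f, &u, sizeof(f));
    return f;
}
```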

 

Thanks in advance.

Edited by Happy SDE


First, there is no sign bit, so negative values either become positive or get clamped to 0. You definitely don't want that.
Second, normals are in the [-1; 1] range. You will get much better precision by using DXGI_FORMAT_R10G10B10A2_UNORM, which effectively gives you 9 bits for the magnitude and 1 bit for the sign, whereas this float format spends 5 bits on the exponent and leaves only 6 bits of mantissa per channel (5 in the blue channel).

Looks like you made a poor choice of format.

Thank you, Matias!

DXGI_FORMAT_R10G10B10A2_UNORM removed all the artifacts.

 


Thank you, Matias!
DXGI_FORMAT_R10G10B10A2_UNORM removed all the artifacts.

I'm glad it worked for you. Just remember that UNORM stores values in the [0; 1] range, so you need to convert your [-1; 1] range to [0; 1] by hand: rtt = normal * 0.5f + 0.5f when writing (and the inverse, normal = rtt * 2.0f - 1.0f, when reading).
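The remapping plus the 10-bit quantization can be sketched end to end; this simulates one channel of R10G10B10A2_UNORM storage (1023 = 2^10 - 1 is an assumption of standard UNORM rounding, not anything specific to the renderer in this thread):

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>

// Simulate storing one normal component in a 10-bit UNORM channel:
// map [-1, 1] to [0, 1], then quantize to 10 bits (0..1023).
uint32_t EncodeNormal10(float n)        // n in [-1, 1]
{
    float unorm = n * 0.5f + 0.5f;      // [-1, 1] -> [0, 1]
    return static_cast<uint32_t>(std::round(unorm * 1023.0f));
}

// Decode back: 10-bit value -> [0, 1] -> [-1, 1].
float DecodeNormal10(uint32_t bits)     // bits in [0, 1023]
{
    float unorm = bits / 1023.0f;
    return unorm * 2.0f - 1.0f;
}
```

The roundtrip error is bounded by half a quantization step, about 1/1023 in the [-1, 1] range.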
