[DX10] DXGI_FORMAT_R16G16B16A16_UINT problems

2 comments, last by n3Xus 15 years, 3 months ago
Hello, I'm having problems writing values to a texture with the DXGI_FORMAT_R16G16B16A16_UINT format.

When rendering the 3D model I set the values for this texture manually in the pixel shader, to 1 for example (or 2, or whatever...), expecting that value to be written to the DXGI_FORMAT_R16G16B16A16_UINT render target. But it never gets written. In another pixel shader (the one I use for the fullscreen quad) the values read from this texture are all 0. At least that's what I presume, since multiplying the final output color by the value stored in the DXGI_FORMAT_R16G16B16A16_UINT texture gives black.

This texture is bound together with 3 others as multiple render targets. Any ideas? (I'd post some code, but since all the other textures work as they should, I don't know what to post.) Does this format have some limitations? If I change the format to DXGI_FORMAT_R16G16B16A16_UNORM it works normally.
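One possible cause, not confirmed in this thread but worth noting for anyone who finds it later: with a _UINT render target, the pixel shader has to declare a matching integer output type (uint4) for that slot. Writing a float4 into an integer render target is undefined, and in practice often reads back as 0. A minimal sketch of what the MRT output struct might look like (struct/function names and slot assignments are made up for illustration):

// Sketch, assuming slot 1 is the R16G16B16A16_UINT target.
struct PSOut
{
    float4 color : SV_Target0; // an ordinary UNORM color target
    uint4  id    : SV_Target1; // the _UINT target: must be uint4, not float4
};

PSOut PSFill(float4 pos : SV_Position)
{
    PSOut o;
    o.color = float4(1.0f, 0.0f, 0.0f, 1.0f); // normal color write
    o.id    = uint4(1, 1, 1, 1);              // raw integers, no [0,1] normalization
    return o;
}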
bump
OK, I haven't solved this, but let's leave it for now...

I have another question: how do I read an int (0-255) from a texture, write it to a render target, and then read it back from that render target in another pass, WITHOUT it being converted to float or clamped to [0,1]? Is this possible?

I'd like this so I can check in the pixel shader for a certain RGB value (0-255) and do some branching based on it. (I tried it with floats and it only partially works, so I'd like to use ints.)
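This should be possible in D3D10 if the texture uses a _UINT format end to end: declare the resource in HLSL as Texture2D<uint4> and fetch it with Load() instead of Sample(). Load() takes integer texel coordinates and returns the stored integers untouched, with no filtering, no conversion to float, and no [0,1] clamp. A sketch; the register and the 200 value are just for illustration:

// Assumes a _UINT shader resource view bound at t0.
Texture2D<uint4> idTexture : register(t0);

float4 PSQuad(float4 pos : SV_Position) : SV_Target
{
    // int3 = (x, y, mip level); for a fullscreen quad, SV_Position
    // maps 1:1 to texel coordinates of a same-sized texture.
    uint4 id = idTexture.Load(int3(pos.xy, 0));

    if (id.r == 200) // exact integer comparison, e.g. a material/object ID
        return float4(1.0f, 0.0f, 0.0f, 1.0f);

    return float4(0.0f, 0.0f, 0.0f, 1.0f);
}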
OK, I solved the problem :P

I saved the .dds texture in the 16.16 32-bit unsigned format and it works now :D


