[DX12] Texture2DArray descriptor used in shader as Texture2D

Last reply by Adam Miles, 7 years, 7 months ago

Hello,

I wonder if what I am doing is dangerous/non-portable...

In my program, all my texture descriptors are texture array descriptors (most of the time they are arrays of only one slice).

In my shaders, I do not always declare these as Texture2DArray; I often declare them as plain Texture2D and assume the first slice is the one sampled.
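To make the mismatch concrete, here is a minimal sketch of the pattern being described (the resource and register names are illustrative, not from the original post). The SRV on the C++ side would be created with `ViewDimension = D3D12_SRV_DIMENSION_TEXTURE2DARRAY` and `ArraySize = 1`, while the shader declares the resource as a non-array texture:

```hlsl
// SRV created as D3D12_SRV_DIMENSION_TEXTURE2DARRAY (ArraySize = 1),
// but declared here as a plain Texture2D -- a dimension mismatch.
Texture2D    gTexture : register(t0);
SamplerState gSampler : register(s0);

float4 PSMain(float2 uv : TEXCOORD0) : SV_Target
{
    // May appear to work on current drivers, but the descriptor's
    // view dimension does not match the shader's declared type.
    return gTexture.Sample(gSampler, uv);
}
```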

I have not seen any warnings, glitches, or problems, but I know that usually means nothing and this kind of thing needs to be verified.

So... Is it bad? Like crossing the streams?

Thanks!


A similar question was asked before regarding NULL descriptors and whether Texture2D descriptors were necessarily compatible with Texture2DArray descriptors. The answer was no, they're not. Will it work today? Probably. Will it always work? Maybe. Should you do it? No!

If you want to always create your descriptors as Texture2DArray descriptors why not just write that in HLSL too and explicitly sample from slice 0?
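The suggested fix would look something like the following sketch (again with illustrative names): declare the resource as `Texture2DArray` so it matches the SRV's view dimension, and pass the slice index as the third sampling coordinate.

```hlsl
// Declaration now matches the Texture2DArray SRV created on the CPU side.
Texture2DArray gTexture : register(t0);
SamplerState   gSampler : register(s0);

float4 PSMain(float2 uv : TEXCOORD0) : SV_Target
{
    // For a Texture2DArray, Sample takes a float3: (u, v, sliceIndex).
    // Slice 0 selects the first (and, here, only) slice.
    return gTexture.Sample(gSampler, float3(uv, 0.0f));
}
```

This keeps the shader honest about what the descriptor actually describes, at the cost of one extra coordinate per sample.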

The team has told me in the past that GBV (GPU-Based Validation, added in RS1, the Windows 10 Anniversary Update) will validate descriptors against their intended SRV_DIMENSION, so if you enable GBV you should get a validation error for this mismatch.

Adam Miles - Principal Software Development Engineer - Microsoft Xbox Advanced Technology Group

This topic is closed to new replies.
