[DX12] Texture2DArray descriptor used in shader as Texture2D

Posted by Shnoutz



Hello,

 

I wonder if what I am doing is dangerous/non-portable...

 

In my program, all my texture descriptors are texture array descriptors (most of the time they are arrays of only one slice).

In my shaders, I do not always declare Texture2DArray; I often just use Texture2D and assume the first slice is the one that gets sampled.
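To make that concrete, the descriptors are created roughly like this (a minimal sketch, not my actual code; the device, resource, format and handle names are placeholders):

    // C++: an SRV created with the Texture2DArray dimension even though the
    // underlying texture has only one slice. The shader then declares the
    // same register as a plain Texture2D.
    #include <d3d12.h>

    void CreateSingleSliceArraySrv(ID3D12Device* device,
                                   ID3D12Resource* texture,
                                   D3D12_CPU_DESCRIPTOR_HANDLE srvHandle)
    {
        D3D12_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
        srvDesc.Format                  = DXGI_FORMAT_R8G8B8A8_UNORM; // assumed format
        srvDesc.ViewDimension           = D3D12_SRV_DIMENSION_TEXTURE2DARRAY;
        srvDesc.Shader4ComponentMapping = D3D12_DEFAULT_SHADER_4_COMPONENT_MAPPING;
        srvDesc.Texture2DArray.MostDetailedMip     = 0;
        srvDesc.Texture2DArray.MipLevels           = 1;
        srvDesc.Texture2DArray.FirstArraySlice     = 0;
        srvDesc.Texture2DArray.ArraySize           = 1; // single-slice "array"
        srvDesc.Texture2DArray.PlaneSlice          = 0;
        srvDesc.Texture2DArray.ResourceMinLODClamp = 0.0f;

        device->CreateShaderResourceView(texture, &srvDesc, srvHandle);
    }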

 

I have not seen any warnings, glitches or problems, but I know that usually means nothing and that this kind of thing needs to be verified.

 

So... Is it bad? Like crossing the streams?

 

Thanks!

A similar question was asked before regarding NULL descriptors and whether Texture2D descriptors were necessarily compatible with Texture2DArray descriptors. The answer was no, they're not. Will it work today? Probably. Will it always work? Maybe. Should you do it? No!

 

If you always want to create your descriptors as Texture2DArray descriptors, why not declare them that way in HLSL too and explicitly sample from slice 0?
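Something along these lines (a minimal HLSL sketch; the register assignments and names are just for illustration):

    // HLSL: declare the resource as what the descriptor actually is and
    // read slice 0 explicitly. The third texture coordinate component
    // selects the array slice.
    Texture2DArray gTexture : register(t0);
    SamplerState   gSampler : register(s0);

    float4 SampleSlice0(float2 uv)
    {
        return gTexture.Sample(gSampler, float3(uv, 0.0f));
    }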

 

The team has told me in the past that GBV (GPU-Based Validation, added in RS1) will validate descriptors against their intended SRV_DIMENSION, so if you enable GBV you should get a validation error.
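For reference, turning GBV on before creating the device looks roughly like this (a minimal sketch using ID3D12Debug1; error handling beyond the SUCCEEDED checks is omitted):

    // C++: enable the debug layer plus GPU-Based Validation. Descriptor
    // dimension mismatches should then be reported as validation errors.
    #include <d3d12.h>
    #include <wrl/client.h>
    using Microsoft::WRL::ComPtr;

    void EnableGpuBasedValidation()
    {
        ComPtr<ID3D12Debug> debug;
        if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debug))))
        {
            debug->EnableDebugLayer();

            ComPtr<ID3D12Debug1> debug1;
            if (SUCCEEDED(debug.As(&debug1)))
                debug1->SetEnableGPUBasedValidation(TRUE);
        }
    }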

Edited by Adam Miles
