Zipster

[DX10] :: Binding textures/samplers to registers


Recommended Posts

I'm in the process of converting our effects over to DX10 and am having problems with textures and samplers. Some of our more basic shaders bind their primary sampler to register s0, and that's what the DX9 engine sets using SetTexture. However I can't get this to work properly in DX10. I know that textures and samplers are different entities now, but can I still bind them to registers (or "slots" in DX10)? When I compile the shaders in FXC it tells me that my texture is in slot 0, yet when I try to set that slot to my texture resource using PSSetShaderResources, all I get is a black texture. This same texture works in another shader that explicitly gets a ID3D10EffectVariable interface and does SetResource. So my question is, what is the proper way to emulate the register binding semantics in DX10?

HLSL register binding works the same way in Direct3D 10 as in 9. Have you enabled the debug layer? It can give you useful hints if something goes wrong. I would give PIX a try, too; at the very least you can check whether anything is bound to the right place during your draw call.

FXC is telling me (in the comments it generates) that my texture is being bound to slot 0, and that my sampler is being bound to slot 0 as well. Are there separate slots for each data type, or is binding two resources to the same slot a bad thing? This happens even without explicit register binding.

Basically, what I'm trying to accomplish is letting the engine code set a texture that the shaders use directly via the device (i.e. not through the effect system). In DX9, what you would do is set the texture in C++ using SetTexture(), and then in HLSL have a sampler that doesn't have a texture assignment but is bound to the same sampler register you set in code. When you commit the changes, the effect system wouldn't override the texture already set on the device. Example:


// C++
pDevice->SetTexture(0, m_pTexture);

// HLSL
texture SomeTexture;

sampler TheSampler : register(s0) = sampler_state
{
// intentionally omitted, otherwise SomeTexture, which isn't set by code through the effect system,
// would override the SetTexture call to stage 0 with some black/uninitialized texture
// texture = (SomeTexture);
MinFilter = LINEAR;
MagFilter = LINEAR;
MipFilter = LINEAR;
};




However in DX10, if you try to use the old tex2D* intrinsics on a sampler that doesn't have a texture assignment (even with backwards compatibility enabled), the compiler chokes with a "failed to load & disassemble effect" error (in other words, the above code would fail to compile). So here's the problem: the code sets the sampler stage directly on the device (in DX10, the slots), but if a sampler has a texture assignment, it overrides the device texture with whatever is set via the effect system (ID3DXBaseEffect::SetTexture). Since the compiler doesn't allow me to omit this texture assignment without keeling over, I'm in a bit of a bind.

Sampler and resources use two different slot types.
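In API terms, that looks something like the sketch below (BindSlot0 is a hypothetical helper; the device, view, and state objects are assumed to have been created elsewhere). Textures go through PSSetShaderResources into t-registers, and sampler state goes through PSSetSamplers into s-registers, so both can occupy "slot 0" without colliding:

```cpp
#include <d3d10.h>

// Hypothetical helper: bind one texture resource view and one sampler state,
// both at slot 0. The t-registers and s-registers are separate register
// files, so the two "slot 0"s never collide.
void BindSlot0(ID3D10Device* pDevice,
               ID3D10ShaderResourceView* pSRV,
               ID3D10SamplerState* pSampler)
{
    pDevice->PSSetShaderResources(0, 1, &pSRV); // texture resource -> t0
    pDevice->PSSetSamplers(0, 1, &pSampler);    // sampler state    -> s0
}
```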

If you don't want to use the effect system for these basic shaders, you should not create an effect for them. Instead, create your shaders as separate objects. Then you can configure the pipeline in any way you want.

Another way would be to set the texture after the Apply call. Finally, you can use a shared effect variable that contains this texture and set this variable instead of setting the texture directly on the device.
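The "set it after Apply" option is just a matter of call ordering. A minimal sketch, assuming pPass, pDevice, and pSRV already exist (DrawWithManualTexture is a hypothetical helper, and the Draw call is only a placeholder for whatever geometry you submit):

```cpp
#include <d3d10.h>
#include <d3d10effect.h>

// Hypothetical draw path: let the effect bind its state first, then
// override the pixel-shader texture slot so Apply() can't stomp on it.
void DrawWithManualTexture(ID3D10Device* pDevice,
                           ID3D10EffectPass* pPass,
                           ID3D10ShaderResourceView* pSRV)
{
    pPass->Apply(0);                            // effect binds its own state
    pDevice->PSSetShaderResources(0, 1, &pSRV); // now override t0 ourselves
    pDevice->Draw(4, 0);                        // placeholder: e.g. a UI quad
}
```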

I'm afraid I have little help to offer, but I was curious why you're not using the Effect interface to work with shaders (more so since that does seem to work). I'm not suggesting you should switch or anything, just wondering if it's mainly a performance issue or if there are other benefits to rolling your own shader system.

We do use the effect system for everything; however, the way our engine is designed, there are some interfaces that don't expose the effects to the user (such as the ones for simple UI quad drawing), so they can't set variables using the effect system because it's been abstracted away above a certain level. They do, however, know that the effect samplers are bound to certain registers, so they set those directly through the device. The alternative would be to abstract that as well and go through the effect system, which is our next step if we can't get this to work.

Here's a simple test case that demonstrates the problem I'm having (it's an archive with a single FX file, since GDNet wouldn't let me upload the FX file directly). Compile it with the command line: fxc /Tfx_4_0 /Gec Test.fx. Works perfectly. Now comment out the line in the sampler that sets the texture. Suddenly it doesn't work any more. With that line still commented out, compile each shader separately with the command lines: fxc /Tvs_4_0 /Evs_main /Gec Test.fx and fxc /Tps_4_0 /Eps_main /Gec Test.fx. Works again. Is this a bug, or does the compiler use a completely different set of rules for the fx profile versus the vs/ps profiles?

Supporting the old syntax is limited and is only meant for backwards compatibility. What it comes down to is you're interested in having Effects handle your sampler state, but you want to handle the texture (for whatever reason), right? You have a couple of options here:

1) Set the texture after applying the effect; this way it will override whatever the effect set.

2) use the new texture intrinsics (Texture2D MyTexture; SamplerState MySampler { ... }; ... MyTexture.Sample(MySampler, In.Tex); ...)

The only problem with using 2 is if you're writing a shader for DX9, your DX9 pipeline is going to have to deal with the fact that the compiler generates a sampler entry in the constant table that looks like MySampler+MyTexture, instead of just MySampler (in case it's used with multiple texture objects).
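Spelled out as a complete pixel shader, option 2 might look like the sketch below (names and register assignments are illustrative):

```hlsl
// DX10-native style: the texture and the sampler are separate objects,
// bound through separate register files.
Texture2D    MyTexture : register(t0);   // set via PSSetShaderResources(0, ...)
SamplerState MySampler : register(s0);   // set via PSSetSamplers(0, ...)

float4 ps_main(float2 tex : TEXCOORD0) : SV_Target
{
    return MyTexture.Sample(MySampler, tex);
}
```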

For compatibility purposes, we do allow you to do manual register allocation on these automatically generated objects as well. Just add a register semantic for the type you expect to be generated.

For example, SamplerState MySampler : register(s0) : register(t0); will make sure that generated texture objects (if you use the sampler with an old texture intrinsic) go in register t0.

Or, Texture2D MyTexture : register(t0) : sampler(s0); will make sure that when using a new texture intrinsic on an older target, the generated sampler (MySampler+MyTexture) ends up in s0.

Thanks Jalibr for the helpful reply. It's only about a dozen or so of our effects that have the code set the texture for them, but they're some of the more important ones :)

I have one last question. If both the sampler and the texture are in slot 0, but in their own respective "buckets" if you will, does PSSetShaderResources know the correct slot buckets to use based on the supplied resource views? Right now the documentation seems to indicate that only textures are shader resources, but I'm curious whether I can use this function for setting textures in lieu of SetTexture in DX9.
