Passing Textures to HLSL Shader (finally get it working)


I almost have the DX9 version of the code running an HLSL shader. I reverted to a version that was actually running the shader to figure out what had broken since cleaning up the code. This code DOES run the shader when rendering to the backbuffer. When I uncomment the "SetRenderTarget" line, it stops working, and now it makes sense why: the render target is then both the source and the destination of the shader!




Something smells off in the ordering of the textures. Can you guys help me figure out how to straighten out the situation?


In the Present function, I copy the memory texture into graphics card memory.

HR(m_pDevice->StretchRect(m_InputTextures[0].Memory, NULL, m_pRenderTargetSurface, NULL, D3DTEXF_POINT));


Then in CreateScene, I set that texture as the input of the shader.

HR(m_pDevice->SetTexture(0, m_pRenderTarget));
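One way to decouple the shader's input from its output would be to copy into a dedicated GPU-side input texture instead of the render target. This is only a sketch: `m_pInputTexture` is a hypothetical default-pool texture separate from `m_pRenderTarget`, and error handling is simplified to `HRESULT` checks.

```cpp
// Copy the system-memory frame into a dedicated GPU input texture
// (m_pInputTexture is hypothetical, separate from the render target).
IDirect3DSurface9* pInputSurface = NULL;
HRESULT hr = m_pInputTexture->GetSurfaceLevel(0, &pInputSurface);
if (SUCCEEDED(hr))
{
    hr = m_pDevice->StretchRect(m_InputTextures[0].Memory, NULL,
                                pInputSurface, NULL, D3DTEXF_POINT);
    pInputSurface->Release();
}
// Bind the input texture to sampler stage 0; the render target
// is then free to act purely as the shader's destination.
if (SUCCEEDED(hr))
    hr = m_pDevice->SetTexture(0, m_pInputTexture);
```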


If I copy from m_InputTextures[0].Memory to m_InputTextures[0].Surface and use that texture for both of these calls, it doesn't work. But if I instead copy it to the render target (a texture created with the render-target flag, but never set with SetRenderTarget) and pass that as the input of the shader, then it somehow works. It doesn't feel right, and might explain why it fails when that texture becomes the actual render target.


What do I have to do to pass another texture as the shader input? If I call StretchRect toward m_InputTextures[0].Surface, it fails because that texture isn't a render target. Does each shader input texture need to be created with the render-target flag? What about the vectors, in which textures do they need to be defined?
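The failure described above matches StretchRect's restriction that the destination surface be a render-target surface in the default pool. Assuming that's the cause, each GPU-side input texture would be created along these lines (a sketch; width, height, and the format are placeholders):

```cpp
// Create a GPU-side input texture usable as a StretchRect destination.
// D3DUSAGE_RENDERTARGET and D3DPOOL_DEFAULT are both required for that.
IDirect3DTexture9* pInputTexture = NULL;
HRESULT hr = m_pDevice->CreateTexture(
    width, height,          // dimensions of the source frame (placeholders)
    1,                      // a single mip level
    D3DUSAGE_RENDERTARGET,  // allows the surface to be a StretchRect destination
    D3DFMT_X8R8G8B8,        // placeholder format
    D3DPOOL_DEFAULT,        // StretchRect only operates on default-pool surfaces
    &pInputTexture,
    NULL);
```

Note that the render-target usage flag here only makes the texture a valid copy destination; it does not by itself make the texture the device's render target, so it can still be bound with SetTexture as a shader input.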


I believe that once this smelly mess is clarified, it should work. Thanks!



Edit: After investigating some more, StretchRect is the only call that references the render target, so perhaps it has to be that way.


For those who know the theory behind this, here's a very simple question: if a shader takes 3 input textures and outputs 1 texture, SetTexture has to be called 3 times. Do I need 1 or 3 StretchRect calls? If only 1, which of the 3 textures should it copy? Perhaps it should be called once per input texture to copy each of them into graphics card memory, but then, where would the call be that draws onto the render target?



Edit2: Never mind, I got it figured out! The render target gets filled automatically as the result of rendering the scene, so I don't need to reference it myself. Call StretchRect on each input texture and it works. It now works with SetRenderTarget, and I'll be able to use other pixel formats :)
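Putting Edit2 together, the per-frame order that ended up working can be sketched like this. The member names `Gpu`, `numInputs`, and `m_pPixelShader` are hypothetical stand-ins for whatever the actual code uses; the point is only the ordering: copy every input, bind every input, set the output target once, then draw.

```cpp
// Per-frame flow: one StretchRect and one SetTexture per input texture.
for (int i = 0; i < numInputs; i++)
{
    IDirect3DSurface9* pSurface = NULL;
    m_InputTextures[i].Gpu->GetSurfaceLevel(0, &pSurface);
    m_pDevice->StretchRect(m_InputTextures[i].Memory, NULL,
                           pSurface, NULL, D3DTEXF_POINT);
    pSurface->Release();
    m_pDevice->SetTexture(i, m_InputTextures[i].Gpu);
}

// The render target is output only; the scene fills it automatically.
m_pDevice->SetRenderTarget(0, m_pRenderTargetSurface);
m_pDevice->BeginScene();
m_pDevice->SetPixelShader(m_pPixelShader);
m_pDevice->DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 2);  // full-screen quad
m_pDevice->EndScene();
```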

I should be good to go from there, thanks for all your help!

Edited by MysteryX
