Passing Textures to HLSL Shader (finally got it working)

I've almost got the DX9 version of the code running an HLSL shader. I reverted to a version that was actually running the shader to figure out what had broken while cleaning up the code. This code DOES run the shader when rendering to the backbuffer. When I uncomment the SetRenderTarget line, it stops working, and now it makes sense why: the RenderTarget then becomes both the source and the destination of the shader!

 

https://github.com/mysteryx93/AviSynthShader/blob/master/VideoPresenter/D3D9RenderImpl.cpp

 

Something smells off in how the textures are ordered; can you guys help me figure out how to straighten out the situation?

 

In the Present function, I copy the memory texture into the graphics card's memory.

HR(m_pDevice->StretchRect(m_InputTextures[0].Memory, NULL, m_pRenderTargetSurface, NULL, D3DTEXF_POINT));
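For reference, here's a minimal sketch of what this upload step can look like when every input gets its own video-memory copy; m_InputCount and the loop are assumptions for illustration, not the exact code in the repository:

// Sketch: copy each CPU-side frame into its matching video-memory surface.
// StretchRect requires both surfaces to live in D3DPOOL_DEFAULT, and the
// destination has to be a render-target (or off-screen plain) surface.
for (int i = 0; i < m_InputCount; i++) {
    HR(m_pDevice->StretchRect(m_InputTextures[i].Memory, NULL,
        m_InputTextures[i].Surface, NULL, D3DTEXF_POINT));
}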

 

Then in CreateScene, I set that texture as the input of the shader.

HR(m_pDevice->SetTexture(0, m_pRenderTarget));
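As a side note, each SetTexture stage maps to the matching sampler register in the pixel shader, so with several inputs the binding would look roughly like this (again, m_InputCount and the .Texture member are illustrative):

// Sketch: bind every input to its own sampler stage before drawing.
// Stage n is read in HLSL through sampler register s(n):
//   sampler s0 : register(s0);
//   sampler s1 : register(s1);
for (int i = 0; i < m_InputCount; i++) {
    HR(m_pDevice->SetTexture(i, m_InputTextures[i].Texture));
}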

 

If I copy from m_InputTextures[0].Memory to m_InputTextures[0].Surface and use that surface for both of these calls, it doesn't work. But if I instead copy it into the render target (a texture created with the render-target flag, but which hasn't been assigned with SetRenderTarget) and pass that as the input of the shader, then it somehow works. It doesn't feel right, and it might explain why it fails once that texture becomes the actual RenderTarget.

 

What do I have to do to pass another texture as the shader input? If I call StretchRect towards m_InputTextures[0].Surface, it fails because that texture isn't a render target. Does each shader input texture need to be created with the render-target flag? And what about the vectors: for which of the textures do they need to be defined?
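If the render-target flag really is the constraint, each input texture would have to be created something like the sketch below; the parameters (width, height, D3DFMT_A8R8G8B8) are guesses for illustration, not the actual creation code from the repo:

// Sketch: create an input texture that StretchRect can copy into.
// D3DUSAGE_RENDERTARGET + D3DPOOL_DEFAULT make its surface a valid
// StretchRect destination (and usable later with SetRenderTarget).
IDirect3DTexture9* pTexture = NULL;
IDirect3DSurface9* pSurface = NULL;
HR(m_pDevice->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
    D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &pTexture, NULL));
HR(pTexture->GetSurfaceLevel(0, &pSurface));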

 

I believe that once we sort out this smelly mess, it should work. Thanks!

 

 

Edit: After investigating some more, StretchRect is the only call that references the RenderTarget, so perhaps it has to be that way.

 

For those who know the theory here, a very simple question: if you have a shader that takes 3 input textures and outputs 1 texture, SetTexture has to be called 3 times. Do I need 1 or 3 StretchRect calls? If only 1, which of the 3 textures should it copy? Perhaps it should be called for each input texture to copy them all into the graphics card's memory, but then, where would the call be that draws onto the RenderTarget?

 

 

Edit2: Never mind, I got it figured out! The RenderTarget gets filled automatically as the result of the scene, so I don't need to reference it explicitly. Call StretchRect on each input texture and it works. It now works with SetRenderTarget, and I'll be able to use other pixel formats :)
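For anyone who finds this later, the per-frame flow that ended up working looks roughly like the sketch below. The member names (m_InputCount, .Texture, m_pPixelShader, m_pRenderTargetSurface) are illustrative, and the vertex buffer and FVF setup are assumed to happen elsewhere:

// Sketch of the working per-frame flow.
// 1. Upload every input frame to its video-memory texture.
for (int i = 0; i < m_InputCount; i++) {
    HR(m_pDevice->StretchRect(m_InputTextures[i].Memory, NULL,
        m_InputTextures[i].Surface, NULL, D3DTEXF_POINT));
}

// 2. Point the device at the output surface; the shader result lands there
//    automatically when the scene is drawn.
HR(m_pDevice->SetRenderTarget(0, m_pRenderTargetSurface));

// 3. Bind the inputs and draw the full-screen quad through the pixel shader.
HR(m_pDevice->BeginScene());
for (int i = 0; i < m_InputCount; i++) {
    HR(m_pDevice->SetTexture(i, m_InputTextures[i].Texture));
}
HR(m_pDevice->SetPixelShader(m_pPixelShader));
HR(m_pDevice->DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 2));
HR(m_pDevice->EndScene());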

I should be good to go from there, thanks for all your help!
