Pixel Shaders with ID3DXSprite

Is it possible to use a pixel shader with the sprite interface? I've got a 2D game and I'd like to apply some effects to the entire screen. I could easily do it in a pixel shader, but I don't know how the sprite interface works. Do they use geometry to attach the shaders to?
As far as I know, pixel and vertex shaders are the last steps in the Direct3D pipeline, so it doesn't matter WHAT you render, as long as there are some vertices and (pix/tex)els to modify.
Ethereal
Render to a texture, then render that texture on a full-screen quad with your own pixel shader!
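Roughly like this (an untested sketch; g_pDevice, g_pSceneTexture, g_pPostShader, screenWidth/screenHeight and DrawFullScreenQuad are placeholder names you'd replace with your own):

// One-off setup: a render-target texture the size of the back buffer.
IDirect3DTexture9* g_pSceneTexture = NULL;
g_pDevice->CreateTexture(screenWidth, screenHeight, 1, D3DUSAGE_RENDERTARGET,
                         D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_pSceneTexture, NULL);

// Every frame: draw the sprites into the texture, then push the whole thing
// through your pixel shader on a screen-sized quad.
IDirect3DSurface9 *pBackBuffer = NULL, *pSceneSurface = NULL;
g_pDevice->GetRenderTarget(0, &pBackBuffer);
g_pSceneTexture->GetSurfaceLevel(0, &pSceneSurface);

g_pDevice->SetRenderTarget(0, pSceneSurface);
g_pDevice->Clear(0, NULL, D3DCLEAR_TARGET, 0, 1.0f, 0);
// ... your normal ID3DXSprite Begin/Draw/End calls here ...

g_pDevice->SetRenderTarget(0, pBackBuffer);
g_pDevice->SetTexture(0, g_pSceneTexture);
g_pDevice->SetPixelShader(g_pPostShader);   // your full-screen effect
DrawFullScreenQuad();                       // placeholder: draws a screen-sized textured quad
g_pDevice->SetPixelShader(NULL);

pSceneSurface->Release();
pBackBuffer->Release();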

I'm not sure how per-sprite effects would work, though. Anyway, I think you should try setting the pixel shader at the DrawTransform() call (or whatever the call to render the sprite is; I forget the name).
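For the per-sprite case, something along these lines might work, though I haven't verified how it interacts with ID3DXSprite's batching (g_pSprite, g_pDevice, g_pTexture and g_pTintShader are placeholder names); the Flush() call is there to force the batched quad out while your shader is still bound:

g_pSprite->Begin(D3DXSPRITE_ALPHABLEND);

g_pDevice->SetPixelShader(g_pTintShader);    // per-sprite effect
D3DXVECTOR3 pos(100.0f, 100.0f, 0.0f);
g_pSprite->Draw(g_pTexture, NULL, NULL, &pos, D3DCOLOR_XRGB(255, 255, 255));
g_pSprite->Flush();                          // submit the quad now, while the shader is bound
g_pDevice->SetPixelShader(NULL);

g_pSprite->End();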
Quote: "Do they use geometry to attach the shaders to?"
I believe it's set up as a TL (pre-transformed) quad. I think it's explained in the SDK help files somewhere - I vaguely remember reading an entry about how the pixel coordinates are computed for a sprite.

If it's TL geometry, you obviously can't use vertex shaders.

I can't think of a good reason (or remember seeing one) why you couldn't use pixel shaders, but it is worth noting that ID3DXSprite maintains its own set of states internally and may well disable pixel shaders (I believe it reconfigures alpha blending in some cases).
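If its state setup does get in the way, one thing to try (just a sketch - I haven't checked exactly which states Begin() touches, and g_pSprite, g_pDevice and g_pMyShader are placeholder names) is the D3DXSPRITE_DONOTMODIFY_RENDERSTATE flag, which leaves render state entirely to you:

g_pSprite->Begin(D3DXSPRITE_DONOTMODIFY_RENDERSTATE);

// You now own the render state, so set up the usual alpha blending yourself...
g_pDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
g_pDevice->SetRenderState(D3DRS_SRCBLEND,  D3DBLEND_SRCALPHA);
g_pDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

// ...and your pixel shader should stay bound, since the sprite won't touch it.
g_pDevice->SetPixelShader(g_pMyShader);

// ... Draw() calls ...

g_pSprite->End();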

If you can find it, I think this was discussed on the microsoft.win32.programmer.directx.graphics newsgroup a while back - you might find a good explanation there.

hth
Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

