Render target texture advice for compatibility


Hi! I have a render target texture: I render the scene into it, then draw it as a simple quad. It all works for me, but I'd like some advice on getting the best compatibility with other video cards. I use DX8.

1. I create the target texture at the same size as the display (which is non-power-of-2 and not square). Is this OK?
2. I test the caps for A8R8G8B8 render target textures because I need 24/32-bit color. Should I check for X8R8G8B8 too? Do all video cards support this even if the display is in 16-bit mode?
3. I do not lock the render target texture's surface directly; instead I use CopyRects to copy into a temporary surface created with the same pixel format and the same size. Can that surface be smaller (only as big as the rectangle I need to copy - it works on my GeForce 6600)? It is not a question of speed but of compatibility - that's why I didn't try to lock the render target directly.
4. I lock the temporary surface and extract the rendered bitmap I need.
5. What about the depth buffer? Should I reuse the existing one (from the display), or create a new one specifically for this target surface? Do the sizes need to match (in case I have to make the target texture power-of-2 while the display, and therefore the depth buffer, is not)?
6. Also, something weird I've experienced on a Sys video board: the bitmap I got back after rendering the scene had slightly different colors from what it should have. For example, if I painted something red (0xFF0000), it came out as 0xFE0000. No alpha blending was used. The same happened with alpha: a greater-or-equal alpha test against 255 failed, but succeeded with alpharef = 254! Very weird! Does anyone know about this issue?

Thanks
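For reference, here is a minimal sketch of the workflow described in the question, assuming an already-created IDirect3DDevice8; the names (g_pRTTexture, ReadBackRect, ...) and the D3DFMT_A8R8G8B8 / D3DFMT_D16 choices are placeholders, not the original poster's actual code:

```cpp
// Minimal DX8 render-to-texture + readback sketch (error handling trimmed,
// cleanup on shutdown omitted). pDevice is an already-created IDirect3DDevice8.

#include <d3d8.h>

IDirect3DTexture8* g_pRTTexture = NULL;  // render target texture
IDirect3DSurface8* g_pRTSurface = NULL;  // its top-level surface
IDirect3DSurface8* g_pRTDepth   = NULL;  // depth buffer sized for the target

HRESULT CreateTarget(IDirect3DDevice8* pDevice, UINT w, UINT h)
{
    // One mip level, render-target usage; render targets must live in D3DPOOL_DEFAULT.
    HRESULT hr = pDevice->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET,
                                        D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_pRTTexture);
    if (FAILED(hr)) return hr;
    g_pRTTexture->GetSurfaceLevel(0, &g_pRTSurface);

    // A depth buffer at least as large as the render target.
    return pDevice->CreateDepthStencilSurface(w, h, D3DFMT_D16,
                                              D3DMULTISAMPLE_NONE, &g_pRTDepth);
}

void RenderToTarget(IDirect3DDevice8* pDevice)
{
    IDirect3DSurface8* pOldRT = NULL;
    IDirect3DSurface8* pOldDepth = NULL;
    pDevice->GetRenderTarget(&pOldRT);
    pDevice->GetDepthStencilSurface(&pOldDepth);

    pDevice->SetRenderTarget(g_pRTSurface, g_pRTDepth);
    pDevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0, 1.0f, 0);
    // ... BeginScene() / draw the scene / EndScene() ...

    pDevice->SetRenderTarget(pOldRT, pOldDepth);   // restore the original targets
    if (pOldRT)    pOldRT->Release();
    if (pOldDepth) pOldDepth->Release();
}

// Read back a sub-rectangle: copy it into a system-memory surface sized to the
// rectangle, then lock that surface instead of the render target itself.
HRESULT ReadBackRect(IDirect3DDevice8* pDevice, const RECT& rc)
{
    IDirect3DSurface8* pTemp = NULL;
    UINT w = rc.right - rc.left;
    UINT h = rc.bottom - rc.top;

    // Same pixel format as the render target, but only as big as the rectangle.
    HRESULT hr = pDevice->CreateImageSurface(w, h, D3DFMT_A8R8G8B8, &pTemp);
    if (FAILED(hr)) return hr;

    POINT dst = { 0, 0 };
    hr = pDevice->CopyRects(g_pRTSurface, &rc, 1, pTemp, &dst);
    if (SUCCEEDED(hr))
    {
        D3DLOCKED_RECT lr;
        if (SUCCEEDED(pTemp->LockRect(&lr, NULL, D3DLOCK_READONLY)))
        {
            // lr.pBits / lr.Pitch now hold the pixels of the requested rectangle.
            pTemp->UnlockRect();
        }
    }
    pTemp->Release();
    return hr;
}
```

Copying into a lockable system-memory surface rather than locking the render target directly matches point 3 above: a D3DUSAGE_RENDERTARGET surface in the default pool is generally not lockable.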

1. It depends, as always with Direct3D. At startup, before you create the device, you should check all the caps your program requires; if a card doesn't support something you need, show a message box and quit (see the caps-check sketch after this list). As Dave has already written, not all cards support non-power-of-two textures.
2. It will not hurt to check, as there are some cards that support X8R8G8B8 as a render target but not A8R8G8B8.
5. As long as your render target is no larger than your depth buffer, it will work with the display's depth buffer.
6. Don't expect graphics chips to be bit-exact. I know this can be hard to accept, but it is the truth.
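For points 1 and 2, here is a rough sketch of the kind of startup checks meant above; D3DADAPTER_DEFAULT, the HAL device type, D3DFMT_D16, and the displayFmt parameter are assumptions to adapt to your own setup:

```cpp
// DX8 startup checks: render-target texture format and texture size restrictions.
// pD3D is the IDirect3D8 interface; displayFmt is the current display mode format.

#include <d3d8.h>

// Returns the first usable render-target texture format (A8R8G8B8 preferred,
// X8R8G8B8 as a fallback), or D3DFMT_UNKNOWN if neither is supported.
D3DFORMAT PickRenderTargetFormat(IDirect3D8* pD3D, D3DFORMAT displayFmt)
{
    const D3DFORMAT candidates[] = { D3DFMT_A8R8G8B8, D3DFMT_X8R8G8B8 };
    for (int i = 0; i < 2; ++i)
    {
        // Can this format be used as a render-target texture with the current
        // display mode, and is there a depth format that matches it?
        if (SUCCEEDED(pD3D->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                              displayFmt, D3DUSAGE_RENDERTARGET,
                                              D3DRTYPE_TEXTURE, candidates[i])) &&
            SUCCEEDED(pD3D->CheckDepthStencilMatch(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                                   displayFmt, candidates[i],
                                                   D3DFMT_D16)))
            return candidates[i];
    }
    return D3DFMT_UNKNOWN;
}

// True if the card needs power-of-2 (or square) texture sizes, in which case a
// display-sized, non-square render target texture has to be rounded up.
bool NeedsPow2OrSquareTextures(IDirect3D8* pD3D)
{
    D3DCAPS8 caps;
    if (FAILED(pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return true;  // be conservative if the query fails

    bool pow2Only   = (caps.TextureCaps & D3DPTEXTURECAPS_POW2) &&
                     !(caps.TextureCaps & D3DPTEXTURECAPS_NONPOW2CONDITIONAL);
    bool squareOnly = (caps.TextureCaps & D3DPTEXTURECAPS_SQUAREONLY) != 0;
    return pow2Only || squareOnly;
}
```

Running these checks once at startup, and refusing to create the device (or falling back) when they fail, is the compatibility strategy described in point 1.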

PS: If this is a new project, you should think about using Direct3D 9.

