[Problem] Rendering to a surface renders pixels that should be occluded

Recommended Posts

I am new to post-process effects and am trying to piece together what I have found so far on the internet. When I render the offscreen texture onto the quad, it draws the entire mesh without accounting for the depth buffer. Am I doing things in the wrong order? Pseudocode of my render loop:

1. I sort all my Renderable objects in a priority queue; the ones that will be rendered to the offscreen surface are given the highest priority.
2. Only the player's mesh is flagged to use the offscreen surface, so there is only one such object at this time.

while (objects in queue)
{
  if (object is using the offscreen surface)
  {
    d3ddevice->Clear(0, 0, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0x00000000, 1.0f, 0);
    // ... draw it ...
  }
  else // object is the first one to NOT use the offscreen surface, and all after it
  {
    // ... draw it ...
  }
}

// now having rendered the surface to texture first and then 
// drawn the rest of the scene, I render the textured quad to the screen 
// in screen space

.. draw quad with the offscreen texture on it ..

Anyone see anything outright wrong with the way I am approaching this? Thanks for any input. Brandon

I don't see that you're setting the device's texture via Device.SetTexture, or the render target via Device.SetRenderTarget(int, surface).

Also, you're clearing the device twice and, from what it looks like, to the same surface.

Hope that helps.
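For reference, a minimal sketch of the render-to-texture ordering in Direct3D 9, showing where SetRenderTarget and SetTexture fit relative to the draws. The variable names (pRenderTexture, pRenderSurface, pBackBuffer) and the texture size/format are assumptions for illustration, not the original code:

```cpp
// Setup (once): create a render-target texture and grab its top-level surface.
IDirect3DTexture9* pRenderTexture = NULL;
IDirect3DSurface9* pRenderSurface = NULL;
IDirect3DSurface9* pBackBuffer    = NULL;

d3ddevice->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                         D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT,
                         &pRenderTexture, NULL);
pRenderTexture->GetSurfaceLevel(0, &pRenderSurface);

// Each frame:
d3ddevice->GetRenderTarget(0, &pBackBuffer);   // remember the back buffer
d3ddevice->SetRenderTarget(0, pRenderSurface); // redirect drawing to the texture
d3ddevice->Clear(0, NULL, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
                 0x00000000, 1.0f, 0);
// ... draw the player's mesh here ...

d3ddevice->SetRenderTarget(0, pBackBuffer);    // restore the back buffer
pBackBuffer->Release();
// ... draw the rest of the scene ...

d3ddevice->SetTexture(0, pRenderTexture);      // bind the offscreen texture
// ... draw the screen-space quad ...
```

The key ordering point is that SetRenderTarget must be called before the offscreen draw (and the back buffer restored afterwards), and SetTexture must be called before the quad is drawn.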

