TheKrust

render targets (2 questions, answer whichever you feel like)


#1 Regarding anti-aliasing: when I draw a polygon onto the screen in my engine (using the back buffer), it comes out very smooth, provided I'm at the LCD's native resolution. However, I need to use render targets so I can apply post-process effects to the screen. The problem I'm seeing is that polygons, when drawn to a render target, have very pixelated edges. Textures don't seem as affected (maybe that's just the absence of sharp lines, though). So I'm wondering what I can do to get the same quality on a render target as I get on the back buffer.

---------------------------------------------------------------

#2 I know that the max size of a render target is dictated by the size of the screen or window you're drawing on. Obviously, I need a target that covers the whole screen, but if I make the target larger than 640 x 530 on an 800 x 600 screen, it goes completely black and nothing draws. Is there a solution or an alternative technique I should use?

#1: If you want offscreen MSAA render targets and you want them to work with post-processing effects, the only way I know of to get that to work (under D3D9) is to create a surface with Device::CreateRenderTarget() (passing in the appropriate flags), and then call Device::StretchRect() to copy the surface to a texture (well, actually some surface of a texture).
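Roughly like this; a minimal sketch with error checking omitted. The names device, width, and height are assumed to exist, and the format and sample count are placeholders (check support with CheckDeviceMultiSampleType):

```cpp
IDirect3DSurface9* msaaSurface = NULL;
IDirect3DTexture9* resolveTexture = NULL;
IDirect3DSurface9* resolveSurface = NULL;

// Multisampled color surface (D3D9 textures themselves can't be MSAA).
// The depth buffer you pair with it must use the same multisample settings.
device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                           D3DMULTISAMPLE_4_SAMPLES, 0,
                           FALSE /* not lockable */,
                           &msaaSurface, NULL);

// Plain texture to resolve into, usable by the post-processing passes.
device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                      D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT,
                      &resolveTexture, NULL);
resolveTexture->GetSurfaceLevel(0, &resolveSurface);

// Render the scene into the multisampled surface...
device->SetRenderTarget(0, msaaSurface);
// ... draw ...

// ...then downsample it into the texture's top-level surface.
device->StretchRect(msaaSurface, NULL, resolveSurface, NULL, D3DTEXF_NONE);
```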

I use that approach for my post-processing effects and it works pretty well. The NVidia programming guide has a page about post-processing with MSAA under D3D9 if my explanation wasn't too clear.

Render target sizes are not limited by the screen size. Your render target likely needs an appropriately sized depth buffer to go along with it. CreateDepthStencilSurface will make what you need, and SetDepthStencilSurface will activate it. You'll likely want to grab the device's own depth surface with GetDepthStencilSurface when you create or reset the device, and set it active again when you switch back to rendering to the back buffer. You probably also want to call SetViewport after setting your render target surface.
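Something along these lines; a sketch only, where device, rtSurface, rtWidth, and rtHeight are assumed and the depth format is a placeholder:

```cpp
// Keep the device's original depth surface around so it can be restored.
IDirect3DSurface9* backBufferDepth = NULL;
device->GetDepthStencilSurface(&backBufferDepth);

// Create a depth buffer sized to match the render target (and matching
// its multisample settings, if it has any).
IDirect3DSurface9* rtDepth = NULL;
device->CreateDepthStencilSurface(rtWidth, rtHeight, D3DFMT_D24S8,
                                  D3DMULTISAMPLE_NONE, 0,
                                  TRUE /* discard */,
                                  &rtDepth, NULL);

// Switch to the offscreen target.
device->SetRenderTarget(0, rtSurface);
device->SetDepthStencilSurface(rtDepth);

// Don't rely on the implicit viewport reset.
D3DVIEWPORT9 vp = { 0, 0, rtWidth, rtHeight, 0.0f, 1.0f };
device->SetViewport(&vp);

// ... draw into the render target ...

// Restore the back buffer's depth surface when switching back.
device->SetDepthStencilSurface(backBufferDepth);
```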

Quote:
You probably also want to SetViewport after setting your render target surface.


Slightly OT, but is this really necessary? I never do this, and everything works fine. Also, the SDK docs say:

(from the "IDirect3DDevice9::SetRenderTarget" page)
Quote:
Setting a new render target will cause the viewport (see Viewports and Clipping (Direct3D 9)) to be set to the full size of the new render target.


I've seen some source code that always calls SetViewport(), so I was wondering if there was a good reason. The docs can be wrong sometimes, so I'm not sure whether I'm relying on unsupported behavior. Either way, I suppose it can't hurt.

That's not in my SDK help (August 2006, because we require compatibility with ancient OSes and ps.1.1); however, I see it has been re-added to the latest help online. The last time I saw that note in the SDK help was DirectX 8.1. I was surprised when it disappeared, but it definitely went away. I figured it was dropped because of MRT and having to specify a render target index. Whatever the help claims these days, I'm not sure what setting a render target does to your viewport, but it definitely can break it.

We had a game in development that wasn't working right, and we tracked it down to a missing SetViewport call. Because some options were off, a pass was skipped, which resulted in the following flow:

1. Render target 1 is set.
2. The viewport is set to the full surface.
3. (Code is skipped because the option is off.)
4. Render target 1 is set again.

At this point, rendering didn't work. If D3D9 or the driver had left the viewport alone, all would have been fine. If it had set the viewport to the full surface, all would have been fine. Unfortunately, it did *something* unknown to the viewport. So yes, it can break. Whether it's a D3D9 bug, an nVidia driver bug, or whatever, ALWAYS SetViewport after setting your render target.
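In other words, a defensive wrapper along these lines (the function name is mine, just illustrating the pattern):

```cpp
// Hypothetical helper: set a render target and immediately force the
// viewport to match it, instead of trusting the implicit reset.
void SetRenderTargetSafe(IDirect3DDevice9* device, IDirect3DSurface9* target)
{
    device->SetRenderTarget(0, target);

    // Query the surface so the viewport always matches it exactly.
    D3DSURFACE_DESC desc;
    target->GetDesc(&desc);

    D3DVIEWPORT9 vp = { 0, 0, desc.Width, desc.Height, 0.0f, 1.0f };
    device->SetViewport(&vp);
}
```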
