Render target and back buffer size

Started by
8 comments, last by renman29 17 years ago
I was trying to help a friend with a render target issue. I don't have access to the code though. The problem is this: he could create a big render target (texture), 1024x1024, but a rectangle of only 1024x768 would draw. The back buffer was also 1024x768! Coincidence? Maybe? The viewport was being set correctly, and it worked fine when using a render target of 768x768. Changing the back buffer size seemed to support my theory that there was some correlation.

A possibly significant bit of information is that he was using a Radeon 9800 Pro. I actually remember hearing a long time ago that for some hardware you shouldn't use render targets bigger than your back buffer, but I don't think I ever caught why at the time.

Anyone know anything about this? Any suggestions as to how to solve the problem? I was thinking maybe the problem is only for render targets that are also textures, and maybe he could render to a render target surface and copy it to a texture if his hardware didn't support rendering to bigger render targets.
Was he using a Z buffer? Did he create one large enough for the render target, or use the one created for the back buffer?
That's a good point! I didn't think that he might just be using the back buffer's z buffer. I kind of thought your back buffer was required to be the same size as your render target or it would crash or something silly like that.
The render target and Z buffer must match in size (or the Z buffer must be bigger), it'll usually result in failed draw calls and a lot of debug spew if they don't match.

I think the restriction on textures bigger than the backbuffer is a relic from the good old DirectDraw days, where there was a specific capability bit for supporting 'wide surfaces'. Now the restrictions are expressed as maximum width/height and potential rules about dimensions being powers of two (2^n).

I had a Radeon 9800 for several years and had no problems with this sort of thing - I used render targets up to 2048x2048 (largest it'd support iirc) in several projects.

Check the debug spew if you can, make sure you're checking return codes - something might be silently failing. Run it against the reference rasterizer.

hth
Jack

Jack Hoxley

You must call SetViewport after setting the render target. And you must ALWAYS set it. In DX8 the viewport would default to the entire render target. In DX9 it defaults to *something* - not the full target, and not what it was previously set to. I had a render target set, with the viewport set to cover the full surface. There was another call to set the render target to the exact same surface it was already set to (an optional middle step was disabled). Rendering failed. So, even though it was the exact same surface, the viewport settings were toasted.

And about the Z must be the same size or larger thing... that's what the D3D debug spew says, but I've never seen it work when the sizes were mismatched with Z being larger. On the XBox you can view memory and see how various operations (Clear vs. Draw) make different assumptions about how to treat a bigger Z buffer. These mismatched assumptions cause it to fail. I can only guess the reason I've seen it fail on PC is identical. It might be driver specific, but whatever the problem, I've learned to not trust mismatched targets and Zs.
Yeah, the viewport was the first thing I thought of, and it wasn't the issue. It worked fine with smaller textures.

I kind of assumed my friend had a separate depth buffer for the render target but that wasn't the case. I'm pretty sure it will work once he sets that up.
Make sure you are really selecting your render target and new depth buffer before you start rendering. Also, try setting mip-levels to 1 and/or disable mipmapping filter. I myself have used 1024x1024 render targets with screen resolution of 1024x768 without problems.

Also, I haven't actually updated viewport when switching render targets (heck, I think I've never ever changed it!) and the rendering worked properly both on the screen and render targets.
Hmm. I have the same problem. Strange - I've tried every combination suggested so far and no go. The weird thing is that even if my back buffer (and depth buffer) is set to a larger resolution (e.g. 1280x1024), it still won't let me use render targets larger than 768x768. I'm using a GeForce 7300. If I switch to the onboard 6100, it still does it.
I bet I'm doing something wrong in my code - I have to go to work right now - if anyone has any ideas - more than happy to hear it. I'll be back Saturday - if I can't figure it out by then, I'll try posting some code - and maybe we can compare notes that way?
I'm probably just doing something wrong. Thanks for the suggestions. :)
I think I understand how to solve this problem now. So what should I do to create a new, bigger depth buffer? Is that done by setting a new viewport (using SetViewport)?

Thanks
Hah! It works now! This is one strange voodoo ritual...

I think the key was to make sure the back buffer and depth buffer matched, and then to make sure the depth buffer matched the new render target. (I tried using one that was too big before.)
This is the combination I used to get it working:
g_proto_textures = new LPDIRECT3DTEXTURE9[n_objects];

// a = 0;
// do {
D3DXCreateTexture(g_gpu,
    g_texture_width, g_texture_height,
    1, D3DUSAGE_RENDERTARGET,
    g_common_format, D3DPOOL_DEFAULT,
    &g_proto_textures[a]);

// ...
// ......bla bla bla.....
// ...

g_gpu->SetRenderState(D3DRS_ZENABLE, D3DZB_FALSE);
g_gpu->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);

D3DVIEWPORT9 vp;
LPDIRECT3DSURFACE9 surface     = NULL;
LPDIRECT3DSURFACE9 new_zbuffer = NULL;

g_proto_textures[a]->GetSurfaceLevel(0, &surface);

// create a depth buffer that matches the render target's size
g_gpu->CreateDepthStencilSurface(g_texture_width,
                                 g_texture_height,
                                 D3DFMT_D16,
                                 D3DMULTISAMPLE_NONE,
                                 0,
                                 TRUE,
                                 &new_zbuffer,
                                 NULL);
g_gpu->SetDepthStencilSurface(new_zbuffer);

vp.X      = 0;
vp.Y      = 0;
vp.Width  = g_texture_width;
vp.Height = g_texture_height;
vp.MinZ   = 0.0f;
vp.MaxZ   = 1.0f;

g_gpu->SetRenderTarget(0, surface);
g_gpu->SetViewport(&vp);   // set the viewport AFTER the render target

g_gpu->Clear(0, NULL, D3DCLEAR_TARGET,  D3DCOLOR_ARGB(0,0,0,0), 1.0f, 0);
g_gpu->Clear(0, NULL, D3DCLEAR_ZBUFFER, D3DCOLOR_ARGB(0,0,0,0), 1.0f, 0);

g_gpu->BeginScene();
g_gpu->SetFVF(TRANSFORMED_VERTEX);
g_gpu->SetStreamSource(0, vb3, 0, sizeof(stTransformedVertex));
g_gpu->SetTexture(0, NULL);
g_gpu->SetMaterial(&g_materials[object->MaterialRef]);
if (g_textures[material_index] != NULL)
    g_gpu->SetTexture(0, g_textures[material_index]);
g_gpu->DrawPrimitive(D3DPT_TRIANGLELIST,
                     object->vb_offset_m, object->NumFaces);
g_gpu->EndScene();

g_gpu->Present(NULL, NULL, NULL, NULL);

SAFE_RELEASE(surface);
SAFE_RELEASE(new_zbuffer);

// restore the back buffer and its original depth buffer
g_gpu->SetDepthStencilSurface(g_zbuffer);
LPDIRECT3DSURFACE9 BackBuf = NULL;
g_gpu->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &BackBuf);
vp.Width  = g_display_width;
vp.Height = g_display_height;
g_gpu->SetRenderTarget(0, BackBuf);
g_gpu->SetViewport(&vp);   // again, viewport AFTER the render target
SAFE_RELEASE(BackBuf);
g_gpu->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
g_gpu->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);


(btw - any advice on this code is welcome)
This seems to work fine. I thank everyone very much! :)

This topic is closed to new replies.
