D3D9 render texture / fullscreen-quad issue (3x 800x600 embedded images; cellphone/tablet users beware of potential data-plan usage)

3 comments, last by StakFallT 10 years, 6 months ago

I'm running into problems with render textures; the question I have can be summed up very briefly... what can cause this?

(I hope these images aren't too big... I had ImageShack not resize them down, because I wanted the texture on the cube to show, so that no one thinks it's solid black (i.e. no texture, bad material, no lighting, etc.). Figuring most people are on broadband of some kind, I'm hoping it's not much of an issue; if it is, I'll have ImageShack resize them down. I could always attach them, but then people can, and rightfully so, get squeamish about downloading attachments of any kind from any place and opening them. Thumbnails would be fine, I guess, except they tend to be so tiny you don't really know which one you want to open.)

Screen cap of what I'm looking for:

4p83.jpg

Screen cap of what I actually see (Above was with modified code that didn't render the full-screen quad):

4f3z.jpg

Hmmm... the texture is plastered right up in my face and fully lit... I thought the point of rendering "elements" (models, etc.) to a texture was that the texture looks -exactly- as the rendered elements would if you weren't using render textures -- i.e. isolating rendered scene pieces that are eventually sent to the screen?

Screencap in wireframe (to show the relation -- or rather the lack of one -- between the cube and the plastered, in-your-face texture):

3fxx.jpg

I understand the question I asked is a fairly broad question; I figured I'd ask a higher level question, and I'd either get A) An answer, B) A question (or questions) as a reply, or C) A combination of both. Which is fine... just figured I'd kick it off with a short and sweet type of question and work from there. Thanks in advance!

BTW, even though hardware-based shaders and vertex processing are enabled, I have the shaders disabled on those models. The title bar just indicates that the feature is enabled and that, if applicable, it'll be used.

--StakFallT

EDIT 1: Couple of spelling mistakes (find -> fine, light -> lit)


I'm not really clear on what you are asking...?

Are you happy with the dark textured cube? It sounds like you are saying that the cube is correct like that? I can barely see that there is some texture on it, so I would say you are missing some ambient adjustment, because those faces are on the side opposite the light, or the cube's normals are messed up.
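A minimal sketch of the ambient adjustment being suggested, assuming the fixed-function pipeline, an IDirect3DDevice9* named `device`, and a material setup of my own invention (not the poster's actual code):

```cpp
// Raise the global ambient term so faces turned away from the light
// stay visible instead of going fully black.
device->SetRenderState(D3DRS_AMBIENT, D3DCOLOR_XRGB(60, 60, 60));

// The material's ambient reflectance must also be non-zero, or the
// D3DRS_AMBIENT render state has no visible effect on the mesh.
D3DMATERIAL9 mtl = {};
mtl.Diffuse = { 1.0f, 1.0f, 1.0f, 1.0f };
mtl.Ambient = { 1.0f, 1.0f, 1.0f, 1.0f };
device->SetMaterial(&mtl);
```

Both halves matter: the fixed-function lighting equation multiplies the global ambient color by the material's ambient reflectance, so leaving either at zero keeps unlit faces black.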

The cube mesh is a little overly complicated for a cube, although that shouldn't really matter if all you're rendering is that cube (assuming the normals and lighting are correct).

As to why you got an in-your-face texture, I would guess that you rendered the quad with the cube's material still bound, not the render target you are meaning to display.

If you can see the cube when you don't render the quad, then you rendered the cube to the back buffer, not to a separate render target/texture. If you had rendered it to a separate texture and then not pasted the image back via the fullscreen quad, the screen would be blank (clear color).
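A sketch of the frame flow being described, assuming a render-target texture `renderTex` created with D3DUSAGE_RENDERTARGET, an IDirect3DDevice9* `device`, and hypothetical DrawCube()/DrawQuad() helpers standing in for the poster's model and fullscreen-quad draws (BeginScene/EndScene omitted for brevity):

```cpp
IDirect3DSurface9 *backBuffer = nullptr, *texSurface = nullptr;
device->GetRenderTarget(0, &backBuffer);   // remember the back buffer
renderTex->GetSurfaceLevel(0, &texSurface);

// Pass 1: draw the scene into the texture, not the back buffer.
device->SetRenderTarget(0, texSurface);
device->Clear(0, nullptr, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
              D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
DrawCube();

// Pass 2: restore the back buffer and paste the texture via the quad.
device->SetRenderTarget(0, backBuffer);
device->Clear(0, nullptr, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER,
              D3DCOLOR_XRGB(0, 0, 0), 1.0f, 0);
device->SetTexture(0, renderTex);          // stage 0 for a plain paste
DrawQuad();

texSurface->Release();
backBuffer->Release();
```

If pass 1 draws straight to the back buffer instead of `texSurface`, you see the cube directly and the quad shows whatever texture happened to be bound, which matches the symptoms in the screenshots.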

Well, yeah, in a way I'm happy with the dark cube -- it's textured, it's lit (though, yeah, I agree it's way too dark; I've got to play with it a bit -- probably, like you said, the ambient stuff, or the normals, but it's lit), it's scaled, and it's positioned in the world properly.

The cube is deliberately overly complex: rather than having to keep playing with the scale in Maya, I just made a more complex one and use the scaling features in my engine. Scaling up a cube with less detail makes the spatial interpolation of lighting coarser. In other words, if the vertices, using the normals specified, are where D3D internally evaluates the actual lighting model, then the dispersal of light from one vertex to another can cause you to "run out of light" before it hits the next vert, due to the smooth linear falloff of the light from one vert to another (even though that shouldn't occur, it seems to; I've seen an article somewhere along the line that actually mentioned the same thing). The quick and dirty solution for testing was to just add more vertices ;)

That's strange; it shouldn't be. I actually do set the texture of the quad to the render texture before calling DrawPrimitive (yes, I know, DrawIndexedPrimitive would be faster).

Well, in the screenshot where both are rendered, I have the model rendering twice: once into the texture, and again to the back buffer, just so I can flip on wireframe and demonstrate that the image you're seeing (the one with the in-your-face texture) is not what I think it should be.

Currently the render texture is being set via the device's SetRenderTarget() call with the first parameter (DWORD RenderTargetIndex) being 0, and the texture stage for the quad being 1. (I had it at 0, but then I just got an empty screen -- unless I rendered the model outside of the render-texture-building code block, that is.)
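For what it's worth, with the fixed-function pipeline a plain texture paste normally uses stage 0 (the default stage states sample stage 0 and disable stage 1), and a fullscreen quad is usually built from pre-transformed vertices so it always covers the viewport regardless of the world/view/projection matrices. A sketch, where QuadVertex and BuildFullscreenQuad are names of my own (the FVF for this layout would be D3DFVF_XYZRHW | D3DFVF_TEX1):

```cpp
// Pre-transformed (screen-space) vertex for a fullscreen quad.
// With D3DFVF_XYZRHW, D3D skips the transform pipeline entirely.
struct QuadVertex {
    float x, y, z, rhw;   // screen-space position
    float u, v;           // texture coordinates
};

// Builds a quad covering a backbuffer of the given size, applying the
// classic D3D9 half-pixel offset so texels map 1:1 onto pixels.
// Vertex order is suitable for a two-triangle D3DPT_TRIANGLESTRIP.
void BuildFullscreenQuad(QuadVertex out[4], float width, float height) {
    const float x0 = -0.5f,         y0 = -0.5f;
    const float x1 = width - 0.5f,  y1 = height - 0.5f;
    out[0] = { x0, y0, 0.0f, 1.0f, 0.0f, 0.0f }; // top-left
    out[1] = { x1, y0, 0.0f, 1.0f, 1.0f, 0.0f }; // top-right
    out[2] = { x0, y1, 0.0f, 1.0f, 0.0f, 1.0f }; // bottom-left
    out[3] = { x1, y1, 0.0f, 1.0f, 1.0f, 1.0f }; // bottom-right
}
```

If the quad in the engine is instead built from untransformed world-space vertices, its on-screen size depends entirely on the current matrices, which is one way to end up with geometry plastered across the camera.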

Are you looking at all of the HRESULT codes returned from the DX functions? Could be something hasn't been created to the satisfaction of DX.

Also, run on the debug runtime (selected in the DirectX Control Panel for D3D9) and DX will spew warnings for a lot of things. This can be really helpful.
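The HRESULT-checking habit being recommended can be sketched with a hypothetical CHECK_HR macro (D3D9 doesn't ship one; FAILED() is the standard Windows test for error HRESULTs):

```cpp
// Log any failing D3D call to the debugger output window, with the
// offending expression stringized into the message.
#define CHECK_HR(call)                                              \
    do {                                                            \
        HRESULT hr_ = (call);                                       \
        if (FAILED(hr_)) {                                          \
            OutputDebugStringA("D3D call failed: " #call "\n");     \
        }                                                           \
    } while (0)

// Usage:
CHECK_HR(device->SetRenderTarget(0, texSurface));
CHECK_HR(device->SetTexture(0, renderTex));
```

Wrapping every D3D call this way makes a silently-failing SetRenderTarget or CreateTexture show up immediately instead of manifesting as a mystery frame.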

You also might want to look at one of the graphics debuggers that are available: you can pause your app and analyze how a frame is being constructed. If you're using VS2012, there's one built in now. PIX works if you're on an older API (I don't think it works with DX11). NSight, if you have an NVIDIA card, is really nice. I forget what AMD's version is called, but they have one too.

Personally, I would start with the debug output if you don't have it on.

Well, I have D3D's debug settings cranked to full on everything I could find in the DirectX Control Panel; I have the runtime set to debug in the project settings, and I think there's even a preprocessor define I have set for D3D debugging. Most things that do error do display in the output window, so while I'm not completely sure every D3D call has its returned HRESULT checked, I'm very confident that anything that does error will be displayed. So far the only things I really see are redundant states being set, and that's just a matter of finalizing my render state manager to check for previously set state values so it doesn't set them again. None of those states should cause something like a texture splatted right up against the camera lens like that.

As far as PIX goes, yeah, I tried that at a couple of points when I was implementing my shader managers. Apparently -- at least I think this might be it -- VMware (I know... I know... don't say it; I don't have easy access to a pure Windows machine, at least not one I trust an unreleased engine app on) doesn't behave well with it or something; every time I tried PIX I ran into a problem. Trying to recall the problem from memory: it would hang the app, yield no results, and keep the process open. At that point I would give up on PIX and go back to static analysis, try some stuff, and then go to run it again in MSVC, and that would error since the app executable was still open from the PIX hang-up, and trying to close it in Task Manager didn't close it.

I'm using MSVC 2008 and running on an ATI card, so the built-in VS2012 stuff and NSight unfortunately aren't possible options... I'm thinking the answer is some high-level one, like "Oh, you didn't set the projection (or the view) matrix equal to <blah>" or something, as I think if there were something seriously wrong, I wouldn't even get a texture, or worse yet, the thing would just crash. I'm just not seeing it in the code... and too many days of accumulated time looking over the code causes one to get tunnel vision as far as scrutiny over the logic in each of the routines goes.

This topic is closed to new replies.
