MARS_999

ARB_shadow and FBO's


I am under the assumption that whether I use FBOs or glCopyTexImage2D(), I still need the ARB_shadow compare functions for my shadowmaps, or my values will be in the 0-1 range rather than 0 or 1. The problem: with the ARB_shadow setup in place, I get nothing when I render the depthmap to inspect the values, but if I take it out I get a gradient in the rendered depthmap. Do I need to reset some state on that texture object after rendering to it, before using it again as a texture to display the results? Thanks

If you want to display the shadowmap for debugging (just to see the gradient), then you have to set COMPARE_MODE to NONE. If you want to use the shadowmap for shadows (projected from the light source), you have to set COMPARE_MODE to COMPARE_R_TO_TEXTURE. If you leave COMPARE_R_TO_TEXTURE on while displaying the shadowmap for debugging (just a 2D quad on the screen), you won't get correct results, because the texture unit returns comparison results instead of depth values.
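In code, the toggle is just a texture parameter on the depth texture. A minimal sketch (the `shadow_tex` name is a placeholder):

```c
/* Using the shadowmap for shadows: comparisons on, lookups return 0 or 1 */
glBindTexture(GL_TEXTURE_2D, shadow_tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB,
                GL_COMPARE_R_TO_TEXTURE_ARB);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_FUNC_ARB, GL_LEQUAL);

/* Displaying it for debugging: comparisons off, lookups return raw depth */
glBindTexture(GL_TEXTURE_2D, shadow_tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB, GL_NONE);
```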

HellRaiZer

EDIT: I don't think this has anything to do with FBOs or the way you create the shadowmap.

Well, with FBOs on NVIDIA at least: if I use the ARB_shadow functions I can't see anything when drawing the texture on a 2D quad to check the values, yet it renders a true/false image across my entire terrain just fine. If I don't use them, I get a 0-1 range for my shadows depending on the distance from the light source. I need both behaviors: the 0-1 depth values for my water, and 0-or-1 values for the shadowmapping. Can anyone help me figure out why this happens, and what I need to do to display the image when the ARB_shadow functions are returning 0 or 1? Thanks

Update: the behavior is the same with glCopyTexImage2D() as well...


BTW, I know you need to bind your FBO when you want to draw to or read from it, but let's say you just want to use the texture to display on a polygon. Right now I am not binding the FBO, just calling glBindTexture() with the texture object from my FBO, and it works. Is this correct behavior for displaying RTT textures from an FBO? If not, please correct me. Thanks again. ;)

[Edited by - MARS_999 on April 21, 2006 7:34:07 PM]

Quote:

BTW, I know you need to bind your FBO when you want to draw to or read from it, but let's say you just want to use the texture to display on a polygon. Right now I am not binding the FBO, just calling glBindTexture() with the texture object from my FBO, and it works. Is this correct behavior for displaying RTT textures from an FBO?

Yes. Think of it the other way around: when you bind a framebuffer object, all rendering commands are sent to that framebuffer. So whenever you render anything, just have the correct framebuffer bound and you will be fine. The one thing to avoid is rendering into the FBO the texture is still attached to while you are sampling from that texture, because reading and writing the same buffer at the same time can give undefined results; I don't know how drivers handle that.
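That pattern might look like this with EXT_framebuffer_object (a sketch; `shadow_fbo`, `shadow_tex`, and the render helpers are placeholder names):

```c
/* Pass 1: render into the FBO the depth texture is attached to */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, shadow_fbo);
RenderSceneFromLight();                      /* fills shadow_tex */

/* Back to the window-system framebuffer */
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);

/* The texture is no longer a render target, so a plain bind is all
 * that's needed to sample it -- no FBO bind required just to read it */
glBindTexture(GL_TEXTURE_2D, shadow_tex);
DrawTexturedQuad();
```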

I haven't played with shadowmapping for a while, so I don't know if something has changed, but I would expect the following pseudocode to work:


// Render to shadowmap
BindFrameBuffer(ShadowmapFBO);
AttachDepthTexture(ShadowmapTexture);
RenderScene();

// Render to screen
BindFrameBuffer(0);

// Render your scene with the shadowmap projected on it
BindTexture(ShadowmapTexture);
SetCompareMode(COMPARE_R_TO_TEXTURE);
SetupTextureMatrix();
RenderScene();
UnsetupTextureMatrix();

// Render your shadowmap on a 2D quad for debugging
BindTexture(ShadowmapTexture);
SetCompareMode(NONE);
RenderQuad2D();



What I'm trying to say is that if you want to use the shadowmap both for the debugging display and for normal use, you have to set the compare mode every time you are about to use it.

One last thing: when you are displaying the shadowmap for debugging, try not to use a fragment program, because I don't know how it will handle the depth compare. I'd expect it to work according to the compare mode, but I'm not sure.

Hope that helps.

HellRaiZer

Hmm, this all sounds like what I am already doing, except I am using GLSL to sample the shadowmap, and that part renders fine. If I don't use the ARB_shadow functions with the FBO, I get a depth map that looks like a gradient greyscale texture with values in 0-1; if I do use them, I get a texture that maps the values to either 0 or 1. Now the funny thing is, on my NVIDIA card (I don't have an ATI card), when I use the ARB_shadow functions and set the compare mode to GL_NONE to render to a quad, I can't see any gradient at all, just all-white values, yet the shadowmap still works fine when used in GLSL. The behavior is the same for FBOs and glCopyTexImage2D(). I am just wondering if this is correct or if I am missing something. Thanks

Quote:

Now the funny thing is, on my NVIDIA card (I don't have an ATI card), when I use the ARB_shadow functions and set the compare mode to GL_NONE to render to a quad, I can't see any gradient at all, just all-white values, yet the shadowmap still works fine when used in GLSL.


Sorry for my bad English; I finally understood your problem.
Are you using LUMINANCE or ALPHA for DEPTH_TEXTURE_MODE?
If you are using ALPHA, I think this is normal: the depth value goes into the alpha channel and the RGB channels stay constant, so a plain textured quad shows no gradient.
If you are using LUMINANCE, then I suspect there is something wrong with the whole thing. As you say, if you can see a clear gradient when not using the ARB_shadow compare functions, then something must be off when they are enabled.
Sorry, but I don't know what it may be. Maybe nothing; I'd expect it to work the same as when you don't use ARB_shadow.
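For the debug view, LUMINANCE puts the depth value into the RGB channels where you can see it. A sketch (the `shadow_tex` name is a placeholder):

```c
/* Make depth lookups land in RGB so the gradient is visible,
 * and turn comparisons off for the debug quad */
glBindTexture(GL_TEXTURE_2D, shadow_tex);
glTexParameteri(GL_TEXTURE_2D, GL_DEPTH_TEXTURE_MODE_ARB, GL_LUMINANCE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_COMPARE_MODE_ARB, GL_NONE);
```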

Maybe someone else can help you better than me.

HellRaiZer

What happens if you scale the depth values to the [0, 1] range by hand? Do you see a gradient then?

What I mean is: read the values back to a buffer (glReadPixels() from the FBO, or glGetTexImage() from the texture), find the min and max, and scale all the values so the min maps to 0 and the max to 1.

If you can see the gradient after the scaling, then either the depth values are simply packed into a narrow range near 1.0 (which would display as all white), or there is a driver problem in the depth-to-color conversion. I don't remember what the specs say about this (depth values to color), but I think you should try it, just to rule out this possibility.

HellRaiZer
