
happymonkey

Member Since 15 Jul 2012
Offline Last Active Sep 06 2012 03:09 PM

Topics I've Started

depth test and fbo

05 September 2012 - 05:12 PM

I am in the process of implementing early ray termination using multiple iterations. My code looks like this:
glClearDepth(1.0); //Set all the depth to the maximum
glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex_dst, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, _vol_buffer_tex_mult_depth, 0);
glClear(GL_DEPTH_BUFFER_BIT);
glDepthMask(GL_TRUE);
glDepthFunc(GL_ALWAYS);
///////////////////////Shader to change the depth, draw a quad
glBindFramebuffer(GL_FRAMEBUFFER, 0);

I use glClearDepth(1.0) together with glDepthFunc(GL_ALWAYS) to make the shader update the depth buffer, and it works correctly. The problem is that when I change glClearDepth(1.0) to glClearDepth(0.0), nothing is displayed anymore. It seems that in this case the first pass fails the depth test. Why? I already use glDepthFunc(GL_ALWAYS), which should pass the test every time. Thanks for any hint.

Depth Test and Early Ray Termination

05 September 2012 - 02:41 PM

I am in the process of implementing early ray termination using multiple iterations. My code looks like this:

glClearDepth(1.0); //Set all the depth to the maximum
glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex_dst, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, _vol_buffer_tex_mult_depth, 0);
glClear(GL_DEPTH_BUFFER_BIT);
glDepthMask(GL_TRUE);
glDepthFunc(GL_ALWAYS);
///////////////////////Shader to change the depth, draw a quad
glBindFramebuffer(GL_FRAMEBUFFER, 0);

///////////////////////////////////////////////////
glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, tex_dst, 0);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_TEXTURE_2D, _vol_buffer_tex_mult_depth, 0);
glDepthMask(GL_FALSE); //Do not change the depth value
glDepthFunc(GL_LEQUAL);
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

/////////////////////Shader to change the color, draw a quad
glBindFramebuffer(GL_FRAMEBUFFER, 0);


I use glClearDepth(1.0) together with glDepthFunc(GL_ALWAYS) to make the shader update the depth buffer, and it works correctly. The problem is that when I change glClearDepth(1.0) to glClearDepth(0.0), nothing is displayed anymore. It seems that in this case the first pass fails the depth test. Why? I already use glDepthFunc(GL_ALWAYS), which should pass the test every time. Thanks for any hint.

Negative Value in Texture using GL_SHORT

15 July 2012 - 11:15 AM

I have a question about the "type" parameter of glTexImage2D. My original texture data is "short" and contains negative values, so I used GL_SHORT first. However, the negative values seem to be clamped to zero in the shader. My workaround is to first map the "short" data to "unsigned short", then map the values back inside the shader. This works.

My question is:

If GL_SHORT can only represent values in [0, 32767] because the negative values are clamped, then GL_UNSIGNED_SHORT with its range of [0, 65535] would be the better choice in every case, right? And the same would apply to GL_INT versus GL_UNSIGNED_INT.

Is there a way to pass negative values into the shader without first mapping them to positive ones?

Thanks a lot.
