
• By lxjk
Hi guys,
There are many ways to do light culling in tile-based shading. I've been playing with this idea for a while, and just want to throw it out there.
Because tile frustums are generally small compared to the light radius, I tried using a cone test to reduce the false positives introduced by the commonly used sphere-frustum test.
On top of that, I use distance to camera rather than depth for the near/far test (a.k.a. sliced by spheres).
This method can be naturally extended to clustered light culling as well.
The following image shows the general idea:

Performance-wise I get around a 15% improvement over the sphere-frustum test. You can also see how a single light performs below, from left to right: (1) standard rendering of a point light; then the tiles that passed (2) the sphere-frustum test; (3) the cone test; (4) the spherical-sliced cone test.

I put the details in my blog post (https://lxjk.github.io/2018/03/25/Improve-Tile-based-Light-Culling-with-Spherical-sliced-Cone.html), GLSL source code included!

Eric

• Good evening everyone!

I was wondering if there is an equivalent of GL_NV_blend_equation_advanced for AMD?
Basically, I'm trying to find a more compatible version of it.

Thank you!

• Hello guys,

Why does my Wavefront model not show for me, and how do I find out what's wrong? I've already checked, and I get no errors.
My download (mega.nz) should be the original file, but I tried it with no success.
I've added the .blend source and the PNG file here; I have tried and tried...

PS: Why is our community not active? I've been waiting a very long time, please don't ignore me!
Thanks!

• I wasn't sure if this would be the right place for a topic like this so sorry if it isn't.
I'm currently working on a project for Uni using FreeGLUT to make a simple solar system simulation. I've got to the point where I've implemented all the planets and have used a scene graph to link them all together. The issue I'm having now is getting the planets and moons to orbit correctly at their own orbit speeds.
I'm not really experienced with using matrices for stuff like this, so that's likely why I can't figure out how exactly to get it working. This is where I'm applying the transformation matrices, as well as pushing and popping them. This is within the Render function that every planet, including the sun and moons, will have and run.
if (tag != "Sun") {
    glRotatef(orbitAngle, orbitRotation.X, orbitRotation.Y, orbitRotation.Z);
}
glPushMatrix();
glTranslatef(position.X, position.Y, position.Z);
glRotatef(rotationAngle, rotation.X, rotation.Y, rotation.Z);
glScalef(scale.X, scale.Y, scale.Z);
glDrawElements(GL_TRIANGLES, mesh->indiceCount, GL_UNSIGNED_SHORT, mesh->indices);
if (tag != "Sun") {
    glPopMatrix();
}
The if (tag != "Sun") parts are my attempt at getting the planets to orbit correctly, though it's likely not the way I'm meant to be doing it. So I was wondering if someone would be able to help me, as I really don't have an idea of what to do to get it working. Using the if statement is truthfully the closest I've got to it working, but there are still weird effects, like the planets orbiting faster than they should depending on the number of planets actually being updated/rendered.
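For what it's worth, the usual scene-graph pattern is for every node, the sun included, to push, apply its own transform, draw, recurse into its children, then pop, so siblings never inherit each other's orbit transforms. Here is a toy stand-in for the modelview stack (translation-only, made-up names, not FreeGLUT code) illustrating the balance:

```c
#include <stddef.h>

/* Toy 1-D "modelview stack": each entry is just an x offset, enough to
   show why push/pop must be balanced in every node's Render call. */
#define STACK_MAX 16
static float stack[STACK_MAX] = {0.0f};
static int top = 0;                     /* stack[top] = current transform */

static void pushMatrix(void)     { stack[top + 1] = stack[top]; top++; }
static void popMatrix(void)      { top--; }
static void translateX(float dx) { stack[top] += dx; }

static float drawnAt[8];                /* where each node was drawn */
static int drawn = 0;

/* Push, apply the node's own transform, draw, recurse, pop: because the
   pop always matches the push, each child sees only its parent's
   accumulated transform, never a sibling's. */
static void renderNode(float localX, const float *childXs, int nChildren)
{
    pushMatrix();
    translateX(localX);
    drawnAt[drawn++] = stack[top];      /* "draw" at the current transform */
    for (int i = 0; i < nChildren; i++)
        renderNode(childXs[i], NULL, 0);
    popMatrix();
}
```

With a sun at 0 and two planets at offsets 5 and 7, the nodes draw at 0, 5, and 7; an unbalanced push/pop (like a special-cased node) would let the second planet start from the first planet's transform instead.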

• Hello everyone,
I have a problem with a texture

OpenGL [SOLVED] Depth buffer problem



Hello,

I'm currently developing a game using OpenGL.

I've tried hard to fix my depth buffer problem, but I can't figure out what's wrong.

Here is a screenshot of the game without the problem (my boxes' depth is very low):

Now, if I give my boxes some depth...

(For those who would like an animation, I recorded the problem here:

Well, I'll paste some of my code; tell me if I'm doing something wrong. (Well, I guess I am...)

// Init function
void init ()
{
    glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH);
    glEnable(GL_DEPTH_TEST);
}

// Drawing routine
void draw ()
{
    glClearColor(0.1, 0.1, 0.1, 1.0);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gl_plateau_draw(gc->glPlateau);
    glutSwapBuffers();
}

Can you help me?
If you need more code, I can copy/paste it here.

Thank you ! :)

EDIT:
Oh... I just noticed that swapping
glEnable(GL_DEPTH_TEST);
and
glutInitDisplayMode
made things better...

There is still something wrong in the drawing of my boxes: sometimes my boxes are filled, sometimes not (it depends on the position of the camera).

[Edited by - Spl3en on October 19, 2010 2:41:38 PM]

Try glDisable(GL_CULL_FACE). Polygons have a front and back side, and back-facing polygons are commonly culled away. Usually this is a good thing, and if you order your vertices correctly you will still get the correct behavior, unless you need to be able to view the same polygon from both directions.
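To make the winding-order point concrete: which side of a polygon you see is determined by the screen-space order of its vertices. A tiny sketch (my own helper, not part of OpenGL):

```c
/* Signed area of a triangle in window space. With OpenGL's defaults
   (glFrontFace(GL_CCW)), a positive area means counter-clockwise and
   therefore front-facing; with GL_CULL_FACE enabled, clockwise
   (negative-area) triangles are discarded. */
static double signedArea(double ax, double ay, double bx, double by,
                         double cx, double cy)
{
    return 0.5 * ((bx - ax) * (cy - ay) - (cx - ax) * (by - ay));
}
```

So if faces disappear depending on the viewing direction, either their vertex order is reversed or culling should be disabled as suggested.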

Thank you for the reply Erik, it's really appreciated :)

I didn't touch culling anywhere else.

I still believe that it is a depth buffer problem:
As you can see, my tokens are visible, as if the boxes were invisible...

Post the whole code,
between [source] and [/source] tags.

And what are the "depth range" and the "depth func"?

Also try putting glEnable(GL_DEPTH_TEST); in your draw function.

Quote:
 Also try putting glEnable(GL_DEPTH_TEST); in your draw function.

Thank you again for the advice; I hadn't tried this.
It seems to resolve the transparency problem, but now it shows the same problem as at the beginning... :-(

This one: Screenshot

Quote:
 Post the whole code.

Well, this is a school project; the whole source code is almost 100 KB (but only about 20 KB of OpenGL), so I uploaded it to Megaupload...
The files where I use OpenGL are prefixed with "Gl", and my draw routine is in GraphicContext.c.

I don't have much hope that someone will analyze all my code to find the problem, but if someone does, be sure that someone in the world will thank you very, very much :)

Quote:
 Any what is the "depth range" and the "depth func"?

I'm not sure what you mean by your question, but I use this:
gluPerspective(30, width/height, 0.1, 10000);

I don't know how to use glDepthRange(); maybe that is the problem?

EDIT:
I have a function that caps the FPS at 60, and the rendering changes if I modify the max FPS value... I guess I'm doing something wrong with the depth buffer and the FPS.

EDIT 2:
Erm, I removed the FPS cap, and it didn't change anything, so it is surely not that... :(

[Edited by - Spl3en on October 19, 2010 2:49:37 PM]

Um... how about looking at the documentation? If you Google those functions, I'm positive you can find the necessary info about them.

I'm sorry, English is not my native language, so I sometimes have trouble understanding the documentation exactly...
Anyway, I just found the problem in the OpenGL FAQ: "12.040 Depth buffering seems to work, but polygons seem to bleed through polygons that are in front of them. What's going on?"...

I changed my zNear and zFar to 100.0 / 10000.0, and it finally works :)

I still don't perfectly understand the mechanisms of zNear and zFar, but now that I know the problem, I'm going to read more documentation about these variables.

Thank you very much, szecs and Erik Rufelt!
I increased your user ratings :)

Quote:
 Original post by Spl3en
 I changed my zNear and zFar to 100.0 / 10000.0, and it finally works :)
 I still don't perfectly understand the mechanisms of zNear and zFar, but now I know the problem, I'm going to try to understand more documentation about these variables.

The depth buffer has limited precision, so if zNear is too small or zFar is too large, then there are not enough bits to keep track of the possible depth values with high enough precision.
Since you get such large problems, I suspect your depth buffer is only 16 bits, where you probably want 24. I'm not sure how to specify the desired precision using GLUT, though.
You can check the precision with:
GLint depthBits;
glGetIntegerv(GL_DEPTH_BITS, &depthBits);
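As a back-of-the-envelope check of the zNear advice: the smallest eye-space separation a perspective depth buffer can resolve at distance z grows with z squared and shrinks as zNear grows. A rough sketch (my own helper, derived from the standard hyperbolic depth mapping, not from the thread):

```c
#include <math.h>

/* Approximate smallest depth difference (in eye-space units) that two
   surfaces at eye distance z can have and still map to different values
   in a depth buffer with the given bit count, for a perspective
   projection with clip planes zNear/zFar. */
static double depthResolution(double zNear, double zFar, double z, int bits)
{
    double steps = ldexp(1.0, bits);        /* 2^bits distinct depth values */
    return z * z * (zFar - zNear) / (zFar * zNear * steps);
}
```

With the original zNear = 0.1, zFar = 10000 and a 16-bit buffer, two boxes at z = 100 must be more than about 1.5 units apart to resolve; after the change to zNear = 100, that figure drops to roughly 0.0015, which matches the fix found earlier in the thread.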