nullsquared

OpenGL Z-Buffer - Objects behind are drawn in front



Hello all! I've recently started programming with OpenGL using the book Beginning OpenGL Game Programming. Everything is fine, except for one thing. I made a little shape struct (BTW, this is C++) with a pure virtual function to draw itself. Then I made a cube that draws itself:
Cube:
    Functions:
        draw() : Draws all sides at once
        start() : glBegin's GL_QUADS
        end() : glEnd's the quads
        draw_x(), where x is front, back, top, bottom, left, or right: draws that side of the cube
This is the code that I use it with:
glPushMatrix(); {
    glTranslatef(0.0f, 0.0f, -6.0f);
    glRotatef(rot.x, 1.0f, 0.0f, 0.0f);
    glRotatef(rot.y, 0.0f, 1.0f, 0.0f);
    glRotatef(rot.z, 0.0f, 0.0f, 1.0f);

    glw::vector s(1.0f, 1.0f, 1.0f);
    c.start(); {
        /*glColor3f(1.0f, 0.0f, 0.0f);
        c.draw_front(s);

        glColor3f(0.0f, 1.0f, 0.0f);
        c.draw_back(s);

        glColor3f(0.0f, 0.0f, 1.0f);
        c.draw_top(s);

        glColor3f(1.0f, 1.0f, 0.0f);
        c.draw_bottom(s);*/

        glColor3f(0.0f, 1.0f, 1.0f);
        c.draw_left(s);

        glColor3f(1.0f, 0.0f, 1.0f);
        c.draw_right(s);
        //c.draw(s);
    } c.end();
} glPopMatrix();

I commented out some of the sides because, well, it's very odd: I could see the back through the front, and the left through the right! So I left just the left and right sides, and boom! Even when the left was behind the right, it was drawn over it: purple over teal! I'm totally baffled :(. I have initialized OpenGL like so:
// clear color
//           red   green blue  alpha
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

// clear the whole depth buffer (to 1.0)
// glClearDepth(1.0f);

// enable depth-testing
glEnable(GL_DEPTH_TEST);

// set up the kind of test:
// <= (less than or equal to)
// glDepthFunc(GL_LEQUAL);

Do I have to do this when I resize instead? No, right? Odd... Any ideas? Thanks in advance! On a side note, if I change the clear color shown above to, say, gray (0.5f, 0.5f, 0.5f), the background still comes out black... BTW, I'm using C++ as the language and SDL as my window and input handler.
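
For context, a typical fixed-function depth-test setup looks like the minimal sketch below, using only standard GL calls (glClearDepth(1.0f) matches the driver default, and GL_LEQUAL versus the default GL_LESS makes no difference for this scene):

// minimal depth-test setup sketch -- one-time initialization
glClearColor(0.0f, 0.0f, 0.0f, 0.0f); // background color
glClearDepth(1.0f);                   // depth value the buffer is cleared to (1.0 is also the default)
glEnable(GL_DEPTH_TEST);              // turn depth testing on
glDepthFunc(GL_LEQUAL);               // pass fragments that are nearer or equal (default is GL_LESS)

// ...and every frame, clear BOTH the color and the depth buffer before drawing:
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);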

Quote:
Original post by Kalidor
Are you asking for (and receiving) a depth buffer when you create your window?


I am confused - what do you mean by "asking for a depth buffer"? The way I initialize the window is this:

SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1); // set double-buffering

// create SDL window with OpenGL context
if (!SDL_SetVideoMode(width, height, bpp, SDL_OPENGL)) { // bpp == 32
    return false;
}



So I'm not exactly sure. I've compared my code to some other code that uses SDL too and it's pretty much the same...

Maybe this is more of an SDL with OpenGL setup problem rather than an OpenGL problem?

Quote:
Original post by relsoft
Did you call glClear with GL_DEPTH_BUFFER_BIT?


Yes:

void window::begin() {
    // clear the whole screen (color + depth)
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // reset to origin
    glLoadIdentity();
}



And I call it like so:

while (!event.quit_requested()) {
    event.update();

    // event stuff

    window.begin();

    // draw the cube

    window.end();
}



BTW, window::end() simply calls SDL_GL_SwapBuffers();.

Quote:
Original post by agi_shi
I am confused - what do you mean by "asking for a depth buffer"? The way I initialize the window is this:
*** Source Snippet Removed ***

So I'm not exactly sure. I've compared my code to some other code that uses SDL too and it's pretty much the same...

Maybe this is more of an SDL with OpenGL setup problem rather than an OpenGL problem?
You need to tell the windowing system that you want a depth buffer, and possibly the number of depth bits, when you are creating the window. In the Win32 API you do this by setting the cDepthBits member of the PIXELFORMATDESCRIPTOR to the number of depth bits you want (usually 24). I've never used SDL before, but it seems that to request a depth buffer you have to set the SDL_GL_DEPTH_SIZE attribute to the number of depth bits you want before you set the video mode.
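
In SDL 1.2 that request looks roughly like the sketch below; SDL_GL_SetAttribute, SDL_GL_DEPTH_SIZE, and SDL_SetVideoMode are real SDL 1.2 calls, while create_gl_window and the 24-bit value are just illustrative:

#include <SDL/SDL.h>

// sketch (SDL 1.2): request a depth buffer *before* creating the GL window
bool create_gl_window(int width, int height, int bpp) {
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);   // ask for at least 24 depth bits
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);  // keep double buffering

    // attributes must be set before SDL_SetVideoMode for them to take effect
    if (!SDL_SetVideoMode(width, height, bpp, SDL_OPENGL))
        return false;                             // requested pixel format unavailable

    return true;
}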

Quote:
Original post by Kalidor
You need to tell the windowing system that you want a depth buffer, and possibly the number of depth bits, when you are creating the window. In the Win32 API you do this by setting the cDepthBits member of the PIXELFORMATDESCRIPTOR to the number of depth bits you want (usually 24). I've never used SDL before, but it seems that to request a depth buffer you have to set the SDL_GL_DEPTH_SIZE attribute to the number of depth bits you want before you set the video mode.


Ah yes - of course. I forgot about that. I was comparing against some other SDL-OpenGL code that didn't have it either, which is why I missed it.

Thanks again, and rating++.

Quote:
Original post by agi_shi
Ah yes - of course. I forgot about that. I was comparing against some other SDL-OpenGL code that didn't have it either, which is why I missed it.

Thanks again, and rating++.


Odd - I made that change and it still does the same thing! This is what the code looks like right now:

SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);

unsigned sdl_flags = SDL_OPENGL; // don't start from nowhere
if (flags & FULLSCREEN)
    sdl_flags |= SDL_FULLSCREEN; // add fullscreen
if (flags & RESIZABLE)
    sdl_flags |= SDL_RESIZABLE;  // add a resizable window
if (flags & NOFRAME)
    sdl_flags |= SDL_NOFRAME;    // don't add a frame

// make sure the window is ok
if (!SDL_SetVideoMode(w, h, bpp, sdl_flags))
    return false;


I'm confused - I have also tried setting other attributes like so:

SDL_GL_SetAttribute(SDL_GL_RED_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_GREEN_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_BLUE_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_ALPHA_SIZE, 8);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);


But no luck...

Any ideas?

Quote:
Original post by agi_shi
Odd - I made that change and it still does the same thing! This is what the code looks like right now:
*** Source Snippet Removed ***
I'm confused - I have also tried setting other attributes like so:
*** Source Snippet Removed ***
But no luck...

Any ideas?


So I went ahead and checked whether I got what I asked for, and I did get 24 bits for the depth buffer... but I still get wrong Z-buffering...
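
(For reference, such a check can be done with SDL_GL_GetAttribute once the video mode has been set; this is a sketch rather than the poster's actual code, and report_depth_bits is just an illustrative name:)

#include <SDL/SDL.h>
#include <cstdio>

// sketch (SDL 1.2): query how many depth bits the driver actually granted
void report_depth_bits() {
    int depth_bits = 0;
    if (SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &depth_bits) == 0)
        std::printf("depth buffer bits: %d\n", depth_bits); // 24 in this case
}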
