Weird AGL buffer swapping

2 comments, last by KumGame 17 years, 10 months ago
Hi folks. I'm using an iMac (Intel Core Duo, Mac OS X 10.4.6, ATI card) and I create an AGL context with the following attributes:

Attributes[0] = AGL_RGBA;
Attributes[1] = GL_TRUE;
Attributes[2] = AGL_DOUBLEBUFFER;
Attributes[3] = GL_TRUE;
Attributes[4] = AGL_DEPTH_SIZE;
Attributes[5] = 16;
Attributes[6] = AGL_RENDERER_ID;
Attributes[7] = 137473; // ATI Radeon X1600
Attributes[8] = AGL_NONE;

My application crashes: the whole machine hangs and I have to restart it. I tried to debug it, and the debugger points at the buffer swap:

aglSwapBuffers(Context);

I tried the software renderer and it works fine! But I want to make use of the hardware renderer that's available. Any ideas?
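
In case it helps, here's roughly how the rest of my setup looks (a trimmed sketch, not my exact code; 'window' is just a placeholder for my actual Carbon WindowRef, and most error handling is left out):

AGLPixelFormat fmt = aglChoosePixelFormat(NULL, 0, Attributes);
if (fmt == NULL) {
    // no matching pixel format; bail out here
}
AGLContext Context = aglCreateContext(fmt, NULL);
aglDestroyPixelFormat(fmt);
aglSetDrawable(Context, GetWindowPort(window)); // 'window' is my Carbon WindowRef
aglSetCurrentContext(Context);

// ... render the scene ...
aglSwapBuffers(Context); // this is where the debugger says it dies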
I'm going to say it may have something to do with your Attributes[7]. IIRC you don't need to set that. IMO you should ditch Carbon and run with Cocoa; it's so much easier to get OpenGL up and running there, and you can use C++ with Cocoa too. It's been a while since I coded on a Mac, so I'm a bit rusty...
I think you don't have to include GL_TRUE after the AGL_RGBA and AGL_DOUBLEBUFFER flags; those are boolean attributes, so their presence alone turns them on.
So it should be like this:

Attributes[0] = AGL_RGBA;
Attributes[1] = AGL_DOUBLEBUFFER;
Attributes[2] = AGL_DEPTH_SIZE;
Attributes[3] = 32;
Attributes[4] = AGL_NONE;

I don't think you should specify your renderer ID directly; I'd rather leave it to aglChoosePixelFormat, as in the sketch below.
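
Something like this is the pattern I'd try (a sketch from memory, not tested; aglDescribePixelFormat with AGL_ACCELERATED should tell you whether AGL actually picked a hardware renderer):

GLint Attributes[] = { AGL_RGBA, AGL_DOUBLEBUFFER, AGL_DEPTH_SIZE, 32, AGL_NONE };

AGLPixelFormat fmt = aglChoosePixelFormat(NULL, 0, Attributes);
if (fmt == NULL) {
    // nothing matched; retry with a smaller depth size (24 or 16)
}

GLint accelerated = 0;
aglDescribePixelFormat(fmt, AGL_ACCELERATED, &accelerated);
if (!accelerated) {
    // AGL quietly fell back to the software renderer
}

AGLContext ctx = aglCreateContext(fmt, NULL);
aglDestroyPixelFormat(fmt);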

Thanks guys..

Initially, I had no GL_TRUE in my code. It didn't work, so I tried forcibly setting it to TRUE.
I tried without specifying the renderer ID. Sadly, no improvement.

I get the problem only if I render an .obj file (with lots of vertices).
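
One thing I still want to rule out is a bad index coming out of my .obj loader. I've read that an out-of-range vertex index can lock up the hardware renderer while the software renderer just draws garbage, so a check like this before the draw call might confirm it (sketch only; 'indices' and 'vertexCount' come from my own loader, not from AGL):

#include <stddef.h>

// Returns 1 if every index is within range, 0 otherwise.
int ValidateIndices(const unsigned int *indices, size_t indexCount, size_t vertexCount)
{
    size_t i;
    for (i = 0; i < indexCount; ++i) {
        if (indices[i] >= vertexCount)
            return 0; // would make the GPU read past the vertex array
    }
    return 1;
}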

I'm really fed up. But fortunately, the problem has nothing to do with my current work. I just tried it...

