glGetIntegerv and glGetError behaving badly

Hi, I'm trying to use glGetIntegerv to query the number of bit planes per color channel (GL_RED_BITS, GL_GREEN_BITS, GL_BLUE_BITS), but whenever I pass an integer to it, my value never changes. Also, when I try to use glGetError() to fetch the error string, the error is never cleared, so a loop that drains the error queue runs forever. Oddly enough, I have an invalid operation on my hands before OpenGL is even opened. Here's the code for the two parts I've been using:

    GLint red, green, blue;
    GLenum tempError;
    red = green = blue = 0;

    while ((tempError = glGetError()) != GL_NO_ERROR)
        cerr << gluErrorString(tempError) << endl;

    glGetIntegerv(GL_RED_BITS, &red);
    glGetIntegerv(GL_BLUE_BITS, &blue);
    glGetIntegerv(GL_GREEN_BITS, &green);

As I said before, I get an endless stream of errors even though I haven't run a single OpenGL command, and if I remove the error loop, the glGetIntegerv calls still do nothing (even though they work with the enum GL_DRAW_BUFFER). I'm wondering whether this is a Windows-related problem, because I've compiled it both against the Win32 API's version of OpenGL using Visual Studio and against Cygwin's version using gcc (currently on Windows XP Professional SP2 with the latest Cygwin). If anyone can help, that'd be great.
You cannot call ANY OpenGL function until a rendering context has been created and made current. You must create the context first, and only then do calls into OpenGL mean anything. So glGetError returning an error even before the window is created makes perfect sense.
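For illustration, here's a minimal sketch using GLUT (my assumption; any context-creation method, such as raw WGL or SDL, works the same way). glutCreateWindow creates a rendering context and makes it current, so the error loop and the bit-plane queries afterwards behave as expected:

    // Minimal sketch: create a context with GLUT before any GL calls.
    #include <GL/glut.h>   // also pulls in gl.h and glu.h
    #include <iostream>

    int main(int argc, char** argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_RGB | GLUT_DOUBLE);
        glutCreateWindow("bit planes");  // context exists and is current after this

        // Now the error queue is meaningful and will actually drain.
        GLenum err;
        while ((err = glGetError()) != GL_NO_ERROR)
            std::cerr << gluErrorString(err) << std::endl;

        // And the state queries write real values into the out-parameters.
        GLint red = 0, green = 0, blue = 0;
        glGetIntegerv(GL_RED_BITS,   &red);
        glGetIntegerv(GL_GREEN_BITS, &green);
        glGetIntegerv(GL_BLUE_BITS,  &blue);
        std::cout << "R/G/B bits: " << red << "/" << green
                  << "/" << blue << std::endl;
        return 0;  // glutMainLoop omitted; we only wanted the query
    }

On a typical 32-bit desktop pixel format this prints 8/8/8. Before the glutCreateWindow call, the same queries would be exactly as undefined as in your program.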

