A strange error when running my program using OpenGL v1.1

Hi peeps. I've written a picture viewer program that (funnily enough) loads pictures into a window. I added OpenGL error-checking code to the main RenderScene() function to catch any errors from glDrawPixels() — this is really the only OpenGL function the program uses. The program runs fine on my machine, but on a machine with Microsoft's generic software OpenGL v1.1 implementation I get some sort of OpenGL enum error. Has anyone encountered this before? I think the program works fine if I remove the error-checking code, but I'm not certain. Any ideas?

Steve B

Original post by AndyL
Perhaps you've specified an invalid datatype or an invalid pixel format?

Hi Andy

Hmmm. I'm really not sure. glDrawPixels() is the only OpenGL function I use in my main drawing function. Here is what the call looks like:

glDrawPixels(width, height, GL_RGB, GL_UNSIGNED_BYTE, deepILBM);

width: unsigned int
height: unsigned int
deepILBM: unsigned char *

I believe this is the error that was generated (from the glDrawPixels documentation):

GL_INVALID_ENUM is generated if format or type is not one of the accepted values.
GL_INVALID_ENUM is generated if type is GL_BITMAP and format is not either GL_COLOR_INDEX or GL_STENCIL_INDEX.

I can't really see what the problem is, though. I'm also fairly sure it works fine when the error checking is removed (I haven't tested that for a while). Maybe I should use a C++ cast to force width and height to GLsizei? I'm not really sure.
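One thing worth checking: glGetError() returns errors from a queue, and the flag it reports may have been set by an *earlier* call, not by glDrawPixels() itself. A minimal sketch of how I'd pin it down — drain the queue before the call, then report everything after it. The `gl_error_name()` helper is my own (not part of OpenGL), and the constants are copied from `<GL/gl.h>` so the helper compiles stand-alone; the RenderScene variables (`width`, `height`, `deepILBM`) are the ones from your post:

```c
#include <stdio.h>

/* Error-code values copied from <GL/gl.h> so this sketch
   compiles without the GL headers. */
#define GL_NO_ERROR          0
#define GL_INVALID_ENUM      0x0500
#define GL_INVALID_VALUE     0x0501
#define GL_INVALID_OPERATION 0x0502
#define GL_STACK_OVERFLOW    0x0503
#define GL_STACK_UNDERFLOW   0x0504
#define GL_OUT_OF_MEMORY     0x0505

/* Map a glGetError() code to a readable name (hypothetical helper). */
const char *gl_error_name(unsigned int err)
{
    switch (err) {
    case GL_NO_ERROR:          return "GL_NO_ERROR";
    case GL_INVALID_ENUM:      return "GL_INVALID_ENUM";
    case GL_INVALID_VALUE:     return "GL_INVALID_VALUE";
    case GL_INVALID_OPERATION: return "GL_INVALID_OPERATION";
    case GL_STACK_OVERFLOW:    return "GL_STACK_OVERFLOW";
    case GL_STACK_UNDERFLOW:   return "GL_STACK_UNDERFLOW";
    case GL_OUT_OF_MEMORY:     return "GL_OUT_OF_MEMORY";
    default:                   return "unknown GL error";
    }
}

/* Usage inside RenderScene(), assuming a current GL context:

   while (glGetError() != GL_NO_ERROR)
       ;                                  // drain any stale errors first

   glDrawPixels(width, height, GL_RGB, GL_UNSIGNED_BYTE, deepILBM);

   GLenum err;
   while ((err = glGetError()) != GL_NO_ERROR)
       fprintf(stderr, "after glDrawPixels: %s\n", gl_error_name(err));
*/
```

If the error still shows up after draining the queue first, it really is glDrawPixels() (or the format/type enums) that the generic implementation is rejecting; if not, some earlier call set the flag and glDrawPixels() was just the messenger.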


