GLEnum - WTFrick?

I have a question concerning GLenums. What the frick is a GLenum? I know that when you're using GL it's a type, like float, double, or int. What I don't understand is how it's represented at the binary level. I thought maybe it's just an unsigned int, and sizeof on that type would give me its size. Is that correct? The reason I'm asking is that I'm trying to put a virtual class layer between the user and the engine, so it can be more rendering-API independent. Any help would be greatly appreciated. Thanks!
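
For example, what I have in mind is roughly this, where the engine code only ever sees its own enum and the OpenGL backend is the one place that translates it to a GLenum (PrimitiveType, Renderer and GLRenderer are just placeholder names for the idea):

#include <GL/gl.h>

// Engine-side enum, independent of any rendering API.
enum PrimitiveType { PRIM_TRIANGLES, PRIM_TRIANGLE_STRIP };

// The virtual class layer the user codes against.
class Renderer {
public:
    virtual ~Renderer() {}
    virtual void DrawPrimitives(PrimitiveType type, int first, int count) = 0;
};

// OpenGL backend: the only code that ever touches GLenum.
class GLRenderer : public Renderer {
public:
    void DrawPrimitives(PrimitiveType type, int first, int count)
    {
        GLenum mode = (type == PRIM_TRIANGLE_STRIP) ? GL_TRIANGLE_STRIP : GL_TRIANGLES;
        glDrawArrays(mode, first, count);
    }
};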

Yeah, it represents a value from an enumeration, but since "enumerations" in OpenGL are just "#defines", GLenum takes the form

typedef unsigned int GLenum;

[edited by - zealouselixir on August 13, 2003 9:23:14 PM]
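
For reference, the relevant lines in a typical gl.h look something like this (exact contents vary between vendors' headers, but these values are the usual ones):

// GLenum is just a typedef, and the "enum" values are plain #defines.
typedef unsigned int GLenum;

#define GL_TRIANGLES        0x0004
#define GL_TRIANGLE_STRIP   0x0005
#define GL_TEXTURE_2D       0x0DE1

// So a call like glEnable(GL_TEXTURE_2D) just passes the unsigned int 0x0DE1 to the driver.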

It's just an unsigned integer; if you need the size, use sizeof(GLenum).

Since the OpenGL API is plain C, it's not a "real" enum, just an unsigned int that is used like one.
You're only supposed to pass it defined values like GL_TEXTURE_2D, although there is nothing stopping you from passing a raw number like 5.
It just won't do what you expect at run time, because OpenGL will think you mean GL_TRIANGLE_STRIP, which is #defined to be 5 in gl.h.

C does have enums, but the OpenGL headers emulate them with #defines and a plain integer typedef, which keeps the values fixed and lets extensions add new constants freely.
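
You can check both points with a couple of lines; this is just a sanity-check snippet assuming a normal gl.h:

#include <GL/gl.h>
#include <cstdio>

int main()
{
    // GLenum is a plain unsigned int, so this prints 4 on typical platforms.
    std::printf("sizeof(GLenum) = %u\n", (unsigned)sizeof(GLenum));

    // GL_TRIANGLE_STRIP is #defined as 0x0005, so passing a literal 5 to a GL
    // call would mean exactly the same thing to the driver.
    std::printf("GL_TRIANGLE_STRIP == 5: %d\n", GL_TRIANGLE_STRIP == 5);
    return 0;
}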
