Archived

This topic is now archived and is closed to further replies.

31337

GLEnum - WTFrick?

Recommended Posts

31337    100
I have a question concerning GLenums. What the frick is a GLenum? I know that when you're using GL it's a type, like float, double, or int. However, what I don't understand is how this is represented at the binary level. I thought: well, maybe it's just an unsigned int, and then the sizeof that type. Is this correct? The reason I'm asking is that I'm trying to put a virtual class layer in between the user and the engine, so that it can be more rendering-API independent. Any help would be greatly appreciated. Thanks!

ZealousElixir    256
Yeah, it represents a value from an enumeration, but since OpenGL's "enumerations" are really just #defines, GLenum takes the form

typedef unsigned int GLenum;

[edited by - zealouselixir on August 13, 2003 9:23:14 PM]

31337    100
Ok, so it's an unsigned int. Does that unsigned int represent the sizeof() the item? I've never used enumerations before.

TravisWells    276
It's just an unsigned integer; if you need the size, use sizeof(GLenum).

Since OpenGL is based on C, it's not a "real" enum, just an int that is used like one.
So you're only supposed to use GLenum values like GL_TEXTURE_2D with it, although there is nothing stopping you from just using a number, like 5.
It just won't do what you expect at run time, as OpenGL will think you mean GL_TRIANGLE_STRIP, which is #define'd to be 5 in gl.h.

The makers of OpenGL are just trying to emulate enums in a language that doesn't have them.
