GLenum - WTFrick?
I have a question concerning GLenums. What the frick is a GLenum? I know that when you're using GL it's a type, like float, double, or int. What I don't understand is how it's represented at the binary level. I thought: maybe it's just an unsigned int, and then you take the sizeof that type. Is this correct?
The reason I'm asking is that I'm trying to put a virtual class layer between the user and the engine, so that it can be more rendering-API independent.
Any help would be greatly appreciated. Thanks!
Yeah, it represents a value from an enumeration, but since "enumerations" in OpenGL are just #defines, GLenum takes the form:
typedef unsigned int GLenum;
[edited by - zealouselixir on August 13, 2003 9:23:14 PM]
Ok, so it's an unsigned int. Does that unsigned int represent the sizeof() of the item? I've never used enumerations before.
It's just an unsigned integer; if you need the size, use sizeof(GLenum).
Since OpenGL is based on C, it's not a "real" enum, just an int that is used like one.
So you're only supposed to use GLenum values like GL_TEXTURE_2D with it, although there is nothing stopping you from just using a bare number, like 5.
It just won't do what you expect at run time, as OpenGL will think you mean GL_TRIANGLE_STRIP, which is #define'd to be 5 (0x0005) in gl.h.
The makers of OpenGL are essentially emulating enums with #define'd integer constants.