GLenum - WTFrick?

I have a question concerning GLenum's. What the frick is a GLenum? I know that when you're using GL it's a type, like float, double, or int. What I don't understand is how it's represented at the binary level. I thought maybe it's just an unsigned int, and then you take the sizeof that type. Is this correct? The reason I'm asking is that I'm trying to put a virtual class layer between the user and the engine, so the engine can be more rendering-API independent. Any help would be greatly appreciated. Thanks!
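To make that concrete, here's a rough sketch of the kind of layer I mean (the class and enum names are just placeholders I made up, not from any real engine):

#include <GL/gl.h>

// Engine-side primitive type, independent of any rendering API.
enum PrimitiveType { PRIM_TRIANGLES, PRIM_TRIANGLE_STRIP };

// Abstract interface the rest of the engine codes against.
class Renderer {
public:
    virtual ~Renderer() {}
    virtual void beginPrimitives(PrimitiveType type) = 0;
    virtual void endPrimitives() = 0;
};

// OpenGL backend: translates the engine-side enum to a GLenum.
class GLRenderer : public Renderer {
public:
    virtual void beginPrimitives(PrimitiveType type)
    {
        GLenum mode = (type == PRIM_TRIANGLE_STRIP)
                          ? GL_TRIANGLE_STRIP : GL_TRIANGLES;
        glBegin(mode);
    }
    virtual void endPrimitives() { glEnd(); }
};

That's why I'm asking what a GLenum actually is under the hood.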
OpenGL Revolutions http://students.hightechhigh.org/~jjensen/
It is an enumeration in OpenGL... Look at it in the headers
"...."
Yeah, it represents a value from an enumeration, but since "enumerations" in OpenGL are just "#defines", GLenum takes the form

typedef unsigned int GLenum;
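For example, gl.h contains lines like these (the values shown are the standard ones from the header):

#define GL_TRIANGLES       0x0004
#define GL_TRIANGLE_STRIP  0x0005
#define GL_TEXTURE_2D      0x0DE1

So any "GLenum value" you pass around is really just one of these integer constants.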


OK, so it's an unsigned int. Does that unsigned int represent the sizeof() of the item? I've never used enumerations before.
I have used the enum keyword, though; I don't think that's the same thing.
OpenGL Revolutions http://students.hightechhigh.org/~jjensen/
It's just an unsigned integer; if you need its size, use sizeof(GLenum).

Since the OpenGL API is defined in C, GLenum isn't a "real" enum, just an unsigned int that is used like one. You're only supposed to pass it GLenum values like GL_TEXTURE_2D, although there is nothing stopping you from passing a raw number like 5. It just won't do what you expect at run time, because OpenGL will think you mean GL_TRIANGLE_STRIP, which is #define'd to be 5 in gl.h.

The makers of OpenGL are just emulating enums with the preprocessor instead of using C's enum keyword.
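To see the pitfall concretely, here's a minimal sketch using the legacy immediate-mode API and the standard gl.h values:

#include <GL/gl.h>

void draw()
{
    // These two calls are identical at run time, because gl.h
    // contains: #define GL_TRIANGLE_STRIP 0x0005
    glBegin(GL_TRIANGLE_STRIP);
    glEnd();

    glBegin(5);   // compiles fine, but silently means GL_TRIANGLE_STRIP
    glEnd();
}

The compiler can't catch the second call, which is exactly what a real enum type would have prevented.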

