What is wrong with GL_BGR pixel format?

I am trying to call this: glDrawPixels(bih.biWidth, bih.biHeight, GL_BGR, GL_UNSIGNED_BYTE, bitmap); GL_BGR is one of the accepted pixel formats, correct? It is listed as one in OpenGL Game Programming on page 212.
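
For context, here is roughly how the call sits in my code (a minimal sketch; bih is the BITMAPINFOHEADER read from the file and bitmap points to the raw 24-bit pixel data that follows it):

#include <windows.h>
#include <GL/gl.h>

BITMAPINFOHEADER bih;   /* filled in by the BMP loader */
GLubyte *bitmap;        /* raw 24-bit pixel data from the file */

void draw_bmp(void)
{
    /* BMP rows are padded to 4-byte boundaries, which matches an
       unpack alignment of 4 (the GL default). */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);
    glRasterPos2i(0, 0);    /* where to start drawing, in current coordinates */
    glDrawPixels(bih.biWidth, bih.biHeight, GL_BGR, GL_UNSIGNED_BYTE, bitmap);
}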

And what happens? Does it crash?

Make sure the BMP you are drawing actually has that pixel format and is not RGB, which is the standard.

Yes. GL_BGR was added in OpenGL 1.2, if I'm not mistaken. Make sure your implementation of OpenGL is 1.2 or newer (GL_VERSION_1_2 is defined in gl.h if it is), that you are including glext.h (in which case you'll have to use GL_BGR_EXT), or that you define it yourself (GL_BGR is 0x80E0).
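
Put together, the compile-time fallbacks might look like this (a sketch; it assumes gl.h, and glext.h if you have it, are already included):

/* After #include <GL/gl.h> (and <GL/glext.h> if available): */
#ifndef GL_BGR
#ifdef GL_BGR_EXT
#define GL_BGR GL_BGR_EXT   /* older headers expose only the EXT name */
#else
#define GL_BGR 0x80E0       /* last resort: define the value yourself */
#endif
#endif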

Guest Anonymous Poster
I don't see why you should have to use GL_BGR_EXT. GL_BGR itself is defined by glext.h. Also, it's not enough to check for GL_VERSION_1_2. All that tells you is that the headers you're using declare all OpenGL 1.2 macros and prototypes, but it tells you nothing about what version of OpenGL is actually running on your system. To get the OpenGL version, call glGetString(GL_VERSION).
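
For example, a minimal runtime check might look like this (a sketch; it assumes a current GL context and a version string that starts with "major.minor"):

#include <stdio.h>
#include <GL/gl.h>

/* Returns nonzero if the running implementation is at least OpenGL 1.2. */
int have_gl_1_2(void)
{
    int major = 0, minor = 0;
    const char *ver = (const char *)glGetString(GL_VERSION);
    if (ver == NULL || sscanf(ver, "%d.%d", &major, &minor) != 2)
        return 0;
    return major > 1 || (major == 1 && minor >= 2);
}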

Guest Anonymous Poster
quote:
Original post by Shadow1234567890
I take
#define GL_VERSION 0x1F02
to equal GL_VERSION 1_2?



No. The macro is GL_VERSION_1_2 and you use it like this:

#ifdef GL_VERSION_1_2
//do whatever
#endif

You don't need to do that unless you want your program to compile on systems that don't have the OpenGL 1.2 prototypes.

I have that book, but I haven't read it all. Anyway, I'm at work and can't look it up, so I'm wondering:

What is the benefit of using the color format backwards?


"aut viam inveniam aut faciam" - I will either find a way or make one.

MoonStar Projects

I normally do this in my code:

#ifndef GL_BGR
/* Windows headers have GL_BGR_EXT but not GL_BGR, so alias it */
#define GL_BGR GL_BGR_EXT
#endif

That way, systems with GL_BGR (I know my version of Linux does) are fine, and for Windows (which has GL_BGR_EXT but not GL_BGR) it gets defined correctly.
Of course, this would blow up in your face if you tried to compile on a system without either GL_BGR or GL_BGR_EXT, but I haven't run across one yet.

quote:
Original post by Ronin Magus
What is the benefit of using the color format backwards?



I don't know about drawing, but the BGR format is useful for texture loading (BMPs are stored in BGR format; this way you don't have to convert, you just let the hardware/driver do it).

I'm assuming he's drawing directly from some .BMP data, so with BGR he doesn't have to convert it.
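
For example, uploading straight from BMP data might look like this (a sketch; width, height, and pixels are assumed to come from your BMP loader):

#include <GL/gl.h>

GLuint make_texture(int width, int height, const GLubyte *pixels)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4); /* BMP rows are 4-byte aligned */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    /* BGR source data goes in as-is; the driver does the swizzle */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_BGR, GL_UNSIGNED_BYTE, pixels);
    return tex;
}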

Guest Anonymous Poster
Extract from glext.h:

#ifndef GL_VERSION_1_2
...
#define GL_BGR 0x80E0
...

So, when you include glext.h GL_BGR is automatically defined for you if necessary.

Download glext.h from http://oss.sgi.com/projects/ogl-sample/registry/.
