Inversion of R-B pixel values

Started by
6 comments, last by LucidIon 19 years, 2 months ago
I have written this game (a Pong clone) as a way to learn OpenGL + SDL. I started development on Linux and am working on a Windows port. Everything works great on Linux, but when I build the Windows version, some weird stuff happens. First, I have to change this call so that it will compile:

(Linux version)

glTexImage2D( GL_TEXTURE_2D, 0, 3, TextureImage[3]->w, TextureImage[3]->h, 0, GL_BGR, GL_UNSIGNED_BYTE, TextureImage[3]->pixels );

to this:

glTexImage2D( GL_TEXTURE_2D, 0, 3, TextureImage[3]->w, TextureImage[3]->h, 0, GL_RGB, GL_UNSIGNED_BYTE, TextureImage[3]->pixels );

Note how GL_BGR has been changed to GL_RGB. However, it now inverts red and blue on the textures (blue is now red, and vice versa). So:

A) Why won't it accept GL_BGR (the compiler complains about an undefined reference)?
B) How does one correct the colour inversion problem?
My gl.h and glext.h headers do define GL_BGR (with a value of 0x80E0). Yours apparently do not.

Upgrade the library & headers, or try
#ifndef GL_BGR
#define GL_BGR 0x80E0
#endif

so that you get the parameter, admittedly with the value from my implementation, if it hasn't been defined before (try and keep a semblance of portability).
"Debugging is twice as hard as writing the code in the first place. Therefore, if you write the code as cleverly as possible, you are, by definition, not smart enough to debug it." — Brian W. Kernighan
You need to define GL_BGR yourself. It isn't defined by the GL/gl.h header file on Windows (I don't believe).

#define GL_BGR 0x80E0

You can get this define, along with support for newer OpenGL functions, online at Here. There is an article here on GameDev about this: Article. Is this what you needed?

[EDIT]: I just checked the gl.h that came with VS.NET 2003, and it doesn't define GL_BGR, but it defines GL_BGR_EXT as 0x80E0 (the same value as GL_BGR). It's the same thing; it must still have been an extension when that gl.h was written (I think it targets OpenGL 1.1).

Slaru
Your Windows compiler couldn't find GL_BGR in gl.h? What version of OpenGL do you have? In newer versions, GL_BGR is defined, but I think that in older versions, you have to use GL_BGR_EXT. Give that a go.

If you want, you can flip the Red and Blue channels of an image very easily. The following code does this (this code assumes it's a BGR three-channel image, not a BGRA four-channel image):

unsigned char ubTemp;
for (long x = 0; x < lNumPixels * 3; x += 3)
{
    ubTemp = ubYourImage[x];
    ubYourImage[x] = ubYourImage[x + 2];
    ubYourImage[x + 2] = ubTemp;
}

EDIT: Damn, beaten twice...
My opinion is a recombination and regurgitation of the opinions of those around me. I bring nothing new to the table, and as such, can be safely ignored.[ Useful things - Firefox | GLee | Boost | DevIL ]
I don't know what version of OpenGL I have, but I upgraded to the latest NVIDIA drivers less than a month ago. Anyway, changing to GL_BGR_EXT fixed it. Will this break portability to newer versions of OpenGL?

PS- Thanks for the replies, I'm pleasantly surprised by how active this forum is! ^_^
Quote:Original post by Mr_Twinkie
I don't know what version of OpenGL I have, but I upgraded to the latest NVIDIA drivers less than a month ago.
That's fine, but GL_BGR wasn't introduced until OpenGL 1.2, so to use it in your programs, you'll have to define it yourself. Alternatively, you could use the GLee library (created by benjamin bunny, this forum's mod). It has definitions and function prototypes for all extensions and additions up to OpenGL 1.5, including GL_BGR (I checked :). All you'd need to do is #include the GLee.h header file in your project. I think you only need to link to GLee.lib and call GLeeInit() if you're actually going to use extensions.

Quote:Original post by Mr_Twinkie
Anyway, changing to GL_BGR_EXT fixed it. Will this break portability to newer versions of openGL?

No. Chances are that GL_BGR_EXT will be left in for legacy purposes.

Quote:Original post by Mr_Twinkie
PS- Thanks for the replies, I'm pleasantly surprised by how active this forum is! ^_^

We aim to please [smile]
My opinion is a recombination and regurgitation of the opinions of those around me. I bring nothing new to the table, and as such, can be safely ignored.[ Useful things - Firefox | GLee | Boost | DevIL ]
To solve the problem just invert the R/B values when generating the texture.
Of course, you could always do something like:

// ensure GL_BGR is defined
#ifndef GL_BGR
#ifdef GL_BGR_EXT
#define GL_BGR GL_BGR_EXT
#else
#error "GL_BGR is not defined, nor can GL_BGR be defined from GL_BGR_EXT"
#endif
#endif


Then your code should continue to work, whether the constants change (perhaps on some platforms?) or GL_BGR_EXT is eventually removed.

Obviously, now that you've got the appropriate colour format, there shouldn't be an inversion problem. You could also use OpenGL's fragment processing to handle it (you can multiply colour values by a matrix), so something like:

                R  G  B  A  1
              [ 0  0  1  0  0 ] R
              [ 0  1  0  0  0 ] G
              [ 1  0  0  0  0 ] B
              [ 0  0  0  1  0 ] A
              [ 0  0  0  0  1 ] 1


would swap red and blue (I'm working from shaky memories here though).
-- Jonathan

This topic is closed to new replies.
