
OpenGL Texture Colors Are Wrong


OK, first I'd like to say I'm new to OpenGL, but I've caught on quickly. I'm using a texture manager (it works perfectly), and I've created a tilemap and a rigid body physics engine. So far so good... or so I thought.

I'm working in 2D, in ortho mode, using SDL + OpenGL. Recently (I've been up for 3 days straight coding at this point), textures no longer display with the right colors: they all have a blue tint. It looks as though everything were drawn with glColor3f(0.0, 0.0, 1.0), but it isn't. In fact, to make sure, I put glColor3f(1.0, 1.0, 1.0) before every call that renders a textured quad, and it still shows up blue. I've reduced my code to the minimum needed to load a texture and render a textured quad, and it's still blue. My only guess is that something is wrong with my file loading routines, which wouldn't make sense, since the colors were working previously. Anyway, here's my code; maybe you all can see something I don't.

Here is how I init OpenGL through SDL:
	switch(mPixelBits) {
		case 16:
			SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
			SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 6 );
			SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
			break;
		case 24:
			SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
			break;
		case 32:
			SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
			break;
	}
	SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, mPixelBits);	// depth buffer size
	SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE, mPixelBits);
	SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);		// enable double buffering

And here is my OpenGL init code:

	glMatrixMode(GL_PROJECTION);						// Select The Projection Matrix
	glLoadIdentity();									// Reset The Projection Matrix

	glMatrixMode(GL_MODELVIEW);							// Select The Modelview Matrix
	glLoadIdentity();									// Reset The Modelview Matrix
	glShadeModel(GL_SMOOTH);							// Enable Smooth Shading
	glClearColor(0.0f, 0.0f, 0.0f, 0.5f);				// Black Background
	glClearDepth(1.0f);									// Depth Buffer Setup
	glEnable(GL_DEPTH_TEST);							// Enables Depth Testing
	glDepthFunc(GL_LEQUAL);								// The Type Of Depth Testing To Do
	glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);	// Really Nice Perspective Calculations

And here is my function for loading a bitmap and converting it to an OpenGL texture:

	int loadFromFile(std::string &fileName)
	{
		int status = false;
		SDL_Surface *textureImage;
		if( (textureImage = SDL_LoadBMP(fileName.c_str())) )
		{
			status = true;
			GLuint *glText = new GLuint();
			glGenTextures( 1, glText );

			/* Typical Texture Generation Using Data From The Bitmap */
			glBindTexture( GL_TEXTURE_2D, (*glText) );

			int mode = GL_RGB;
			if (textureImage->format->BytesPerPixel == 3) { // RGB 24bit
				mode = GL_RGB;
			} else if (textureImage->format->BytesPerPixel == 4) { // RGBA 32bit
				mode = GL_RGBA;
			}

			/* Generate The Texture */
			glTexImage2D( GL_TEXTURE_2D, 0, 3, textureImage->w,
				textureImage->h, 0, mode, //GL_BGR,
				GL_UNSIGNED_BYTE, textureImage->pixels );

			/* Linear Filtering */
			glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
			glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

			//make our data pointer point to the proper thing
		}

		/* Free up any memory we may have used */
		if (textureImage)
			SDL_FreeSurface(textureImage);

		return status;
	}

Here is how I draw the quads:

	for(int t=0;t<3;t++)
	{
		if(tile.tileID[t]==-1)	//no tile
			continue;

		GLuint tx=tile.tileTex[t];
		glBindTexture(GL_TEXTURE_2D, tx);

		glBegin(GL_QUADS);
			// top left
			glTexCoord2f(0, 0);
			glVertex3f(0, 0, 0);
			// top right
			glTexCoord2f(1, 0);
			glVertex3f(mTileWidth, 0, 0);
			// bottom right
			glTexCoord2f(1, 1);
			glVertex3f(mTileWidth, mTileHeight, 0);
			// bottom left
			glTexCoord2f(0, 1);
			glVertex3f(0, mTileHeight, 0);
		glEnd();
	}

I'm completely lost here... All the textures come out wrong, but the primitives I draw on top of them (the physics simulation's wireframe rigid bodies) have perfectly fine colors. I can only assume I either did something _REALLY_ weird to OpenGL, or that it's loading my images improperly. BTW, the image format is BMP, 24-bit, no alpha, and I have tried a multitude of pixel formats, all with the same result.

Thanks in advance; I appreciate any help I can get (it's driving me nuts). And I apologize for any bad typing/rambling/idiocy: I haven't slept in several days. Happy coding, all.

[Edited by - _the_phantom_ on July 17, 2005 2:34:38 PM]


glTexImage2D( GL_TEXTURE_2D, 0, 3, textureImage->w,
	textureImage->h, 0, mode, //GL_BGR,
	GL_UNSIGNED_BYTE, textureImage->pixels );

That's where your problem is.

Point one: you don't have to use '3' as the 3rd param; you can use GL_RGB.
Point two: 'mode'. You never set it to BGR higher up in your code, and BGR is the colour format SDL uses.

Point three: use 'source' tags for long bits of code.
I've fixed your post this time.

Hmmmm, I looked at your code and found no critical errors (that is, bugs which would result in bluish textures). You've got little issues here and there, e.g. glOrtho(0.0f, mScreenWidth-1, 0.0, mScreenHeight-1, 1.0f, -1.0f); there's no need to subtract 1 from mScreenWidth if it's equal to 1024, 800, 640, etc., and it should match what you pass to glViewport(). There are also things which could have been written better (e.g. the creation of the GLuint: why use dynamic memory allocation?), but AFAIK nothing serious.

Btw, are you sure you're using matching versions of the SDL libs? I.e. headers from 1.2.8, the static lib from 1.2.8, the DLL from 1.2.8, etc.

Anyway, here's code from my texture manager, which works nicely with all sorts of .pngs: 32, 24, 16, 8 bit, power of two, non-power-of-two, etc. Hope you'll be able to get something useful from it:

// it's doing the same as SDL_LoadBMP
SDL_Surface * tmp = IMG_Load(filename.c_str());

if (tmp == 0) return false;

// first check whether the texture needs to be expanded to power-of-2 dimensions

usint w = tmp->w;
usint h = tmp->h;

usint physicalW = tmp->w;
usint physicalH = tmp->h;

SDL_Surface * image = tmp; // create alias

bool IsWidth = IsPowerOfTwo(w);
bool IsHeight = IsPowerOfTwo(h);

// -------------------------------------------------

if ((!IsWidth) || (!IsHeight) || (tmp->format->BitsPerPixel != 32))
{
	// we need to create a new SDL_Surface with power of two dimensions and 32 bit color
	if ((!IsWidth) || (!IsHeight)) debug("This texture needs to be resized during startup.")
	if (tmp->format->BitsPerPixel != 32) debug("This texture needs to have altered color depth.")
	debug("In release versions, please remember to fix it so startups will be much shorter.")

	if (!IsWidth) w = PowerOfTwo(w);
	if (!IsHeight) h = PowerOfTwo(h);

	// -------------------------------------------------

	Uint32 rmask, gmask, bmask, amask;

	/* SDL interprets each pixel as a 32-bit number, so our masks must depend
	   on the endianness (byte order) of the machine */
	#if SDL_BYTEORDER == SDL_BIG_ENDIAN
		rmask = 0xff000000;
		gmask = 0x00ff0000;
		bmask = 0x0000ff00;
		amask = 0x000000ff;
	#else
		rmask = 0x000000ff;
		gmask = 0x0000ff00;
		bmask = 0x00ff0000;
		amask = 0xff000000;
	#endif

	// -------------------------------------------------

	// now image is no longer an alias to tmp, but rather that new surface
	image = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, 32, rmask, gmask, bmask, amask);

	if ( image == 0 )
	{
		logError2("TextureMgr", "Couldn't allocate memory for expanding texture to power of 2")
		SC_ASSERT(!"Couldn't allocate memory for expanding texture to power of 2")
		return false;
	}

	SDL_Rect area;
	area.x = 0;
	area.y = 0;
	area.w = tmp->w;
	area.h = tmp->h;

	// works around a nasty bug with 32 bpp textures that needed to be resized
	if ( tmp->format->BitsPerPixel == 32)
		SDL_SetAlpha(tmp, 0, 0);

	// copy tmp into the new surface
	SDL_BlitSurface(tmp, &area, image, &area);
	SDL_FreeSurface(tmp);	// free the original surface here, otherwise it leaks
	tmp = 0;
}

// -------------------------------------------------

TextureHandle textureHwnd;

glGenTextures(1, &;



// we're always creating RGBA textures
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image->pixels);

textureHwnd.width = w;
textureHwnd.height = h;
textureHwnd.physicalWidth = physicalW;
textureHwnd.physicalHeight = physicalH;
textureHwnd.refCount = 1;

if (tmp) SDL_FreeSurface(tmp);
else SDL_FreeSurface(image);

// now add it to map
(_itor->second)->insert(TextureGroupPair(_textureName, textureHwnd));

return true;

Btw, if you want to paste code, do it inside [source] tags (write "source" inside square brackets).

Thanks for fixing my post. I used < code > < /code > brackets; sorry about that. I'll be sure to keep that in mind next time.

And thanks for pointing out my problem; I really had no idea it was that function call messing things up for me. That was the code I had the most faith in, since it was a direct copy from the NeHe tutorials.

Well, thanks :)

koshmaar: thanks, I'll take a look at your code and see what I can change in my manager's routines. I know some of it needs cleaning; a bit of that code was taken from tutorials, and I haven't gotten around to changing it yet, since it seemed to work for the most part. :)

GL_BGR isn't defined anywhere in my OpenGL headers. Where am I supposed to find it?

You'll need a copy of glext.h. If it still doesn't work, you'll still need glext.h, but you'll have to use GL_BGR_EXT instead.
(GLee lets you use GL_BGR directly and is to be preferred for extension stuff.)

Thanks to everyone who helped me! :) :) Colors are working again.
Gotta love Microsoft for not offering more updates to OpenGL.

Thanks for pointing me to those files, _phantom_. And thanks for the code, koshmaar; I think I'm going to use your size conversion idea.
