OpenGL Texture Colors Are Wrong

DrStroodle    122
OK, first I'd like to say I'm new to OpenGL, but I've caught on quickly. I'm using a texture manager (works perfectly), created a tilemap, and a rigid body physics engine. So far so good... or so I thought. I'm working in 2D, ortho mode, using SDL + OpenGL. As of recently (I've been up for three days straight coding at this point), textures no longer display with the right colors. They all have a blue tint; it looks as though everything was drawn with glColor3f(0.0, 0.0, 1.0), but it isn't. In fact, to make sure, I put glColor3f(1.0, 1.0, 1.0) before every call that renders a textured quad, and it still shows up blue. I've reduced my code to the minimum needed to load a texture and render a textured quad, and it's still blue. My only guess is that something is wrong with my file loading routines, which wouldn't make sense since the colors were working previously. Anyway, here's my code; maybe you can all see something I don't:

//init openGL through SDL
	switch(mPixelBits) {
		case 16:
			SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 5 );
			SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 6 );
			SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 5 );
			break;
		case 24:
			SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
			break;
		case 32:
			SDL_GL_SetAttribute( SDL_GL_RED_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_GREEN_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_BLUE_SIZE, 8 );
			SDL_GL_SetAttribute( SDL_GL_ALPHA_SIZE, 8 );
			break;
	}
	SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, mPixelBits);	// depth buffer matches the pixel depth
	SDL_GL_SetAttribute(SDL_GL_BUFFER_SIZE, mPixelBits);
	SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);	// enable double buffering

and here is my OGL init code:

	glMatrixMode(GL_PROJECTION);						// Select The Projection Matrix
	glLoadIdentity();									// Reset The Projection Matrix

	glMatrixMode(GL_MODELVIEW);							// Select The Modelview Matrix
	glLoadIdentity();									// Reset The Modelview Matrix
	glShadeModel(GL_SMOOTH);							// Enable Smooth Shading
	glClearColor(0.0f, 0.0f, 0.0f, 0.5f);				// Black Background
	glClearDepth(1.0f);									// Depth Buffer Setup
	glEnable(GL_DEPTH_TEST);							// Enables Depth Testing
	glDepthFunc(GL_LEQUAL);								// The Type Of Depth Testing To Do
	glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);	// Really Nice Perspective Calculations

//and here is my loading a bitmap & converting to openGL texture function:
	int loadFromFile(std::string &fileName)
	{
		int status = false;
		SDL_Surface *textureImage = NULL;
		if( (textureImage = SDL_LoadBMP(fileName.c_str())) )
		{
			status = true;
			GLuint *glText = new GLuint();
			glGenTextures( 1, glText );

			/* Typical Texture Generation Using Data From The Bitmap */
			glBindTexture( GL_TEXTURE_2D, (*glText) );

			int mode = GL_RGB;
			if (textureImage->format->BytesPerPixel == 3) {        // RGB 24bit
				mode = GL_RGB;
			} else if (textureImage->format->BytesPerPixel == 4) { // RGBA 32bit
				mode = GL_RGBA;
			}

			/* Generate The Texture */
			glTexImage2D( GL_TEXTURE_2D, 0, 3, textureImage->w,
				textureImage->h, 0, mode, //GL_BGR,
				GL_UNSIGNED_BYTE, textureImage->pixels );

			/* Linear Filtering */
			glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
			glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

			//make our data pointer point to the proper thing
		}

		/* Free up any memory we may have used */
		if (textureImage)
			SDL_FreeSurface(textureImage);

		return status;
	}

here is how I draw the quads:
	for(int t=0;t<3;t++)
	{
		if(tile.tileID[t]==-1)	//no tile
			continue;

		GLuint tx=tile.tileTex[t];
		glBindTexture(GL_TEXTURE_2D, tx);

		glBegin(GL_QUADS);
			// top left
			glTexCoord2f(0, 0); glVertex3f(0, 0, 0);
			// top right
			glTexCoord2f(1, 0); glVertex3f(mTileWidth, 0, 0);
			// bottom right
			glTexCoord2f(1, 1); glVertex3f(mTileWidth, mTileHeight, 0);
			// bottom left
			glTexCoord2f(0, 1); glVertex3f(0, mTileHeight, 0);
		glEnd();
	}
I'm completely lost here... All the textures come out funny, but the primitives I draw on top of them (physics simulation, wireframe rigid bodies) have perfectly fine colors. I can only assume I either did something _REALLY_ weird to OpenGL, or that it's loading my images improperly. BTW, image format: BMP, 24-bit, no alpha; and I have tried a multitude of pixel formats, all with the same result. Thanks in advance, I appreciate any help I can get (it's driving me nuts). And I apologize for any bad typing/rambling/idiocy, I haven't slept in several days. Happy coding all. [Edited by - _the_phantom_ on July 17, 2005 2:34:38 PM]

_the_phantom_    11250

glTexImage2D( GL_TEXTURE_2D, 0, 3, textureImage->w,
	textureImage->h, 0, mode, //GL_BGR,
	GL_UNSIGNED_BYTE, textureImage->pixels );

That's where your problem is.

Point one: you don't have to use '3' as the 3rd param, you can use GL_RGB.
Point two: 'mode'. You never set it to GL_BGR higher up in your code, and BGR is the colour format SDL uses.

Point three: use 'source' tags for long bits of code.
I've fixed your post this time.

Koshmaar    989
Hmmmmm, I looked at your code and found no critical errors (that is, bugs which would result in bluish textures). You've got little issues here and there. For example, in glOrtho(0.0f, mScreenWidth-1, 0.0, mScreenHeight-1, 1.0f, -1.0f) there's no need to subtract 1 from mScreenWidth if it's equal to 1024, 800, 640, etc.; it should match what you pass to glViewport(). Some things could also have been written better (e.g. the creation of the GLuint: why use dynamic memory allocation?). But AFAIK nothing serious.

Btw, are you sure you're using matching versions of the SDL libs? I.e. headers from 1.2.8, static lib from 1.2.8, DLL from 1.2.8, etc.

Anyway, here's code from my texture manager, which is working nicely with all sorts of .pngs: 32, 24, 16, 8 bit, power of two, not power of two, etc. Hope you'll be able to get something useful from it:

// it's doing the same as SDL_LoadBMP
SDL_Surface * tmp = IMG_Load(filename.c_str());

if (tmp == 0) return false;

// first check whether it is needed to expand texture size to powers of 2

usint w = tmp->w;
usint h = tmp->h;

usint physicalW = tmp->w;
usint physicalH = tmp->h;

SDL_Surface * image = tmp; // create alias

bool IsWidth = IsPowerOfTwo(w);
bool IsHeight = IsPowerOfTwo(h);

// -------------------------------------------------

if ((!IsWidth) || (!IsHeight) || (tmp->format->BitsPerPixel != 32))
{
   // we need to create new SDL_Surface with power of two dimensions and 32 bit color
   if ((!IsWidth) || (!IsHeight)) debug("This texture needs to be resized during startup.")
   if (tmp->format->BitsPerPixel != 32) debug("This texture needs to have altered color depth.")
   debug("In release versions, please remember to fix it so startups will be much shorter.")

   if (!IsWidth) w = PowerOfTwo(w);
   if (!IsHeight) h = PowerOfTwo(h);

   // -------------------------------------------------

   Uint32 rmask, gmask, bmask, amask;

   /* SDL interprets each pixel as a 32-bit number, so our masks must depend
      on the endianness (byte order) of the machine */
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
   rmask = 0xff000000;
   gmask = 0x00ff0000;
   bmask = 0x0000ff00;
   amask = 0x000000ff;
#else
   rmask = 0x000000ff;
   gmask = 0x0000ff00;
   bmask = 0x00ff0000;
   amask = 0xff000000;
#endif

   // -------------------------------------------------

   // now image is no longer alias to tmp, rather that new surface
   image = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, 32, rmask, gmask, bmask, amask);

   if ( image == 0 )
   {
      logError2("TextureMgr", "Couldn't allocate memory for expanding texture to power of 2")
      SC_ASSERT(!"Couldn't allocate memory for expanding texture to power of 2")
      return false;
   }

   SDL_Rect area;
   area.x = 0;
   area.y = 0;
   area.w = tmp->w;
   area.h = tmp->h;

   // there was a nasty bug here with 32 bpp textures that needed to be resized
   if ( tmp->format->BitsPerPixel == 32)
      SDL_SetAlpha(tmp, 0, 0);

   // copy the tmp into the new GL texture image
   SDL_BlitSurface(tmp, &area, image, &area);
   SDL_FreeSurface(tmp); // the original surface is no longer needed
   tmp = 0;
}
// -------------------------------------------------

TextureHandle textureHwnd;

glGenTextures(1, &textureHwnd.textureID); // assumed member name; garbled in the original post
glBindTexture(GL_TEXTURE_2D, textureHwnd.textureID);

// we're always creating RGBA textures
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0, GL_RGBA, GL_UNSIGNED_BYTE, image->pixels);

textureHwnd.width = w;
textureHwnd.height = h;
textureHwnd.physicalWidth = physicalW;
textureHwnd.physicalHeight = physicalH;
textureHwnd.refCount = 1;

if (tmp) SDL_FreeSurface(tmp);
else SDL_FreeSurface(image);

// now add it to map
(_itor->second)->insert(TextureGroupPair(_textureName, textureHwnd));

return true;

Btw, if you want to paste code, do it inside source tags (write source inside brackets).

DrStroodle    122
Thanks for fixing my post; I used < code > < /code > brackets, sorry about that. I'll be sure to keep that in mind next time.

And thanks for pointing out my problem; I really had no idea it was that function call messing things up for me. That was the code I had the most faith in, since it was a direct copy off of the NeHe tutorials.

Well, thanks :)

Koshmaar: thanks, I'll take a look at your code and see what I can change in the routines for my manager. I know some of it needs cleaning; a bit of that code is taken from tutorials, and I haven't gotten around to changing it up yet since it seemed to work for the most part. :)

GL_BGR isn't defined anywhere in my OpenGL headers; where am I supposed to find it?

DrStroodle    122
Thanks to everyone who helped me! :) :) Colors are working again.
Gotta love Microsoft for not offering more updates to OpenGL.

Thanks for pointing me to those files, _phantom_. And thanks for the code, Koshmaar; I think I'm going to use that size conversion idea of yours.

