
## Recommended Posts

Hey there, I'm slowly working my way through porting my code from VB to C/C++. Below is the code I use to load .tga files for textures. I'm not getting any error messages, but I'm not getting the output I want: just an empty white cube. I've enabled texturing, bound the texture and set the texture coordinates correctly on the cube, so the problem must lie in this code. Many thanks.
[source lang=cpp]
#include <stdio.h>
#include <stdlib.h>

GLuint textureName;

typedef struct
{
	unsigned char imageTypeCode;
	short int imageWidth;
	short int imageHeight;
	unsigned char bitCount;
	unsigned char *imageData;
} TGAFILE;

bool extractTexture(char *path, TGAFILE *tgaFile)
{
	FILE *filePtr;
	long imageSize;
	int colourMode;

	// Header fields we read past but don't keep
	unsigned char imageIDLength;
	unsigned char colourMapType;
	short int colourMapOrigin;
	short int colourMapLength;
	unsigned char colourMapEntrySize;
	short int imageXOrigin;
	short int imageYOrigin;
	unsigned char imageDescriptor;

	filePtr = fopen(path, "rb");
	if (!filePtr)
	{
		writeLog("[Failed]\n", FALSE);
		return FALSE;
	}

	// Read the 18-byte TGA header field by field
	fread(&imageIDLength, sizeof(unsigned char), 1, filePtr);
	fread(&colourMapType, sizeof(unsigned char), 1, filePtr);
	fread(&tgaFile->imageTypeCode, sizeof(unsigned char), 1, filePtr);

	// Only uncompressed RGB (2) and greyscale (3) images are supported
	if ((tgaFile->imageTypeCode != 2) && (tgaFile->imageTypeCode != 3))
	{
		fclose(filePtr);
		writeLog("[Failed]\n", FALSE);
		return FALSE;
	}

	fread(&colourMapOrigin, sizeof(short int), 1, filePtr);
	fread(&colourMapLength, sizeof(short int), 1, filePtr);
	fread(&colourMapEntrySize, sizeof(unsigned char), 1, filePtr);
	fread(&imageXOrigin, sizeof(short int), 1, filePtr);
	fread(&imageYOrigin, sizeof(short int), 1, filePtr);
	fread(&tgaFile->imageWidth, sizeof(short int), 1, filePtr);
	fread(&tgaFile->imageHeight, sizeof(short int), 1, filePtr);
	fread(&tgaFile->bitCount, sizeof(unsigned char), 1, filePtr);
	fread(&imageDescriptor, sizeof(unsigned char), 1, filePtr);

	colourMode = tgaFile->bitCount / 8;
	imageSize = tgaFile->imageWidth * tgaFile->imageHeight * colourMode;

	tgaFile->imageData = (unsigned char*)malloc(sizeof(unsigned char) * imageSize);
	fread(tgaFile->imageData, sizeof(unsigned char), imageSize, filePtr);

	fclose(filePtr);
	writeLog("[Done]\n", FALSE);
	return TRUE;
}

bool loadTexture(char *path)
{
	TGAFILE *myTGA = (TGAFILE*)malloc(sizeof(TGAFILE));
	if (!extractTexture(path, myTGA))
	{
		free(myTGA);
		writeLog("[Failed]\n", FALSE);
		return FALSE;
	}

	glGenTextures(1, &textureName);

	char buffer[200];
	sprintf(buffer, "Creating Texture(ID:%i)", textureName);
	writeLog(buffer, TRUE);
	writeLog("[Done]\n", FALSE);

	glBindTexture(GL_TEXTURE_2D, textureName);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
	glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

	glTexImage2D(GL_TEXTURE_2D, 0, GL_BGR, myTGA->imageWidth, myTGA->imageHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, myTGA->imageData);

	return TRUE;
}
[/source]



##### Share on other sites
I haven't looked at the code 100%, but are you loading RGBA textures or RGB?

##### Share on other sites
I'm not 100% sure if the bug is here, but this isn't 100% correct OpenGL code:
glTexImage2D(GL_TEXTURE_2D,0,GL_BGR,myTGA->imageWidth,myTGA->imageHeight,0,GL_RGB,
GL_UNSIGNED_BYTE,myTGA->imageData);
The third parameter has to be not GL_BGR but 3 (or, if you load a texture with an alpha channel, it'd be 4 and the seventh parameter would be GL_RGBA). I'm using this for TGA loading:
glTexImage2D(GL_TEXTURE_2D,0,3,myTGA->imageWidth,myTGA->imageHeight,0,GL_RGB,GL_UNSIGNED_BYTE,myTGA->imageData);

##### Share on other sites
if((myTGA->bitCount/8)==3)
{
glTexImage2D(GL_TEXTURE_2D,0,myTGA->bitCount/8,myTGA->imageWidth,myTGA->imageHeight,0,GL_BGR,GL_UNSIGNED_BYTE,myTGA->imageData);
} else {
glTexImage2D(GL_TEXTURE_2D,0,myTGA->bitCount/8,myTGA->imageWidth,myTGA->imageHeight,0,GL_BGRA,GL_UNSIGNED_BYTE,myTGA->imageData);
}

Ok, I replaced the line in the code with the ones above. I'm making some progress: I can sort of make out the image, but each row seems to be offset by a pixel from the row above, and the image is black and white.

I've tried replacing GL_BGR and GL_BGRA with GL_RGB and GL_RGBA respectively, but I'm sure TGA data is stored in BGR format.

Any further suggestions?

##### Share on other sites
Just to clear up some misinformation about the third parameter. Passing numbers is the old way of doing it, where the number represents the number of color components. This is not the same as bitCount/8. In common cases, 24/8 (a 24-bit image) is 3, which coincides with the number of color components, being 3 for RGB.

This is, however, a conceptual error and the wrong way to determine the parameter. What if an RGB image has 16 bits per color component? 48/8 is 6, and your code fails, because the image doesn't have 6 color channels.

In "modern" OpenGL, with modern referring to OpenGL 1.1 or so, the parameter is the explicit format you want: GL_RGB for three-channel images, GL_RGBA for four-channel images, and so on. Note that GL_BGR is not a valid internal format, as the purpose of the internal format is only to determine how many color components are stored and which ones are present. So for a three-channel RGB image, GL_RGB is the proper parameter. The second- and third-to-last parameters are responsible for describing the physical memory layout of the source data: for example, GL_BGR and GL_UNSIGNED_BYTE if the source image is BGR with 8-bit components.
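To make the distinction concrete, here is a minimal sketch of that parameter choice for uncompressed 24- and 32-bit TGA data. The helper and its name are hypothetical (not from the original code); the enum values are copied from the OpenGL 1.2 core headers so the snippet stands alone:

```c
/* Values as defined in the OpenGL 1.2 core headers */
#define GL_RGB   0x1907
#define GL_RGBA  0x1908
#define GL_BGR   0x80E0
#define GL_BGRA  0x80E1

typedef struct
{
    unsigned internalFormat; /* what OpenGL stores: component count/kind */
    unsigned format;         /* memory layout of the source pixels */
} TexFormat;

/* Hypothetical helper: map the TGA bit depth to glTexImage2D parameters.
 * internalFormat never names a byte order; only format does. */
TexFormat chooseFormat(unsigned char bitCount)
{
    TexFormat f;
    if (bitCount == 32) { f.internalFormat = GL_RGBA; f.format = GL_BGRA; }
    else                { f.internalFormat = GL_RGB;  f.format = GL_BGR;  }
    return f;
}
```

You would then call glTexImage2D with `f.internalFormat` as the third argument and `f.format` as the seventh, keeping GL_UNSIGNED_BYTE as the type.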

##### Share on other sites
Hi there guys,

Well, I've gotten it pretty much working, thanks to your help. I was being stupid and had the texture at 33x33, which would be the cause of the mangled image and also the massive slowdown I encountered (with it being NPOT).

However, although for the most part it is now working, the colour is still slightly wrong: instead of showing a green logo it is displaying a blue logo. I can't seem to find what the problem is.
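For what it's worth, a quick way to catch dimensions like 33x33 before uploading is a power-of-two check. This small helper is a sketch, not part of the original loader:

```c
/* A power of two has exactly one bit set, so n & (n - 1) clears that bit
 * and leaves zero. Guard against n <= 0 as well. */
int isPowerOfTwo(int n)
{
    return n > 0 && (n & (n - 1)) == 0;
}
```

Pre-OpenGL-2.0 hardware generally wants both imageWidth and imageHeight to pass this check.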

##### Share on other sites

unsigned char temp;
for (long i = 0; i < imageWidth * imageHeight * colourMode; i += colourMode)
{
	temp = data[i];          // blue is byte 0 in BGR data
	data[i] = data[i + 2];   // red is byte 2; swap them
	data[i + 2] = temp;
}

Cheers!

/Robert

##### Share on other sites
As Brother Bob pointed out already, GL_BGR_EXT is ok as a format parameter, not as an internal format parameter.

So if you are loading TGA files the data should be stored as BGR(A), so you can use GL_BGR_EXT or GL_BGRA_EXT if your gfx card supports it.

glTexImage2D(GL_TEXTURE_2D,0,GL_RGB,myTGA->imageWidth,myTGA->imageHeight,0,GL_BGR_EXT,GL_UNSIGNED_BYTE,myTGA->imageData);

Try that

##### Share on other sites
Quote:
 Original post by MARS_999: As Brother Bob pointed out already GL_BGR_EXT is ok as a format parameter, not an internal format parameter. So if you are loading TGA files the data should be stored as BGRA so you can use GL_BGRA_EXT if your Gfx supports it.

Since I've already mentioned "modern" OpenGL, I might as well do it again. BGR formats were moved to the core in 1998 with OpenGL 1.2. Why still use the extension name for it?
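If you'd rather not assume a 1.2-era header, one way to guard at runtime is to check the string returned by glGetString(GL_VERSION) before relying on core GL_BGR. The parsing helper below is a sketch assuming the usual "major.minor..." prefix:

```c
#include <stdio.h>

/* Returns 1 if a "major.minor..." GL version string is at least 1.2. */
int atLeastGL12(const char *version)
{
    int major = 0, minor = 0;
    if (sscanf(version, "%d.%d", &major, &minor) != 2)
        return 0;
    return major > 1 || (major == 1 && minor >= 2);
}
```

Usage would be something like `atLeastGL12((const char *)glGetString(GL_VERSION))` once a context exists; fall back to the _EXT names (after checking the extension string) when it returns 0.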

##### Share on other sites
Quote:
 Why still use the extension name of it?

'coz it works maybe.

Some of us are lazy and can't be bothered going through the little pain of finding/making the right/decent up-to-date header file as a replacement.
