benwestgarth

Some targas work, some don't - texture mapping

Recommended Posts

benwestgarth    122
Hi there, I have a basic sample app which uses targa images to map textures onto quadrics. The app works fine with the targa files which came with the source code, but when I try to apply my own targa files (which I downloaded from the net) the quadrics are rendered without textures. I am loading the targa as data type GL_UNSIGNED_BYTE and pixel format GL_RGB (I have also tried GL_RGBA). What other type/format might my targas be, and how can I find out? Cheers, Ben

Boku San    428
Check the size of the TGAs you downloaded. I forget the acceptable sizes, and there's actually a command that scales them to acceptable OpenGL sizes, but I have to go to bed right now (read: catch up on reading Dune)... you should try flipping through the Blue Book online (here) to find it.

If you want a quick and dirty fix, just scale the TGAs you have right now down to the same size as those in the (I'll assume this is NeHe, FYI) source. An incorrectly sized image will simply refuse to load as a texture, you know.

Night.

rick_appleton    864
First of all, for standard OpenGL, textures should have power-of-two resolutions. So width and height both need to be one of the following: 1, 2, 4, 8, 16, 32, 64, 128, ...

Secondly: Targa files can be run-length encoded, and maybe your loading code isn't counting on that. Here is a tutorial which shows you the difference.

skow    248
Make sure it passes the 2^n test, that you are not compressing, and that you are using 32 bits if you are doing RGBA and 24 bits if you are doing RGB.

James Trotter    432
Quote:
Original post by rick_appleton
First of all, for standard OpenGL, textures should have power-of-two resolutions.


Actually, I think that the OpenGL 2.0 specification has done away with this totally... But since we don't have an implementation yet, it doesn't matter much... *sigh*..

zedzeek    529
There are 2 types of targas: uncompressed, and those that use RLE compression. Is your loading code handling both cases? Also, with some targas the origin is at the bottom-left and with others it's at the top-left (though that should only make the image appear upside down).

extralongpants    704
I am having some trouble loading targa files too. Sometimes they show up after I load them, sometimes only a portion of the texture shows up, and sometimes nothing shows up. I stepped through my targa loading code, and it seems that when the targa files don't work I get some very strange values: the number of channels is 17 and the width is some number in the 4000s or something.

Each targa file either works or doesn't work; it is never a random occurrence, and I haven't been able to figure out why, so I gave up on it a while ago. I think it may have something to do with the way the files are saved. The problem seems to go away (in most cases) when I save the file at 16 bits, although many files in 32 and 24 bits have loaded perfectly.

Here is my TGA loading code:


err_const load_from_tga(const std::string &file_name, image &i)
{
    ui16 width, height;
    ui8 len, img_type, bits;
    i32 stride, idx, temp;

    serializer s;

    i32 err = s.open_file_for_reading(file_name.c_str());
    if (err != ERR_OK)
        return ERR_FILE_NOT_FOUND;

    // TGA header: id-field length, colour-map type (skipped), image type,
    // colour-map spec and image origin (9 bytes, skipped), then width,
    // height and bits per pixel
    s.read(len);

    s.in_stream().seekg(1, std::ios::cur);
    s.read(img_type);

    s.in_stream().seekg(9, std::ios::cur);

    s.read(width);
    s.read(height);
    s.read(bits);

    // skip the image-descriptor byte plus the id field
    s.in_stream().seekg(len + 1, std::ios::cur);

    if (img_type != TGA_RLE)
    {
        if (bits == 24 || bits == 32)
        {
            // initiate the image
            i32 err = i.init(width, height, bits / 8, file_name);
            if (err != ERR_OK)
            {
                s.close_file();
                return err;
            }

            stride = i.channels() * i.width();

            ui32 x, y;
            ui8 *line = 0;

            for (y = 0; y < i.height(); ++y)
            {
                line = &(i.m_data.at(stride * y));

                s.read((void*)line, stride);

                // swap BGR(A) to RGB(A)
                for (x = 0; x < stride; x += i.channels())
                {
                    temp = line[x];
                    line[x] = line[x + 2];
                    line[x + 2] = temp;
                }
            }
        }
        else if (bits == 16)
        {
            ui16 pixels;
            i32 r, g, b;
            ui32 i_3;

            i32 err = i.init(width, height, 3, file_name);
            if (err != ERR_OK)
            {
                s.close_file();
                return err;
            }

            stride = 3 * i.width();

            for (idx = 0; idx < (i.width() * i.height()); ++idx)
            {
                s.read(pixels);

                // expand 5-5-5 to 8 bits per channel
                r = (pixels & 0x1f) << 3;
                g = ((pixels >> 5) & 0x1f) << 3;
                b = ((pixels >> 10) & 0x1f) << 3;

                i_3 = idx * 3;

                i.m_data.at(i_3)     = r;
                i.m_data.at(i_3 + 1) = g;
                i.m_data.at(i_3 + 2) = b;
            }
        }
        else
        {
            s.close_file();
            return ERR_FAILED;
        }
    }
    else
    {
        ui8 rle_id;
        i32 colors_read = 0;

        i32 err = i.init(width, height, bits / 8, file_name);
        if (err != ERR_OK)
        {
            s.close_file();
            return err;
        }

        stride = i.channels() * i.width();

        ui8 *colors = new ui8[i.channels()];

        // reset the pixel counter before decoding
        idx = 0;

        while (idx < (i.width() * i.height()))
        {
            s.read(rle_id);

            if (rle_id < 128)
            {
                // raw packet: rle_id + 1 literal pixels follow
                ++rle_id;

                while (rle_id)
                {
                    s.read((void*)colors, SIZE_OF_UI8 * i.channels());

                    i.m_data.at(colors_read)     = colors[2];
                    i.m_data.at(colors_read + 1) = colors[1];
                    i.m_data.at(colors_read + 2) = colors[0];
                    if (bits == 32)
                        i.m_data.at(colors_read + 3) = colors[3];

                    ++idx;
                    --rle_id;
                    colors_read += i.channels();
                }
            }
            else
            {
                // run packet: one pixel repeated rle_id - 127 times
                rle_id -= 127;

                // read into the pixel buffer itself, not the address
                // of the colors pointer
                s.read((void*)colors, SIZE_OF_UI8 * i.channels());

                while (rle_id)
                {
                    i.m_data.at(colors_read)     = colors[2];
                    i.m_data.at(colors_read + 1) = colors[1];
                    i.m_data.at(colors_read + 2) = colors[0];
                    if (bits == 32)
                        i.m_data.at(colors_read + 3) = colors[3];

                    ++idx;
                    --rle_id;
                    colors_read += i.channels();
                }
            }
        }

        delete[] colors;
        colors = 0;
    }

    s.close_file();

    return ERR_OK;
}



The "serializer" class just contains an ifstream and an ofstream.

It could be that I have grown unable to see an obvious mistake. Sorry for the long chunk of code; I hate posting code. If anyone could take the time to look at it and tell me whether something is wrong, or even better, that nothing is wrong, I would be most grateful. I have begun to question whether or not the TGA exporter for Photoshop 6.0 is buggy. The oddest thing is, I've never had trouble with my TGA loading code before; it has always worked in the past. The only thing I have changed in the loading code is the use of std::ifstream instead of all the fopen()-related functions.

To the original poster:
I don't mean to steal your thread. I'm having similar problems, and I thought that it might help if I discussed my experiences with the same problem.

Thank you to those who took the time to read this terribly long post.

DudeMiester    156
Maybe the targas you downloaded are RLE compressed and you don't have code for that?

btw, if you have any of the GeForce 6xxx class cards then you can use non-power-of-two textures. They at least make render-to-texture stuff a little simpler. I think the Radeon Xxxx also supports non-power-of-two textures.

rick_appleton    864
odiusangel: that's very odd. I've compared your code that reads the header with my code, and it's identical (unless your seek or read isn't working correctly). If you step-debug through the code with an erroneous file, does it read the width and bits there incorrectly? If so, you're welcome to send the .tga file my way and I'll give it a swing.

extralongpants    704
Quote:
Original post by rick_appleton
odiusangel: that's very odd. I've compared your code that reads the header with my code, and it's identical (unless your seek or read isn't working correctly). If you step-debug through the code with an erroneous file, does it read the width and bits there incorrectly? If so, you're welcome to send the .tga file my way and I'll give it a swing.


Yeah, that is exactly what happens. I managed to create a non-working file. It should be 256x256 and 24-bit. Nothing shows up when I load it. You can download it here: members.lycos.co.uk/extralongpants/block.zip. Thank you for taking the time to help me out, I really appreciate it!

rick_appleton    864
That file works fine for me (clicky). And since the loading code seems correct (this is the first snippet of my loading code):


// Read in the length in bytes from the header to the pixel data
file.Read(&length, sizeof(byte));

// Jump over one byte
file.Seek(1, SEEK_CUR);

// Read in the imageType (RLE, RGB, etc...)
file.Read(&imageType, sizeof(byte));

// Skip past general information we don't care about
file.Seek(9, SEEK_CUR);

// Read the width, height and bits per pixel (16, 24 or 32)
file.Read(&width, sizeof(WORD));
file.Read(&height, sizeof(WORD));
file.Read(&bits, sizeof(byte));




I can only conclude that your file/stream reading code might not be working correctly. Possibly the seeks are the problem?
