Radeon Texture Bit-depth

For some reason, when creating textures using GL_RGB, my Radeon 9500 keeps spitting out textures that look as if they are 16bpp, despite being told to use 24/32bpp. I know there's a trick using GL_RGBA8 that works with TNT cards, but it doesn't help with the ATI. Is this an 'undocumented feature'?
Apparently, if you use GL_RGB you are not actually asking for 24/32bpp; GL_RGB is an unsized format, so the driver picks the precision. The current ATI drivers don't have a setting for "default texture bit depth" that would let you specify what GL_RGB means, and the default seems to be 16-bit.

Just use GL_RGB8 or GL_RGBA8 to select the bit depth explicitly; it's not a trick and should work on all cards (unless your texture quality is specifically turned down in the drivers).
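For example, a minimal sketch (assuming your source pixels are tightly packed 24-bit RGB in 'data'):

glTexImage2D(GL_TEXTURE_2D,    /* target */
             0,                /* mipmap level */
             GL_RGB8,          /* sized internal format: 8 bits per channel */
             width, height,
             0,                /* border, must be 0 */
             GL_RGB,           /* layout of the client-side pixel data */
             GL_UNSIGNED_BYTE, /* type of the client-side pixel data */
             data);

The sized format goes in the third argument (the internal format); the format argument near the end describes your source data and stays plain GL_RGB.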
Looks like ATI just don't like me. GL_RGB8 achieves the amazing result of turning all my textures plain white; i.e. killing them.

Way to go ATI. I buy your expensive hardware, to find that I have to put up with pink bands on my white lightmaps. Ta very much, like.
quote:Original post by BlackSheep
Looks like ATI just don't like me. GL_RGB8 achieves the amazing result of turning all my textures plain white; i.e. killing them.

Way to go ATI. I buy your expensive hardware, to find that I have to put up with pink bands on my white lightmaps. Ta very much, like.

I seriously doubt that it's ATI's fault; I haven't heard any complaints about OpenGL games not working. (Hey, not even Carmack has complained, and he would!)
So I strongly suggest re-reading the docs, your code, etc., as there's almost certainly something wrong on your end.
Unfortunately, it's hard to help you as I don't know what exactly you're doing.


- JQ
Yes I do have holidays at the moment. And yes, that does explain the increased posting.
~phil
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB8, GL_UNSIGNED_BYTE, data);
gluBuild2DMipmaps(GL_TEXTURE_2D, 3, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);


OK, these are the lines causing me grief. 'data' is normal RGB data (no alpha channel). Switching to GL_RGB8 in the mipmap call gives me white textures.

It's quite frustrating, this. I wouldn't mind if the driver dithered my greyscale lightmaps, but for some reason it's tinting them slightly red in places, so I have what are effectively pink Mach bands all over the place.

I'm using ATI's latest drivers for XP, within which I can find no setting for internal texture formats or quality.
Try:

gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);
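
The second argument of gluBuild2DMipmaps is the internal format; a bare 3 just means "three components" and leaves the precision up to the driver, while the sized GL_RGB8 pins it to 8 bits per channel. Note that GL_RGB8 is only legal in that slot: in the format parameter (the one that describes the layout of 'data') it's an invalid enum, nothing gets uploaded, and you get the classic white texture. Assuming 'data' really is packed 24-bit RGB, the pair of calls should look something like:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB, GL_UNSIGNED_BYTE, data);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);

(You probably don't need both, by the way; gluBuild2DMipmaps uploads level 0 along with all the mipmap levels.)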
quote:Original post by Anonymous Poster
Try:

gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);


YAY!!!

Thanks a lot, that fixes it.
quote:Original post by BlackSheep
YAY!!!

Thanks a lot, that fixes it.
Glad I could help... :-)
