Archived

This topic is now archived and is closed to further replies.

Radeon Texture Bit-depth


BlackSheep    100
For some reason, when creating textures using GL_RGB, my Radeon 9500 keeps spitting out textures that look as if they are 16bpp, despite being told to use 24/32bpp. I know there's a trick using GL_RGBA8 that works with TNT cards, but it doesn't help with the ATI. Is this an 'undocumented feature'?

Fingers_    410
Apparently, if you use GL_RGB you are not telling it to use 24/32bpp. The current ATI drivers don't have a setting for "default texture bit depth" that would let you specify what GL_RGB does, and the default seems to be 16-bit.

Just use GL_RGB8 or GL_RGBA8 to select the bit depth; it's not a trick and should work on all cards (unless your texture quality is specifically turned down in the drivers).
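
Something like this, for instance; just a sketch, the function name is made up and I'm assuming you already have a texture object and the loaded image data:

#include <GL/gl.h>

/* Upload an 8-bit-per-channel RGB image with an explicit sized internal
   format, so the driver can't silently fall back to 16 bits. */
void upload_rgb8_texture(GLuint texture_id, int width, int height,
                         const unsigned char *data)
{
    glBindTexture(GL_TEXTURE_2D, texture_id);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D,
                 0,                /* mip level                       */
                 GL_RGB8,          /* internal format: 8 bits/channel */
                 width, height, 0,
                 GL_RGB,           /* format of the source pixels     */
                 GL_UNSIGNED_BYTE, /* type of the source pixels       */
                 data);
}

Note that only the internal format (third argument) changes; the source data is still described as GL_RGB / GL_UNSIGNED_BYTE.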

BlackSheep    100
Looks like ATI just doesn't like me. GL_RGB8 achieves the amazing result of turning all my textures plain white, i.e. killing them.

Way to go, ATI. I buy your expensive hardware, only to find that I have to put up with pink bands on my white lightmaps. Ta very much, like.

JonnyQuest    331
quote:
Original post by BlackSheep
Looks like ATI just doesn't like me. GL_RGB8 achieves the amazing result of turning all my textures plain white, i.e. killing them.

Way to go, ATI. I buy your expensive hardware, only to find that I have to put up with pink bands on my white lightmaps. Ta very much, like.


I seriously doubt that it's ATI's fault; I haven't heard any complaints about OpenGL games not working (hey, not even Carmack has complained, and he would!).
So I strongly suggest re-reading the docs, your code, etc., as it's almost certainly something wrong on your end.
Unfortunately, it's hard to help you as I don't know exactly what you're doing.


- JQ
Yes, I do have holidays at the moment. And yes, that does explain the increased posting.

BlackSheep    100

glTexImage2D (GL_TEXTURE_2D, 0, GL_RGB8, width, height, 0, GL_RGB8, GL_UNSIGNED_BYTE, data);

gluBuild2DMipmaps(GL_TEXTURE_2D, 3, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);


OK, these are the lines causing me grief. 'data' is normal RGB data (no alpha channel). Switching to GL_RGB8 in the mipmap call gives me white textures.

It's quite frustrating, this. I wouldn't mind if the driver dithered my grayscale lightmaps, but for some reason it's tinting them slightly red in places, so I have what are effectively pink Mach bands all over the place.

I'm using ATI's latest drivers for XP, within which I can find no setting for internal texture formats or quality.
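
Out of curiosity I poked at the numbers. If the driver really is storing these at 16 bits it's presumably 5-6-5, so green keeps one more bit than red and blue and the three channels round differently, which would explain grey coming back slightly off-colour. This is just the packing arithmetic, so treat it as a guess; the real rounding may differ:

#include <stdio.h>

/* Quantise a few grey values to RGB565 and expand back to 8 bits per
   channel, the way a 16-bit texture would round-trip them. */
int main(void)
{
    for (int grey = 190; grey <= 210; grey += 5) {
        int r = grey >> 3;  r = (r << 3) | (r >> 2);  /* 5 bits */
        int g = grey >> 2;  g = (g << 2) | (g >> 4);  /* 6 bits */
        int b = grey >> 3;  b = (b << 3) | (b >> 2);  /* 5 bits */
        printf("grey %3d -> r=%3d g=%3d b=%3d\n", grey, r, g, b);
    }
    return 0;
}

Depending on the grey level the result comes back slightly pink or slightly green, which looks a lot like the bands I'm seeing.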

Guest Anonymous Poster
Try:

gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);
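
The second argument of gluBuild2DMipmaps is the internal format, so GL_RGB8 goes there; GL_RGB / GL_UNSIGNED_BYTE still describe the data you pass in. In context it would look something like this (just a sketch, the names are placeholders, and I'm assuming tightly packed RGB data):

#include <GL/gl.h>
#include <GL/glu.h>

/* Create a mipmapped texture stored at 8 bits per channel. */
GLuint create_rgb8_texture(int width, int height, const unsigned char *data)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height,
                      GL_RGB, GL_UNSIGNED_BYTE, data);
    return tex;
}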

BlackSheep    100
quote:
Original post by Anonymous Poster
Try:

gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB8, width, height, GL_RGB, GL_UNSIGNED_BYTE, data);


YAY!!!

Thanks a lot, that fixes it.

Guest Anonymous Poster
quote:
Original post by BlackSheep
YAY!!!

Thanks a lot, that fixes it.

Glad I could help. :-)
