RGBA makes my textures look worse.

4 comments, last by stefu 22 years, 9 months ago
Why do my textures look so bad when I switch from RGB to RGBA image format? You can see it at the following link: www.geocities.com/stefankjb/ogl/weird.html

Net Racing Demo:
http://www.geocities.com/stefankjb/download.html
What's the bit-depth of the textures? It looks to me like you're using 16-bit textures. In RGB, a 16-bit texture is R = 5 bits, G = 5 or 6 bits, B = 5 bits. In RGBA mode, 16-bit can use one of two masks: ARGB = 1555 or ARGB = 4444. If it's the second case, each channel has half as many possible levels, hence the ugly banding. If you want RGBA textures, make sure they are 32-bit.
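As a rough sketch of the difference (here "pixels", "w" and "h" are hypothetical stand-ins for your own image data), you can request either 16-bit mask explicitly with the sized internal formats:

  // "pixels" is assumed to be tightly packed 8-bit RGBA source data.
  // 5551 mask: 32 levels per colour channel, 1 bit of alpha.
  glTexImage2D( GL_TEXTURE_2D, 0, GL_RGB5_A1, w, h, 0,
                GL_RGBA, GL_UNSIGNED_BYTE, pixels );

  // 4444 mask: only 16 levels per channel, hence the banding.
  glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA4, w, h, 0,
                GL_RGBA, GL_UNSIGNED_BYTE, pixels );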

Are you using OpenGL? If so, the way to make 32-bit textures is to use:

  glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, ... );  // internal format is the third parameter


However, if your display mode is 16-bit, some graphics cards will not let you use 32-bit textures, so OpenGL will silently pick the next best mode (probably GL_RGBA4, which is the ugly mode I described before). If this is the case, change the display mode to 32-bit. If you do that, just using GL_RGBA should give you 32-bit textures.
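One way to see which mode the driver actually picked (a small sketch, assuming the texture in question is currently bound to GL_TEXTURE_2D) is to query the per-channel bit sizes:

  // Ask how many bits per channel the driver actually allocated.
  GLint redBits, greenBits, blueBits, alphaBits;
  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE,   &redBits );
  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_GREEN_SIZE, &greenBits );
  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_BLUE_SIZE,  &blueBits );
  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_ALPHA_SIZE, &alphaBits );
  // 4/4/4/4 here means you got GL_RGBA4 no matter what you requested.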

War Worlds - A 3D Real-Time Strategy game in development.
I'm using OpenGL and I have a Voodoo3 2000.
It must be that ugly mode, but it didn't help when I used 32-bit mode or GL_RGBA8; the result was just as ugly. Might it be the Voodoo? I think it's very much a 16-bit card.


Net Racing Demo:
http://www.geocities.com/stefankjb/download.html
Maybe, though I thought the newer Voodoo cards were 32-bit?

Anyway, try GL_RGB5_A1. It only gives you 1 bit of alpha, but if you're stuck with 16-bit then it's your best bet.

Did you try setting the display mode to 32-bit?

War Worlds - A 3D Real-Time Strategy game in development.
I tried the 32-bit mode. If I used GL_RGBA8 in glTexImage2D(...), all textures were white (no textures were created at all). In 32-bit mode GL_RGBA gave the same ugly look. I tried a lot of combinations but didn't get better quality with alpha.

Using GL_RGB5_A1 also gave all white, so the textures were not created!

Of course this isn't such a big problem, because I'm only using alpha with some textures (like fonts).

By the way, is my call to glTexImage2D correct:

  glTexImage2D( GL_TEXTURE_2D,
                0,                                           // base level
                4,                                           // number of components
                pImage->GetWidth(), pImage->GetHeight(), 0,  // w, h, border
                GL_RGBA,                                     // format
                GL_UNSIGNED_BYTE,                            // type
                pImage->GetData() );                         // data

My data is in the format r8g8b8a8. I changed the format parameter (to GL_RGB5_A1 ...). Does the format mean the format used by OpenGL or the format of pImage->GetData()?
There's also the "number of components" parameter: I saw someone use GL_RGB there, but the MSDN doc says it is the number of components. Again, does it mean the target or the source components?
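For what it's worth, a minimal sketch of how the parameters divide up (using the same pImage accessors as above): the third parameter is the internal, GPU-side format, while format and type describe the source data.

  // The third parameter is the INTERNAL format: how OpenGL should store
  // the texture.  The legacy values 1..4 just mean "n components"
  // (4 is equivalent to GL_RGBA, letting the driver pick the bit depth).
  // "format" and "type" describe the SOURCE data you pass in; for packed
  // r8g8b8a8 data they stay GL_RGBA / GL_UNSIGNED_BYTE no matter which
  // internal format you request -- the driver converts on upload.
  glTexImage2D( GL_TEXTURE_2D, 0,
                GL_RGB5_A1,          // target storage: 5551 on the card
                pImage->GetWidth(), pImage->GetHeight(), 0,
                GL_RGBA,             // source layout (unchanged)
                GL_UNSIGNED_BYTE,    // source component type (unchanged)
                pImage->GetData() );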


Net Racing Demo:
http://www.geocities.com/stefankjb/download.html
Voodoo 1-3 are 16-bit; Voodoo 4 and 5 are 32-bit.
Check the internal texture format demo on my site, but basically what happens is this: when you ask for an RGB texture you'll be given something like an R5G6B5 texture, and if you ask for an RGBA texture you'll get something like R4G4B4A4, i.e. each colour gets fewer bits and thus will look worse.

http://members.xoom.com/myBollux

