screen pixelformat confusion

Hello, I can't figure this out. My game works fine in 32-bit mode, but since not all video cards support that mode I want to add support for 16-bit screen resolutions, and I have this problem: 16-bit mode is normally R5G6B5, and I'm using an A8 value for transparency. How can I solve that? I tried to set an R4G4B4A4 pixel format but it doesn't work. I think my mistake was relying on the alpha channel of 32-bit mode. What do real game programmers do?
Why do you need an alpha channel on the frame buffer? It seems kind of unnecessary to me, unless you're doing some weird operations on the screen with alpha tests and so forth.
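For ordinary transparency the blending only ever reads the texture's (source) alpha, never the frame buffer's, so a plain RGB frame buffer is enough. Just as a minimal sketch:

glEnable(GL_BLEND);
// dst = src * srcAlpha + dst * (1 - srcAlpha); destination alpha is never read
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);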
Mmm... you are right, I don't need an alpha channel in the frame buffer, I only need it in the textures... damn it!

So now I have another question: if I set a 16-bit mode I get an R5G6B5 frame buffer, so I have to convert my 32-bit images to that pixel format, right? And what happens with the alpha channel? Do I have to create an R5G6B5A8 texture, or A4, or A5... or what? I'm confused.

Thanks for the reply, and sorry for my English; it took me about 10 minutes to write these three lines :S

[edited by - elgransan on August 17, 2003 4:19:23 PM]
You can store your textures in whatever format you like, but before you blit them to the screen you'll have to multiply the color values by the alpha that you want on the texture (either using the texture's alpha channel, a global alpha constant, or a combination of the two).
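A rough sketch in C of multiplying by the texture's own alpha, assuming packed 8-bit RGBA data (the buffer and size names here are only placeholders):

/* pre-multiply each pixel's color by its own alpha (0..255) */
unsigned char *p = imageData;                    /* placeholder RGBA8 buffer */
for (int i = 0; i < imageWidth * imageHeight; ++i, p += 4)
{
    p[0] = (unsigned char)(p[0] * p[3] / 255);   /* R */
    p[1] = (unsigned char)(p[1] * p[3] / 255);   /* G */
    p[2] = (unsigned char)(p[2] * p[3] / 255);   /* B */
}

If you pre-multiply like this, the matching blend function is glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA).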
-=|Mr.Oreo|=- Code Monkey, Serpent Engine
Ok, I understood that.

So now, technically, how do I do that?


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, OBK_TEXTURE_SIZE, OBK_TEXTURE_SIZE, 0, GL_RGB, GL_UNSIGNED_BYTE, pubImageData);

Which format, internalFormat and data type do I have to select to create an R5G6B5AX texture (where X >= 4)?

Thanks a lot, MrOreo & DeathWish, for your advice!
The format and data type are independent of the pixel format. Just leave those as they are; the data will be converted by OpenGL if the external and internal formats don't match. The internal format should also be left as it is. GL_RGB is a generic format since it doesn't specify any particular size, and that lets the driver choose the most appropriate internal format itself. Specifying a size explicitly (GL_RGB8, for example) is only a hint anyway, and the driver is free to ignore it and treat it as a generic format.
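To make that concrete, here is a sketch of both variants, assuming the image data actually contains an alpha channel (four bytes per pixel, so GL_RGBA as the source format):

/* generic internal format: the driver picks the actual storage itself */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA,
             OBK_TEXTURE_SIZE, OBK_TEXTURE_SIZE, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pubImageData);

/* sized internal format as a hint, e.g. 16-bit color with 1-bit alpha;
   the driver is still free to pick something else */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1,
             OBK_TEXTURE_SIZE, OBK_TEXTURE_SIZE, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pubImageData);

GL_RGBA4 is another sized option if you want 4 bits of alpha.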
OK Bob, but if I leave them as they are, I get wrong colors and the performance drops a lot. For example, at 800x600x32 on a GeForce2 I get 250 fps, but at the same resolution in 16 bits I get 1 to 2 frames and wrong colors. So I thought OpenGL was doing the conversion very badly... what's happening? Does OpenGL hate me?
