elgransan

screen pixelformat confusion

Recommended Posts

Hello, I can't figure this out: my game works properly in 32-bit mode, but since not all video cards support that mode I want to add 16-bit screen resolution support. My problem is that 16-bit mode is normally R5G6B5, and I'm relying on an A8 value for transparencies, so how can I solve that? I tried to set an R4G4B4A4 pixel format but it doesn't work. I think my mistake was depending on the alpha channel of 32-bit mode. What do real game programmers do?

Why do you need an alpha channel on the frame buffer? It seems kind of unnecessary to me, unless you're doing some weird operations on the screen with alpha tests and so forth.
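
For what it's worth, the framebuffer's layout is decided when the GL context is created, not by the textures. Under Win32/WGL that looks roughly like the sketch below; SetupPixelFormat16 is just an illustrative helper name, and hdc is assumed to be the window's device context.

#include <windows.h>

/* Rough sketch: request a 16-bit color buffer with no destination alpha.
   ChoosePixelFormat returns the closest format the driver supports. */
static BOOL SetupPixelFormat16(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 16;   /* usually ends up as R5G6B5 on a 16-bit desktop */
    pfd.cAlphaBits = 0;    /* no alpha bits in the framebuffer itself */
    pfd.cDepthBits = 16;

    format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0)
        return FALSE;
    return SetPixelFormat(hdc, format, &pfd);
}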

Hmm... you are right, I don't need an alpha channel in the frame buffer, I only need it in the textures... damn it!

So now I have another question: if I set a 16-bit mode, I get an R5G6B5 framebuffer, so I have to convert my 32-bit images to that pixel format, right? And what happens with the alpha channel? Do I have to create an R5G6B5A8 texture, or A4, or A5... or what? I'm confused.

Thanks for the reply, and sorry for my English; it took me about 10 minutes to write these 3 lines :S

[edited by - elgransan on August 17, 2003 4:19:23 PM]

You can store your textures in whatever format you like, but before you blit them to the screen you'll have to multiply the color values by the alpha that you want on the texture (either using the texture's alpha channel, a global alpha constant, or a combination of the two).
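
In fixed-function OpenGL that multiplication is what blending on source alpha does for you, so a common setup (a sketch, assuming the texture is already bound and carries an alpha channel) is:

/* Standard "source over" blending: the incoming color is scaled by its alpha. */
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);

/* Optional global alpha combined with the texture's own alpha: */
glColor4f(1.0f, 1.0f, 1.0f, 0.5f);   /* 50% overall transparency */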

Ok, I understood that.

So now, technically, how can I do that?


glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, OBK_TEXTURE_SIZE, OBK_TEXTURE_SIZE, 0, GL_RGB, GL_UNSIGNED_BYTE, pubImageData);

Which format, internalFormat, and data type do I have to select to create an R5G6B5AX texture? (where X >= 4)

Thanks a lot MrOreo & DeathWish for your advice!

The format and data type are independent of the pixel format. Just leave those as they are; the data will be converted by OpenGL if the external and internal formats don't match. The internal format should also be left as it is. GL_RGB is a generic format, since it doesn't specify any particular size, and that lets the driver choose the most appropriate internal format itself. Specifying the size explicitly (GL_RGB8, for example) is only a hint anyway, and the driver is free to ignore it and treat it as a generic format.
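
For illustration, the two styles look roughly like this (w, h, and pixels stand in for the texture size and source data; GL_RGB5_A1 is one of the sized formats the driver may or may not honor):

/* Generic internal format: the driver picks the actual storage layout. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* Sized internal format as a hint, e.g. 5:5:5:1 to suit a 16-bit display;
   the external format/type describing pixels stays the same. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5_A1, w, h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);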

OK Bob, but if I leave them as they are, I get wrong colors and the performance drops a lot. For example, at 800x600x32 on a GeForce2 I get 250 fps, but at the same resolution in 16 bits I get 1 to 2 frames per second and wrong colors. So I thought OpenGL was doing the conversion very badly... What's going on? Does OpenGL hate me?
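
A drop from 250 fps to 1-2 fps usually points to the 16-bit mode having ended up on a non-accelerated (software) pixel format rather than to OpenGL's texture conversion. One quick way to check, once the context is current, is to print the renderer string (a sketch; "GDI Generic" is what Microsoft's software implementation reports):

/* If this prints "GDI Generic", the 16-bit pixel format fell back to
   the software renderer and no hardware acceleration is being used. */
printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
printf("GL_VENDOR:   %s\n", (const char *)glGetString(GL_VENDOR));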
