_Flame_

OpenGL Convert D3DFMT_X8B8G8R8 to OpenGL format


Hello.

If I'm not mistaken, X8B8G8R8 means the X byte is present in each pixel of the texture but shouldn't be used.

To put it another way, the texture doesn't have an alpha channel, but each pixel is still 4 bytes. I'm a little confused: why waste file space?

But anyway, after reading such a texture from a file, I don't see how to create an OpenGL texture without converting the data.

I can use GL_BGRA and turn off alpha/blending, but that's a bit inconvenient if you are writing a library that returns a format to the user.

Any suggestions for this problem? Is the only option converting to three components and the GL_BGR format?


You're basically on the right track: the 4th 8-bit component is there in the data, but its value isn't used for anything, so it doesn't matter. For graphics it's often better to have four 8-bit components and ignore one than to have three, since 24-bit pixels break the natural 4-byte alignment and aren't great for a lot of hardware cases. However, as GPUs have evolved into general-purpose computing hardware, those rigid design rules have relaxed somewhat. For what you are doing with (nearly end-of-life) GL, I would just keep it as a 32-bit format and ignore the 4th byte.
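A minimal sketch of that "keep it 32-bit" route in C, assuming a current GL context and raw D3DFMT_X8B8G8R8 pixel data (the function name is made up for illustration). On a little-endian machine the bytes of each pixel sit in memory as R, G, B, X, which is exactly what GL_RGBA with GL_UNSIGNED_BYTE describes:

```c
#include <GL/gl.h>  /* assumes a valid GL context is already current */

/* Upload D3DFMT_X8B8G8R8 data without converting it. Requesting GL_RGB8
 * storage while describing the client data as 4-byte GL_RGBA makes GL
 * discard the unused X byte during upload; sampling then returns
 * alpha = 1.0, so no "turn off blending" workaround is needed. */
GLuint upload_x8b8g8r8(const void *pixels, GLsizei w, GLsizei h)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4); /* rows are 4-byte aligned */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB8, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    return tex;
}
```

Alternatively you can keep GL_RGBA8 storage and simply never read alpha in the shader; the sketch above just pushes the "ignore the 4th byte" decision into GL itself.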

Edited by Steve_Segreto

6 hours ago, Steve_Segreto said:

You're basically on the right track: the 4th 8-bit component is there in the data, but its value isn't used for anything, so it doesn't matter. For graphics it's often better to have four 8-bit components and ignore one than to have three, since 24-bit pixels break the natural 4-byte alignment and aren't great for a lot of hardware cases. However, as GPUs have evolved into general-purpose computing hardware, those rigid design rules have relaxed somewhat. For what you are doing with (nearly end-of-life) GL, I would just keep it as a 32-bit format and ignore the 4th byte.

Why end-of-life? As far as I know, OpenGL is not going to be deprecated; on the contrary, it will keep being maintained, and new features will be added in the future.

Back to my question. My confusion is not about the format itself but about wasting file space. We could keep 3 components in the file and then convert them to any format in memory.
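For what it's worth, that in-memory conversion is only a few lines; a sketch that pads tightly packed RGB bytes out to 4-byte RGBX (the function name is made up):

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Expand tightly packed 3-byte RGB pixels to 4-byte RGBX,
 * writing 0 into the unused X byte. Caller frees the result. */
static uint8_t *rgb_to_rgbx(const uint8_t *rgb, size_t pixel_count)
{
    uint8_t *rgbx = malloc(pixel_count * 4);
    if (!rgbx)
        return NULL;
    for (size_t i = 0; i < pixel_count; i++) {
        memcpy(rgbx + i * 4, rgb + i * 3, 3); /* copy R, G, B */
        rgbx[i * 4 + 3] = 0;                  /* X: present but unused */
    }
    return rgbx;
}
```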

Ignoring the last component is perfectly fine, but it's not so simple when you need to give the user a format from a library. Imagine you are writing a library that loads DX textures and converts them to OpenGL formats.
What format would it report? GL_BGRA would be wrong, since the data doesn't have a valid alpha channel. This question is more about a design decision, since there is no direct mapping from DX to OpenGL formats.
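One way a loader library can answer this is to hand back the full GL upload triple plus a flag saying whether alpha is meaningful; for X formats it can report GL_RGB8 storage so GL itself drops the padding byte. A hedged sketch of such an API (the struct, function, and prefixed enum names are made up; the numeric values are copied from the standard d3d9types.h and GL headers so the sketch compiles without them):

```c
#include <stdint.h>

/* D3D9 format codes, values as in d3d9types.h. */
enum { D3DFMT_A8R8G8B8 = 21, D3DFMT_X8R8G8B8 = 22,
       D3DFMT_A8B8G8R8 = 32, D3DFMT_X8B8G8R8 = 33 };

/* GL enum values, hard-coded here so no GL headers are needed. */
enum { MYGL_RGBA8 = 0x8058, MYGL_RGB8 = 0x8051, MYGL_RGBA = 0x1908,
       MYGL_BGRA = 0x80E1, MYGL_UNSIGNED_BYTE = 0x1401 };

/* What the loader hands back to the user (hypothetical API). */
typedef struct {
    uint32_t internal_format; /* storage to request from GL          */
    uint32_t format;          /* layout of the bytes in the file     */
    uint32_t type;
    int      has_alpha;       /* 0: the 4th byte is padding, ignore  */
} GlUploadInfo;

/* Returns 1 on success, 0 for an unsupported format. */
static int map_d3d_format(uint32_t d3dfmt, GlUploadInfo *out)
{
    switch (d3dfmt) {
    case D3DFMT_A8B8G8R8: /* bytes in memory: R, G, B, A */
        *out = (GlUploadInfo){ MYGL_RGBA8, MYGL_RGBA, MYGL_UNSIGNED_BYTE, 1 };
        return 1;
    case D3DFMT_X8B8G8R8: /* same layout, but the 4th byte is padding:
                             request RGB8 storage so GL discards it and
                             sampling returns alpha = 1.0 */
        *out = (GlUploadInfo){ MYGL_RGB8, MYGL_RGBA, MYGL_UNSIGNED_BYTE, 0 };
        return 1;
    case D3DFMT_A8R8G8B8: /* bytes in memory: B, G, R, A */
        *out = (GlUploadInfo){ MYGL_RGBA8, MYGL_BGRA, MYGL_UNSIGNED_BYTE, 1 };
        return 1;
    case D3DFMT_X8R8G8B8:
        *out = (GlUploadInfo){ MYGL_RGB8, MYGL_BGRA, MYGL_UNSIGNED_BYTE, 0 };
        return 1;
    default:
        return 0;
    }
}
```

With this shape the library never has to lie about alpha: the returned format describes the file bytes exactly, and `has_alpha` (or the RGB8 internal format) tells the user the 4th channel is meaningless.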

Edited by _Flame_

