
Someg

Texture internal format question


When you use glTexImage2D to create a texture, parameter 3 (internalFormat) specifies how you want the texture to be stored internally by OpenGL. So GL_RGB means 24-bit color, GL_RGBA means 32-bit, etc. What happens if the format you specify is not supported by the hardware? Does OpenGL always select the "best fit" supported by the hardware, or does it keep the format you selected and fall back to software rendering instead? If it's the second case, is there a function you can use to examine which formats are hardware supported? Thank you.

Btw, I've tried almost all the available internal formats for a texture and there seems to be no difference in speed or appearance, so I believe OpenGL selects the "best fit" internal format. Does this happen on every card, or is it up to the driver?

[edited by - Someg on August 26, 2002 7:58:36 PM]
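For reference, a minimal sketch (not from the original post, in C against the OpenGL 1.x API of that era) of how you could probe this yourself: a GL_PROXY_TEXTURE_2D query tells you whether a given internal format and size can be created at all, and glGetTexLevelParameteriv with GL_TEXTURE_INTERNAL_FORMAT reports which internal format the driver actually chose for a real texture. The helper name check_internal_format is purely illustrative, and the sketch assumes an OpenGL context is already current; what the internal-format query returns in detail still varies by driver.

#include <GL/gl.h>
#include <stdio.h>

/* Probe how the driver handles a requested internal format at a given size.
 * Illustrative helper; assumes a current OpenGL context. */
void check_internal_format(GLint requested, GLsizei w, GLsizei h)
{
    GLint width = 0, chosen = 0;
    GLuint tex;

    /* 1. Proxy query: if the format/size combination cannot be created,
     *    the proxy texture's width is reported back as 0. */
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, requested, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0, GL_TEXTURE_WIDTH, &width);
    if (width == 0) {
        printf("internal format 0x%x not supported at %dx%d\n",
               (unsigned)requested, (int)w, (int)h);
        return;
    }

    /* 2. Create a real texture and ask which internal format the driver
     *    actually picked for it. */
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, requested, w, h, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_INTERNAL_FORMAT, &chosen);
    printf("requested 0x%x, driver chose 0x%x\n",
           (unsigned)requested, (unsigned)chosen);

    glDeleteTextures(1, &tex);
}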

