GL_RGBA16F_ARB vs GL_RGBA16

Recommended Posts

To the best of my knowledge, GL_RGBA16 is just like any other fixed-point (non-float) format, only with higher precision, so it will still clamp your values to [0, 1]. It is also more widely supported, as it has been around since GL 1.1.

The big distinction is that GL_RGBA16F can hold values that aren't limited to the range [0, 1]. However, fewer platforms support it, and some support the format itself but not accessory features such as blending or filtering.

That's my current understanding of the two, so it may not be 100% correct. However, from my own small tests, that seems to describe the key differences.
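To illustrate the difference in practice, here is a minimal sketch of allocating a render-target texture with each format. It assumes a current GL context and that GL_RGBA16F is available (via ARB_texture_float or core GL 3.0+); the helper name and sizes are just for illustration.

```c
/* Minimal sketch: allocating a fixed-point vs. floating-point RGBA16 texture. */
#include <GL/gl.h>
#include <GL/glext.h>
#include <stddef.h>

GLuint make_texture(GLenum internal_format, int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    /* GL_RGBA16:      16-bit normalized integers; stored values clamp to [0, 1].
       GL_RGBA16F_ARB: 16-bit half floats; values outside [0, 1] are preserved. */
    glTexImage2D(GL_TEXTURE_2D, 0, internal_format, width, height, 0,
                 GL_RGBA, GL_FLOAT, NULL);
    return tex;
}

/* usage:
   GLuint fixed_rt = make_texture(GL_RGBA16,      512, 512);
   GLuint float_rt = make_texture(GL_RGBA16F_ARB, 512, 512);
*/
```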

I need to make a correction to my previous post, based on the following thread and the linked document.

http://www.gamedev.net/community/forums/topic.asp?topic_id=454343
http://developer.nvidia.com/object/nv_ogl_texture_formats.html

At least on NVIDIA hardware, RGBA16 render targets are internally represented as RGBA8 targets.
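One way to check what a particular driver actually allocated is to query the per-channel bit depth after creating the texture. This is only a sketch (and the reported sizes are themselves driver-dependent, so treat them as a hint rather than a guarantee); `tex` stands for a texture created as above.

```c
/* Sketch: query how many bits per channel the driver actually allocated
   for a texture requested as GL_RGBA16 or GL_RGBA16F_ARB. */
GLint red_bits = 0, internal = 0;
glBindTexture(GL_TEXTURE_2D, tex);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_RED_SIZE, &red_bits);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internal);
printf("internal format 0x%04X, %d red bits per channel\n", internal, red_bits);
```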
