
Is it still possible to send 8-bit textures to the GPU?



#1 MarkS   Prime Members   -  Reputation: 880


Posted 14 April 2013 - 03:14 PM

I'm working on a font rendering library where the character textures only affect the alpha component, with the character color being sent via a uniform variable. Since the alpha component of the texture is 8 bits and the other components are unused, it makes sense to just create and send an 8-bit texture. The problem is that I am sticking with the 3.3 core profile, which means I no longer have access to GL_ALPHA and the other 8-bit texture formats. I thought about packing four pixels into one 32-bit component, but I cannot think of a way to unpack this in the shader.

 

Am I stuck sending 32 bits when I need only 8, or am I missing something?




#2 Brother Bob   Moderators   -  Reputation: 7929


Posted 14 April 2013 - 04:13 PM

Use the GL_RED format for single channel textures.
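
For example, here is a minimal sketch of uploading an 8-bit, single-channel glyph bitmap under the 3.3 core profile; glyphBitmap, width and height are placeholder names, not anything from your code:

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);

/* Tightly packed 8-bit rows are usually not 4-byte aligned. */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);

glTexImage2D(GL_TEXTURE_2D, 0, GL_R8,      /* 8 bits per texel internally  */
             width, height, 0,
             GL_RED, GL_UNSIGNED_BYTE,     /* one byte per texel on upload */
             glyphBitmap);                 /* placeholder data pointer     */

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);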



#3 MarkS   Prime Members   -  Reputation: 880


Posted 14 April 2013 - 04:16 PM

OK, let me ask you this: why was GL_ALPHA removed and GL_RED left? If I use GL_RED now, what are the chances that it will become deprecated in the near future? I saw GL_RED, but its use mystifies me, so I'm not sure how long it will stay.



#4 Brother Bob   Moderators   -  Reputation: 7929


Posted 14 April 2013 - 04:51 PM

The old API had several texture formats for different meanings of the same single-channel data: GL_ALPHA for alpha-only data, GL_LUMINANCE for grey scale with a unit alpha channel, and GL_INTENSITY for grey scale replicated into the alpha channel as well. Since you couldn't program the pipeline with the same flexibility as you can today, you had to give meaning to the texture data to ensure that reading a single-channel texture would populate the four color channels with the proper values. For example, a GL_LUMINANCE texture value of L would generate the RGBA color value (L, L, L, 1), but you had to use a different texture format if you instead wanted the RGBA color value (L, L, L, L) (note the difference: 1 or L for the alpha channel).
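
If you ever do want one of those old fixed behaviours back in the core profile, the texture swizzle state (core since 3.3) can reproduce it. A sketch of the idea, applied to a bound GL_R8 texture:

/* GL_LUMINANCE-like: a red value L is sampled as (L, L, L, 1). */
GLint luminance[4] = { GL_RED, GL_RED, GL_RED, GL_ONE };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, luminance);

/* GL_INTENSITY-like: sampled as (L, L, L, L). */
GLint intensity[4] = { GL_RED, GL_RED, GL_RED, GL_RED };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, intensity);

/* GL_ALPHA-like: sampled as (0, 0, 0, A). */
GLint alphaOnly[4] = { GL_ZERO, GL_ZERO, GL_ZERO, GL_RED };
glTexParameteriv(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_RGBA, alphaOnly);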

 

With today's flexible programmable pipeline, you no longer need to tell OpenGL your intended use of the texture data. You just expand the channels as you like with the swizzle operator; OpenGL no longer has to know about your intent with the texture, only your shader has to.
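
For the font case the shader ends up trivial. A sketch of what the fragment shader could look like, where glyphTex, glyphColor and uv are assumed names rather than anything prescribed:

#version 330 core

uniform sampler2D glyphTex;   // GL_R8 glyph texture (assumed name)
uniform vec3      glyphColor; // character color sent as a uniform (assumed name)

in  vec2 uv;
out vec4 fragColor;

void main()
{
    // The single red channel holds the glyph coverage; swizzle it however you like.
    float a   = texture(glyphTex, uv).r;
    fragColor = vec4(glyphColor, a);
}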

 

There is no reason any more to have three different formats for the same single-channel texture. Just do whatever you want with the single channel in your shader.

 

The chance of GL_RED being deprecated is zero, at least until the entire API fundamentally changes again as it did with 3.0 and the core profile. GL_RED is the way to specify a single-channel texture, just as GL_RG, GL_RGB and GL_RGBA are the ways to specify two-, three- and four-channel textures, respectively.



#5 MarkS   Prime Members   -  Reputation: 880


Posted 16 April 2013 - 03:32 PM

Thank you so much! Very helpful!

#6 Hodgman   Moderators   -  Reputation: 28772


Posted 16 April 2013 - 05:46 PM

Under the API/driver, the hardware will certainly support formats with 8, 16, 32, 64 and 128 bit strides between texels, and that's not likely to change any time soon either.



