boxsmiley

OpenGL 16-Bit Monochrome Texture Problem


Recommended Posts

I cannot seem to get OpenGL to display 16-bit monochrome textures. I fill the buffer properly and pass GL_UNSIGNED_SHORT as the pixel data type argument to gluBuild2DMipmaps or glTexImage2D. Still, the data gets truncated to one byte per component when the texture is built.
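For reference, here is a minimal sketch of the kind of upload described above. It is not taken from the original post; the function name, buffer, and sizes are placeholders.

#include <GL/gl.h>
#include <GL/glu.h>

/* 'pixels' holds width*height GLushort values, one 16-bit sample per texel. */
void upload_mono16(const GLushort *pixels, int width, int height)
{
    /* Rows of 16-bit data are 2-byte aligned; the default unpack
       alignment of 4 can misread odd-width images. */
    glPixelStorei(GL_UNPACK_ALIGNMENT, 2);

    /* The third argument (the internal format) decides how many bits the
       driver actually keeps per texel; GL_UNSIGNED_SHORT only describes
       the layout of the client-side buffer. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);

    /* gluBuild2DMipmaps takes the same format/type pair:
       gluBuild2DMipmaps(GL_TEXTURE_2D, GL_LUMINANCE, width, height,
                         GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels); */
}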

And what internal format did you specify? Plain 8 bits per R, G, and B perhaps? Does the hardware you use even support anything above 8 bits? :)

I have experimented with all of the possible internal formats. The only one I have had any success with for this application is GL_LUMINANCE, because I don't want my buffer to be color; I need 16 bits in one component. My app uses 32-bit color, so I know the hardware supports enough bits. It just seems as though OpenGL or the hardware is hard-coded to allocate only 8 bits for each of the R, G, B, and A components and will not let me use just 16 bits total and allocate them all to one color component (monochrome).

GL_LUMINANCE is an 8-bit format. Use GL_LUMINANCE16 (provided that your hardware can handle it).

Also, keep in mind that a sized internal format is only a request - if the hardware cannot store 16 bits per texel, the driver silently falls back to fewer. With any non-ancient hardware it should be fine in base OpenGL, though. (At least it is with OpenGL 1.5 class hardware.)
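For illustration, a sketch (placeholder names, not from the post) of requesting GL_LUMINANCE16 and then asking the driver how many luminance bits it actually allocated:

#include <GL/gl.h>

/* Request a 16-bit single-channel texture, then query how many bits of
   luminance were really allocated; a result of 8 means the driver fell
   back to an 8-bit format. */
GLint upload_lum16_and_check(const GLushort *pixels, int width, int height)
{
    GLint lum_bits = 0;

    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);

    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                             GL_TEXTURE_LUMINANCE_SIZE, &lum_bits);
    return lum_bits;
}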

Quote:
Original post by boxsmiley
/.../ display 16-bit monochrome textures. /.../

Hm... just noticed. Is that a typo, or are you really expecting your monitor to SHOW 16 bits per channel? (Nearly all monitors are 8 bits per color channel.)

Thanks for the responses, random. In answer to your question, I am merely trying to accurately represent values of 0-65535 in monochrome as an OpenGL texture; I have only successfully done values of 0-255. I have queried a variety of different hardware using glXGetConfig and I get values like 5,5,5,0 for RGBA respectively; on my GeForce 4 I think I got 5,6,5,0. I have tried GL_LUMINANCE16 and my texture did not get created successfully. It seems like I should have enough bits available to store this information; I just cannot get OpenGL to acknowledge it.
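One note on that query: glXGetConfig reports the bit depths of the framebuffer visual (hence the 5,6,5,0), not which texture formats the driver can store. A sketch of checking the GL_LUMINANCE16 combination itself via the proxy texture target (function name and sizes are placeholders):

#include <GL/gl.h>

/* Ask, without allocating anything, whether a GL_LUMINANCE16 texture of
   the given size is accepted; the proxy state is zeroed on rejection. */
int lum16_supported(int width, int height)
{
    GLint test_width = 0;

    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_LUMINANCE16, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &test_width);

    return test_width != 0;
}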

Well, I have never had any NVIDIA hardware... but on a GF4, LUMINANCE16 should not be a problem. (glXGetConfig is unknown to me - I'm on Windows.)

Maybe posting the relevant code would help (especially the lines that send the texture to OpenGL and set its parameters).

What OpenGL version do you have?

Also - what is the OpenGL error after the failed texture creation?

(Just in case: the internal format should be GL_LUMINANCE16, the pixel format GL_LUMINANCE, and the data type GL_UNSIGNED_SHORT.)
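For reference, a small sketch of checking that error right after the upload call (the function name is a placeholder; gluErrorString just turns the code into readable text):

#include <stdio.h>
#include <GL/gl.h>
#include <GL/glu.h>

/* Call right after glTexImage2D / gluBuild2DMipmaps; GL_INVALID_ENUM
   usually points at a bad internal-format/format/type combination. */
void report_gl_error(const char *where)
{
    GLenum err = glGetError();
    if (err != GL_NO_ERROR)
        printf("%s: OpenGL error 0x%x (%s)\n",
               where, err, (const char *)gluErrorString(err));
}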

Quote:
Originally posted by boxsmiley at www.opengl.org:
does this seem like a valid test?

Nope.

(Hm, I didn't notice that you crossposted it - doesn't matter.)

It seems that you DO expect the monitor to show more than 8 bits per channel! The test you describe doesn't show anything. As noted on the opengl.org forum, you can use a fragment program to test, or read the texture back; drawing it on screen doesn't tell you anything.
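A sketch of the readback option (function and buffer names are placeholders): pull the texture back as 16-bit values and compare them with the data that was uploaded, instead of judging precision by what appears on screen.

#include <string.h>
#include <GL/gl.h>

/* Read the currently bound texture back as 16-bit luminance and compare
   it with the original data; if the driver only kept 8 bits, the low
   bytes will have been quantized away. 'readback' must hold
   width*height GLushort values. */
int texture_kept_16_bits(const GLushort *original, GLushort *readback,
                         int width, int height)
{
    glPixelStorei(GL_PACK_ALIGNMENT, 2);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, readback);

    return memcmp(original, readback,
                  (size_t)width * height * sizeof(GLushort)) == 0;
}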
