16-Bit Monochrome Texture Problem

Started by boxsmiley. 6 comments, last by randomname 19 years, 7 months ago.
I cannot seem to get OpenGL to display 16-bit monochrome textures. I fill the buffer properly and pass GL_UNSIGNED_SHORT as the pixel data type argument to gluBuild2DMipmaps or glTexImage2D. Still, the data gets truncated to one byte per texel when the texture is built.
And what internal format did you specify? Plain 8 bits per R, G and B perhaps? Does the hardware you use even support anything above 8 bits? :)
I have experimented with all of the possible internal formats. The only one I have had any success with for this application is GL_LUMINANCE, because I don't want my buffer to be color. I need 16 bits in one component. My app uses 32-bit color, so I know the hardware supports enough bits. It just seems as though OpenGL or the hardware is hard-coded to allocate only 8 bits for each of the R, G, B and A components, and will not let me use just 16 bits total and put them all in one color component (monochrome).
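Roughly what the upload looks like (a sketch; the texture size and variable names are simplified for the post):

GLuint tex;
GLushort pixels[256 * 256];   /* 16-bit values, 0..65535 */
/* ... fill pixels ... */

glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE,           /* internal format */
             256, 256, 0,
             GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);  /* external format + type */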
With plain GL_LUMINANCE most drivers give you only an 8-bit format. Use GL_LUMINANCE16 (provided that your hardware can handle it).

Also, no extension is needed for it - the sized luminance formats have been in core OpenGL since 1.1. The driver is allowed to fall back to fewer bits if the hardware cannot store 16-bit luminance, though, so check what you actually got.
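Something along these lines (just a sketch; the query afterwards shows how many bits the driver actually allocated):

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16,          /* sized internal format */
             256, 256, 0,
             GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);

GLint lumBits = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
/* lumBits is 16 if the request was honoured, 8 if the driver fell back */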
Quote:Original post by boxsmiley
/.../ display 16-bit monochrome textures. /.../
Hm... just noticed. Is that a typo, or are you really expecting your monitor to SHOW 16 bits per channel? (Nearly all monitors are 8 bits per color channel.)
Thanks for the responses, random. In answer to your question, I am merely trying to accurately represent values of 0-65535 in monochrome as an OpenGL texture. I have only successfully done values of 0-255. I have queried a variety of different hardware using glXGetConfig and I get values like 5,5,5,0 for RGBA respectively. On my GF4 I think I got 5,6,5,0. I have tried GL_LUMINANCE16 and my texture did not get created successfully. It seems like I should have enough bits available to store this information; I just cannot get OpenGL to acknowledge it.
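Roughly what that query looks like on my side (a sketch; dpy and visinfo stand in for my existing Display* and XVisualInfo*):

int r = 0, g = 0, b = 0, a = 0;
glXGetConfig(dpy, visinfo, GLX_RED_SIZE,   &r);
glXGetConfig(dpy, visinfo, GLX_GREEN_SIZE, &g);
glXGetConfig(dpy, visinfo, GLX_BLUE_SIZE,  &b);
glXGetConfig(dpy, visinfo, GLX_ALPHA_SIZE, &a);
/* bits per channel of the framebuffer visual, e.g. 5,6,5,0 */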
Well, I have never had any NVIDIA hardware... but a GF4 sounds like LUMINANCE16 should not be a problem. (glXGetConfig is unknown to me - I'm on Windows.)

Maybe posting the relevant code would help (especially the lines that send the texture to OpenGL and set its parameters).

What OpenGL version do you have?

Also - what is the OpenGL error after the failed texture creation?

(Just in case: the internal format should be GL_LUMINANCE16, the external pixel format GL_LUMINANCE, and the data type GL_UNSIGNED_SHORT.)
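I.e. something like this right after the upload (just a sketch; width, height and pixels are placeholders, and the printout needs <stdio.h>):

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    fprintf(stderr, "glTexImage2D failed: 0x%x\n", err);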
Originally posted by boxsmiley at www.opengl.org:
does this seem like a valid test?

Nope.

(hm. didn't notice that you crossposted it - doesn't matter)

It seems that you DO expect the monitor to show more than 8 bits per channel! The test you describe doesn't show anything. As noted in the opengl.org forum thread, you can use a fragment program to test it, or read the texture back. Drawing it on screen doesn't tell you anything.
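For example, reading it back (a sketch, assuming the texture is still bound):

GLushort readback[256 * 256];
glGetTexImage(GL_TEXTURE_2D, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, readback);
/* compare with the source data: if only 8 bits survived, each value comes
   back with the low byte mirroring the high byte (v stored at 8 bits reads
   back as v*257) */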

