asafcarmi

16-bit monochrome textures

I'm trying to load a 16-bit monochrome texture using a pbuffer, and then read it back to memory. I tried several configurations, but none worked well. I'm using resident textures initialized with this call:
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_ALPHA, 2048, 2048, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);
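(A quick sanity check worth running here, as a sketch, assuming the texture is still bound: GL_LUMINANCE_ALPHA is an unsized internal format, so the driver chooses the component depth and is allowed to silently fall back to 8 bits. glGetTexLevelParameteriv can report what was actually allocated:)

GLint internalFormat = 0, lumBits = 0;
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &internalFormat);
glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_LUMINANCE_SIZE, &lumBits);
/* lumBits == 8 here would mean the texture was demoted to 8 bits per component */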
When loading the texture, I'm using:
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_LUMINANCE, GL_UNSIGNED_SHORT, imgBuffer);
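(One thing worth ruling out, as a sketch: glTexSubImage2D honors the unpack alignment, which defaults to 4 bytes, while tightly packed rows of GL_UNSIGNED_SHORT luminance data are only 2-byte aligned when w is odd. A 2048-wide upload happens to be safe, but setting the alignment explicitly costs nothing:)

glPixelStorei(GL_UNPACK_ALIGNMENT, 2);  /* tightly packed 16-bit rows */
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h, GL_LUMINANCE, GL_UNSIGNED_SHORT, imgBuffer);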
When I read the pixels back to memory, I see that all the components (r, g, b) are the same, but wrong. I'm using:
glReadPixels(0, 0, x, y, GL_GREEN, GL_UNSIGNED_SHORT, imgBuffer);
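(Another way to narrow things down, as a sketch: glGetTexImage pulls the texel data straight out of the texture object, bypassing the pbuffer's color buffer entirely. If this returns correct values while glReadPixels doesn't, the precision is being lost in the pbuffer rather than in the texture:)

glPixelStorei(GL_PACK_ALIGNMENT, 2);
/* read the bound texture directly; no rendering or pbuffer readback involved */
glGetTexImage(GL_TEXTURE_2D, 0, GL_LUMINANCE, GL_UNSIGNED_SHORT, imgBuffer);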
I tried several internal formats, such as GL_LUMINANCE, GL_LUMINANCE16 (gave only zeros), GL_LUMINANCE16_ALPHA16, etc.; each gave different output.

I also tried initializing the pbuffer with different configurations. Using wglChoosePixelFormatARB, I can't get a configuration with 16-bit color components when I request WGL_DRAW_TO_PBUFFER_ARB, 64 color bits, 16 bits each for red, green, blue, and alpha, 24 depth bits, and WGL_FULL_ACCELERATION_ARB (the full attribute list is spelled out in the sketch below). With those attributes I only get formats with 8 bits per component, yet when I call the function without attributes, a format with the above configuration does appear. I tried to force that format, but I still got wrong values: for example, when my texture held 0xFF or 0xF0 I got back 257 in imgBuffer, and when it held 0xF I got 0. (257 is exactly what an 8-bit value of 1 expands to over 16 bits, so it looks as if the data is being quantized to 8 bits per component somewhere.) I'm using a GeForce 6200.
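(For reference, here is that failing request written out, as a sketch; hdc is assumed to be the application's device context, and the entry points come from WGL_ARB_pixel_format and WGL_ARB_pbuffer:)

int attribs[] = {
    WGL_DRAW_TO_PBUFFER_ARB, GL_TRUE,
    WGL_COLOR_BITS_ARB,      64,
    WGL_GREEN_BITS_ARB,      16,
    WGL_RED_BITS_ARB,        16,
    WGL_BLUE_BITS_ARB,       16,
    WGL_ALPHA_BITS_ARB,      16,
    WGL_DEPTH_BITS_ARB,      24,
    WGL_ACCELERATION_ARB,    WGL_FULL_ACCELERATION_ARB,
    0  /* terminator */
};
int format = 0;
UINT numFormats = 0;
wglChoosePixelFormatARB(hdc, attribs, NULL, 1, &format, &numFormats);
/* numFormats comes back 0: no fixed-point 16-bit-per-component format matches */

(On hardware of this generation, 16 bits per color component is typically exposed only through float pixel formats, e.g. WGL_TYPE_RGBA_FLOAT_ARB from WGL_ARB_pixel_format_float or the WGL_NV_float_buffer path, which may be why the fixed-point request matches nothing.)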
