retrieving alpha value


Hi,

I'm uploading the data with this call:

glTexImage3D(GL_TEXTURE_3D, 0, GL_ALPHA8, textureWidth, textureHeight,
             textureDepth, 0, GL_ALPHA, GL_UNSIGNED_BYTE, index3D);

But when I try to read it back in GLSL with this:

float currIndex = texture3D(texIndex, coord3Index).a;

I can't get any of the values out of it; it only ever gives me 0. I'm pretty sure the texIndex sampler is set up correctly, but I keep getting a constant 0.

Is there something wrong with my code?
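In case it matters, this is roughly how I bind the texture and set the texIndex uniform (the program and texture handles here are just placeholders for the ones I actually use):

// Bind the 3D texture to unit 0 and point the sampler3D uniform at it.
glActiveTexture(GL_TEXTURE0);
glBindTexture(GL_TEXTURE_3D, texname);

glUseProgram(program);
GLint loc = glGetUniformLocation(program, "texIndex");
glUniform1i(loc, 0);   // sampler3D texIndex -> texture unit 0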

Thanks in advance.

What are you doing with this result? Is it really supposed to be an index? The sampled value will always be normalized to the 0 to 1 range, so if you're trying to treat it like some kind of integer index it will always look like 0, I think.

Just a guess based on your variable name. How do you know it is always 0?
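To show what I mean by normalization, here is the arithmetic in plain C++ (a standalone sketch, not your code): an 8-bit value you upload comes back divided by 255, and you have to multiply back up to recover an integer index.

#include <cmath>
#include <cstdio>

int main() {
    // A GL_UNSIGNED_BYTE channel is normalized to [0, 1] when sampled.
    unsigned char stored = 3;                  // e.g. an index you uploaded
    float sampled = stored / 255.0f;           // what texture3D(...).a would return
    std::printf("sampled = %f\n", sampled);    // ~0.0118, which "looks like" 0

    // Undo the normalization to get the integer index back.
    int index = static_cast<int>(std::lround(sampled * 255.0f));
    std::printf("index = %d\n", index);        // 3 again
    return 0;
}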

Well, the alpha value will become the third coordinate (the z coordinate / depth) of a lookup into another texture3D, which is addressed in the 0.0-1.0 range, so that part isn't my concern for now. I know it's zero because when I write

if (currIndex == 0.0) {
    color = vec4(0.0, 0.5, 0.0, 1.0);
}

everything gets painted green. I've checked the index3D data, and I even set it all to unsigned char 255, which should come out as 1.0 in GLSL, but it's still green. Did I miss anything in the syntax?

What happens if you try GL_RED instead of GL_ALPHA and read from the .r channel? I think GL_ALPHA was removed in OpenGL 3.2; not sure what version you're using.

Are you checking for OpenGL errors?
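If you're not already, something like this right after the glTexImage3D call would tell you whether the driver rejected it (just a sketch; glGetError returns and clears one recorded error per call, so it can be worth calling it until it reports GL_NO_ERROR):

// Check for errors immediately after the texture upload.
GLenum err = glGetError();
if (err != GL_NO_ERROR) {
    // Typical culprits: GL_INVALID_ENUM, GL_INVALID_VALUE, GL_INVALID_OPERATION.
    fprintf(stderr, "glTexImage3D failed: 0x%04X\n", err);
}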

Interestingly, that works :-s What happened? Aren't r, g, b, and a all represented with 8-bit values?

Okay, now I'm moving it to a 3D texture, but I get a bad access error with this:


GLuint depth = 10;
w = 100;
h = 100;
unsigned char* textureAlpha = new unsigned char[w * h * depth];

for (int j = 0; j < w * h * depth; j++) {
    textureAlpha[j] = 250;
}

glGenTextures(1, &texname3);
glBindTexture(GL_TEXTURE_3D, texname3);
glTexImage3D(GL_TEXTURE_3D, 0, GL_RED, w, h, depth, 0, GL_RED, GL_UNSIGNED_BYTE, textureAlpha);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

I've tried printing the whole textureAlpha buffer and it looks fine, but when I try to copy it to GPU memory with glTexImage3D() I get a bad access runtime error. Did I miss anything with the size? The same code worked when I implemented it in 2D (no depth), but once I add the depth (w*h*depth) it gives me that error. Any advice?
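For reference, this is the smallest self-contained version of the upload I can think of, with the allocation and the glTexImage3D dimensions kept in sync (texname3 is declared locally and std::vector is used here just so the snippet stands on its own):

#include <vector>

// Minimal 3D texture upload sketch: the buffer must hold exactly
// width * height * depth bytes for a single-channel GL_UNSIGNED_BYTE image.
const GLuint w = 100, h = 100, depth = 10;
std::vector<unsigned char> textureAlpha(w * h * depth, 250);

GLuint texname3 = 0;
glGenTextures(1, &texname3);
glBindTexture(GL_TEXTURE_3D, texname3);
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);   // tightly packed rows of bytes
glTexImage3D(GL_TEXTURE_3D, 0, GL_RED, w, h, depth, 0,
             GL_RED, GL_UNSIGNED_BYTE, textureAlpha.data());
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);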


Thanks in advance.
