I am writing some GLSL skeletal animation code, and I am passing the weight information to the vertex shader as unsigned bytes, something like this:
glVertexAttribPointer(weight_indices, 4, GL_UNSIGNED_BYTE, GL_FALSE, 8, (const GLvoid *)0);
glVertexAttribPointer(weights, 4, GL_UNSIGNED_BYTE, GL_TRUE, 8, (const GLvoid *)4);
In the shader (I am using #version 140) I receive those like so:
in uvec4 weight_indices;
in vec4 weights;
The weights come out right: normalization turns each byte into a float in the range [0, 1], and the four weights sum to 1 or thereabouts. However, weight_indices is not getting the values I expect, which would be a simple widening cast from uint8_t to uint32_t.
I think there must be something I am missing. Reading the manual page for glVertexAttribPointer, integer values are converted to floats, so there may be some sort of casting issue when that float is converted back to an int for the uvec4... but I am just guessing here. Any ideas?