WebGL: hardware skinning with a bone texture

3 comments, last by Chananya Freiman 10 years, 9 months ago

I have WebGL code running hardware skinning in a vertex shader for animations, using a uniform array of matrices for my bones.
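For context, the uniform-array version of the shader looks roughly like this (a trimmed sketch, not my exact code; the array size is a placeholder):

uniform mat4 u_bones[62];  // one mat4 per bone; the count is capped by the uniform vector budget

mat4 boneMatrix(float bone) {
  return u_bones[int(bone)];
}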

The problem arises when I don't have enough vector slots, which happens when there are more than 62 bones (I don't control the models or the number of bones, and I've already seen a model with 173 bones, which is crazy).
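To see the budget concretely, the limit can be queried at runtime (a quick sketch, not part of my actual code):

var maxVectors = gl.getParameter(gl.MAX_VERTEX_UNIFORM_VECTORS);
// Each mat4 bone costs four vec4 slots, and every other uniform eats into the same budget.
console.log(maxVectors + " vec4 slots => at most ~" + Math.floor(maxVectors / 4) + " bone matrices");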

I tried using a float texture to store all my bones in, and fetch them in the shader, but I can't seem to do that correctly.

There is no texelFetch in WebGL's version of GLSL, no 1D textures and obviously no texture buffers or uniform buffers.

What I tried was creating an X-by-1 2D float texture, where X is the number of floats required for all the matrices, and filling it with all the matrices.

I send the shader the size of each matrix and the size of each vector, relative to the width of the texture, so I can get to any matrix with a normal texture fetch.
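On the JavaScript side, computing and uploading those fractions would look something like this (a sketch with made-up names; numFloats is the total number of floats in the packed matrices and program is the linked shader program):

// The RGBA float texture is numFloats / 4 texels wide.
var matrixFraction = 16 / numFloats; // one mat4 = 16 floats = 4 texels
var vectorFraction = 4 / numFloats;  // one vec4 column = 4 floats = 1 texel
gl.uniform1f(gl.getUniformLocation(program, "u_matrix_fraction"), matrixFraction);
gl.uniform1f(gl.getUniformLocation(program, "u_vector_fraction"), vectorFraction);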

I believe this should work...in theory. But it doesn't.

This is the texture initialization code:


// All bone matrices packed into one Float32Array (16 floats per mat4).
var buffer = new Float32Array(...);
...
// Width = byteLength / 16, i.e. one RGBA float texel per vec4 column.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, buffer.byteLength / 16, 1, 0, gl.RGBA, gl.FLOAT, buffer);
// NEAREST filtering so texels are fetched without interpolation.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
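For reference, a float texture in WebGL 1 also needs the OES_texture_float extension, and a non-power-of-two width needs CLAMP_TO_EDGE wrapping; a fuller sketch of that kind of setup (texture creation spelled out, names made up) would be:

// Float textures in WebGL 1 require this extension.
if (!gl.getExtension("OES_texture_float")) {
  throw new Error("OES_texture_float is not supported");
}
var boneTexture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, boneTexture);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, buffer.byteLength / 16, 1, 0, gl.RGBA, gl.FLOAT, buffer);
// Non-power-of-two textures must clamp and must not use mipmapped filtering.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);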

And the part of the vertex shader that constructs a matrix from the texture, given a matrix index:


...
uniform sampler2D u_bone_map;     // 2D float texture holding all the bone matrices
uniform float u_matrix_fraction;  // width of one mat4 relative to the texture
uniform float u_vector_fraction;  // width of one vec4 column relative to the texture
...
// Rebuild a bone's matrix from four consecutive texels on row 0.
mat4 boneMatrix(float bone) {
  return mat4(texture2D(u_bone_map, vec2(u_matrix_fraction * bone, 0)),
              texture2D(u_bone_map, vec2(u_matrix_fraction * bone + u_vector_fraction, 0)),
              texture2D(u_bone_map, vec2(u_matrix_fraction * bone + u_vector_fraction * 2.0, 0)),
              texture2D(u_bone_map, vec2(u_matrix_fraction * bone + u_vector_fraction * 3.0, 0)));
}
...

u_matrix_fraction and u_vector_fraction are the relative sizes described above.

For example, if the texture holds 512 floats, each matrix spans 16/512 of its width and each vector 4/512, so to get the i-th matrix the code goes to i * u_matrix_fraction and grabs four texels, stepping by u_vector_fraction.
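For context, the matrix returned by boneMatrix is then blended per vertex in main() in the usual way; a sketch (attribute and uniform names are placeholders, assuming four bone influences per vertex):

attribute vec3 a_position;
attribute vec4 a_bones;   // four bone indices stored as floats
attribute vec4 a_weights; // matching blend weights that sum to 1
uniform mat4 u_mvp;       // combined model-view-projection matrix

void main() {
  // Weighted blend of the four influencing bone matrices.
  mat4 skin = boneMatrix(a_bones.x) * a_weights.x +
              boneMatrix(a_bones.y) * a_weights.y +
              boneMatrix(a_bones.z) * a_weights.z +
              boneMatrix(a_bones.w) * a_weights.w;
  gl_Position = u_mvp * skin * vec4(a_position, 1.0);
}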

These matrices result in my meshes going crazy, so something is wrong, but I am just not sure what.

Got any ideas?

Thanks for any help.


I'm intrigued... really, I'm surprised; when I first heard of WebGL I had assumed that texture access in the vertex shader would not be supported.

.....

Well, after a quick two minutes of reading, it seems that Chrome and Firefox go beyond the baseline OpenGL ES 2.0 requirements and do expose vertex texture fetch (which the spec leaves optional). I guess you can do it.

Have you also tried printing what you're putting into the texture out to a text file, so you can see whether the data makes sense? That's usually how I debug.
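In the browser, the closest equivalent is probably dumping the array to the console; something like this (I'm guessing at the variable name):

// Log the first few matrices' worth of floats to eyeball the values.
console.log(buffer.subarray(0, 64));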


I don't have file access from JavaScript.

In any case, the data is the same data I otherwise use in the matrix uniform array (where it works as expected), so it's correct.

There's probably something really stupid and obvious I am doing and not noticing.

Have you tried using something like the WebGL Inspector?

I did not know that inspector existed; I'll check it out.

Thanks.

/Edit

Thanks a lot, that inspector really helped. Got the skinning working this time!

This topic is closed to new replies.
