wacco

OpenGL GLSL dynamic size mat4 array

This topic is 4490 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.


Hiya, I'm playing around a bit with 'my first shader' in GLSL, and I've bumped into my first few problems. One I hacked around, but the other keeps me clueless.

In short: I'm trying to do character animation (rigging) on the GPU, which means moving my bone matrices there and doing the maths in the shader. I haven't actually figured out all the appropriate functions yet, so that part is missing, but I'm already having enough trouble with the matrices themselves. The thing is, I simply don't know in advance how many matrices I'll have, and as far as I know GLSL doesn't have pointer magic (which I could really use here). I also don't want to hard-code the count in the shader source, since it can change any moment by switching models, and I don't want a hard limit. Now, this should be possible if I understand the GLSL spec correctly:
Quote:
From http://oss.sgi.com/projects/ogl-sample/registry/ARB/GLSLangSpec.Full.1.10.59.pdf, §4.1.9 Arrays:

If an array is indexed with an expression that is not an integral constant expression or passed as an argument to a function, then its size must be declared before any such use. It is legal to declare an array without a size and then later re-declare the same name as an array of the same type and specify a size.
Which made me give it a try like this:
varying vec3 normal, lightDir;
attribute float bonenum;
uniform mat4 trix[];

void main() {	
	int bonenum2 = int(bonenum);
	mat4 trix[bonenum2]; // error here

	lightDir = normalize(vec3(gl_LightSource[0].position));
	normal = normalize(gl_NormalMatrix * gl_Normal);
		
//	gl_Position = ftransform();
//	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
	gl_Position = gl_ModelViewProjectionMatrix * trix[bonenum2] * gl_Vertex;
}
I'm not sure I'm declaring it correctly, since a mat4 is itself a two-dimensional array in the first place. Anyhow, the line marked 'error here' throws 'constant expression required' in my Shader Builder (I'm working on a Mac), but according to the spec that shouldn't be needed... or am I reading it wrong? It doesn't exactly say the size can be a variable, I get that much.

Basically I set the whole array of mat4s at once, and then per vertex I pass an attribute telling the GPU which matrix to use. I'm not passing the size of the array anywhere, but that shouldn't be a problem (as long as I don't have to recompile the shader and can just set it as a uniform).

The bonenum <-> bonenum2 conversion is my other 'problem': I can't pass integers as attributes. A double cast (one in OpenGL, one here) seems to solve that. It isn't perfect, but it'll work for now.

Can anybody enlighten me how this is supposed to work / what I'm doing wrong? Cheers!
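For reference, the spec passage quoted above only permits the later re-declaration to use a *constant* size, not a runtime value. A minimal sketch of the two forms (the size 32 is an arbitrary placeholder, not from the post):

```glsl
// Legal per GLSL 1.10 §4.1.9: declare unsized, then re-declare the
// same name, same type, with a compile-time constant size.
uniform mat4 trix[];    // size not yet known
uniform mat4 trix[32];  // later re-declaration with a CONSTANT size

// What the 'error here' line attempts -- sizing an array with a
// runtime value -- is exactly what "constant expression required"
// rejects:
// mat4 local[bonenum2]; // illegal: size must be a constant expression
```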

D'oh! I'm so blind sometimes. A few lines after my quote:
Quote:

There is no mechanism for initializing arrays at declaration time from within a shader.

Which makes my attempts futile, stupid, and pointless. Damn! :(

Let me just give this topic a little twist: does anyone know a proper way around this? Otherwise I'll just have to go for a hard limit of 20 matrices or so... far from perfect, but I am sort of running out of options after all. :\
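One common workaround from this era of GLSL (a sketch, not from the original thread) is to keep the compile-time constant bound but inject it per model: prepend a `#define` line to the shader source string before compiling, so the limit is chosen at load time instead of being baked into the file. The name `MAX_BONES` and the value 20 here are placeholders:

```glsl
// The application prepends e.g. "#define MAX_BONES 20\n" to this
// source string at load time (per model), then compiles; the fallback
// below only applies if nothing was injected.
#ifndef MAX_BONES
#define MAX_BONES 20            // placeholder fallback
#endif

uniform mat4 trix[MAX_BONES];   // sized with a constant expression
attribute float bonenum;        // GLSL 1.10 has no integer attributes
varying vec3 normal, lightDir;

void main() {
    int i = int(bonenum);       // cast the float attribute back to an index
    lightDir = normalize(vec3(gl_LightSource[0].position));
    normal   = normalize(gl_NormalMatrix * gl_Normal);
    gl_Position = gl_ModelViewProjectionMatrix * trix[i] * gl_Vertex;
}
```

The matrices can then be uploaded in one call with `glUniformMatrix4fv(location, boneCount, GL_FALSE, data)`, where `boneCount` may be smaller than `MAX_BONES`. Dynamically indexing a uniform array in a vertex shader is legal here because its size is declared before use, per the §4.1.9 rule quoted earlier.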
