Promit

Bizarre geometry corruption due to no-op read in VS


I have this vertex shader:
#version 110

uniform mat4 theMatrix[13];

attribute vec3 a_vertex;
attribute vec2 a_index;
attribute vec2 a_weights;

void main(void)
{
	//theMatrix[int(a_index.x)];
	gl_Position = gl_ModelViewProjectionMatrix * vec4(a_vertex.xyz, 1.0);
}


Note the commented line, which should by all rights be a no-op. As written, this produces (flat color PS, wireframe):

[screenshot]

Now, I uncomment it:

[screenshot]

It's a little difficult to see, but basically all the polys have been rolled into a ball. A perfect sphere, in fact. All the polys seem to still be there, but they've all collapsed into this suspiciously convenient structure. What the hell?

System info:
Windows 7 64 bit
Radeon HD 4830, latest official ATI drivers as of today

[Edited by - Promit on February 24, 2010 2:55:09 PM]

Maybe, because without that line, a_index goes unused, so it doesn't get a binding, and that's confusing your host program?
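
That hunch can be checked directly: GLSL compilers may optimize away attributes that don't contribute to any output, and a location bound with glBindAttribLocation only matters for attributes that are still active after linking. A minimal sketch of querying what actually survived the link (assuming a current GL context, an extension loader such as GLEW already initialized, and a linked program handle `prog`):

```c
#include <stdio.h>
#include <GL/glew.h>  /* assumes GLEW provides the GL 2.0 entry points */

/* Print the post-link location of each attribute; -1 means the compiler
   considered it inactive and may have dropped it entirely. */
void dump_attrib_locations(GLuint prog)
{
    const char *names[] = { "a_vertex", "a_normal", "a_weights", "a_index" };
    for (int i = 0; i < 4; ++i) {
        GLint loc = glGetAttribLocation(prog, names[i]);
        printf("%-10s -> %d\n", names[i], loc);
    }
}
```

If a_index reports -1 with the read commented out but a valid slot with it in, that would confirm the attribute is being optimized away in the first case.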

I think that must be related somehow -- I've dropped the read and it still gets mucked up, but I'm still not sure exactly what's happening.

What's the code being used to match up vertex data streams? It may come down to data streams corresponding to inactive attributes being accidentally inserted, moving the other data streams out of sync. It really looks like some normalized vector attribute is being reinterpreted as a_vertex.

This is the binding code, and then the code that sets up the VBOs:

glBindAttribLocation(glslContext, 0, "a_vertex");
glBindAttribLocation(glslContext, 1, "a_normal");
glBindAttribLocation(glslContext, 2, "a_weights");
glBindAttribLocation(glslContext, 3, "a_index");
glLinkProgram(glslContext);

glBindBuffer(GL_ARRAY_BUFFER, VBO_vertex[index]);
glVertexAttribPointer((GLuint)0, 3, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, VBO_normal[index]);
glVertexAttribPointer((GLuint)1, 3, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, VBO_matWeights[index]);
glVertexAttribPointer((GLuint)2, 2, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, VBO_matIndices[index]);
glVertexAttribPointer((GLuint)3, 2, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, VBO_index[index]);


Quote:
It really looks like some normalized vector attribute is being reinterpreted as a_vertex.
Good point. Considering it's a perfect sphere, it looks like the normals are being read as positions. I must be misunderstanding something about how this whole binding thing works.

Found it, I think -- the binding was happening on a bad program object. So I guess the data was just getting over to the shader by pure chance.
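
That explanation fits: glBindAttribLocation calls only take effect at the next glLinkProgram on that same program object, so binding against the wrong handle leaves the driver free to assign locations however it likes. A sketch of the order that avoids this class of bug, assuming compiled shader objects `vs` and `fs` (the helper name `build_program` is just for illustration):

```c
#include <GL/glew.h>  /* assumes GLEW provides the GL 2.0 entry points */

GLuint build_program(GLuint vs, GLuint fs)
{
    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);

    /* Bind before linking, on the same program object that gets linked. */
    glBindAttribLocation(prog, 0, "a_vertex");
    glBindAttribLocation(prog, 1, "a_normal");
    glBindAttribLocation(prog, 2, "a_weights");
    glBindAttribLocation(prog, 3, "a_index");

    glLinkProgram(prog);

    GLint ok = GL_FALSE;
    glGetProgramiv(prog, GL_LINK_STATUS, &ok);
    if (ok != GL_TRUE) {
        /* Link failed; real code would fetch the info log here. */
        glDeleteProgram(prog);
        return 0;
    }
    return prog;
}
```

Binding on a stale or unlinked handle fails silently, which is why the geometry only broke once the optimizer's implicit location assignment happened to change.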

