Bizarre geometry corruption due to no-op read in VS

I have this vertex shader:
#version 110

uniform mat4 theMatrix[13];

attribute vec3 a_vertex;
attribute vec2 a_index;
attribute vec2 a_weights;

void main(void)
{
	//theMatrix[int(a_index.x)];
	gl_Position = gl_ModelViewProjectionMatrix * vec4(a_vertex.xyz, 1.0);
}


Note the commented line, which should by all rights be a no-op. As written, it renders as expected (flat color PS, wireframe). Now, I uncomment it: it's a little difficult to see in the screenshot, but basically all his polys have been rolled into a ball. A perfect sphere, in fact. All the polys seem to still be there, but they've all collapsed into this suspiciously convenient structure. What the hell?

System info: Windows 7 64-bit, Radeon HD 4830, latest official ATI drivers as of today.

Maybe without that line a_index goes unused, so it doesn't get a binding, and that's what's confusing your host program?
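
Easy to check, by the way: query the locations after linking and see whether the bindings actually stuck. Rough sketch only (assumes a current GL context, the usual headers, and that program is your linked program object; attributes the compiler optimized away are inactive and report -1):

void dumpAttribLocations(GLuint program)
{
    // Print where each attribute from the shader actually ended up after linking.
    const char *names[] = { "a_vertex", "a_index", "a_weights" };
    for (int i = 0; i < 3; ++i)
    {
        GLint loc = glGetAttribLocation(program, names[i]);
        printf("%s -> %d\n", names[i], loc);
    }
}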

What's the code being used to match up vertex data streams? It may come down to data streams corresponding to inactive attributes being accidentally inserted, moving the other data streams out of sync. It really looks like some normalized vector attribute is being reinterpreted as a_vertex.
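
If you want to see exactly which attributes survived linking and where the driver actually put them, you could enumerate the active attributes. Sketch only, assuming program is your linked program object:

void listActiveAttribs(GLuint program)
{
    GLint count = 0;
    glGetProgramiv(program, GL_ACTIVE_ATTRIBUTES, &count);
    for (GLint i = 0; i < count; ++i)
    {
        char name[256];
        GLint size = 0;
        GLenum type = 0;
        // Name, size and type of the i-th active attribute...
        glGetActiveAttrib(program, (GLuint)i, sizeof(name), NULL, &size, &type, name);
        // ...and the location the driver actually assigned to it.
        printf("attribute %s is at location %d\n", name,
               glGetAttribLocation(program, name));
    }
}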

This is the binding code, and then the code that sets up the VBOs:

// Attribute locations are bound before the program is linked:
glBindAttribLocation(glslContext, 0, "a_vertex");
glBindAttribLocation(glslContext, 1, "a_normal");
glBindAttribLocation(glslContext, 2, "a_weights");
glBindAttribLocation(glslContext, 3, "a_index");
glLinkProgram(glslContext);

// Per-mesh stream setup, using the hard-coded indices 0-3 from above:
glBindBuffer(GL_ARRAY_BUFFER, VBO_vertex[index]);
glVertexAttribPointer((GLuint)0, 3, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, VBO_normal[index]);
glVertexAttribPointer((GLuint)1, 3, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, VBO_matWeights[index]);
glVertexAttribPointer((GLuint)2, 2, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ARRAY_BUFFER, VBO_matIndices[index]);
glVertexAttribPointer((GLuint)3, 2, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, VBO_index[index]);


Quote:
It really looks like some normalized vector attribute is being reinterpreted as a_vertex.
Good point. Considering it's a perfect sphere, it looks like the normals are being read as positions. I must be misunderstanding something about how this whole binding thing works.
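
For what it's worth, this is the direction I'm going to try: query each attribute's location after linking and use whatever comes back, instead of trusting the hard-coded 0-3. Untested sketch; bindStream is just a throwaway helper, and the VBO names are from my code above:

static void bindStream(GLuint program, const char *name, GLuint vbo, GLint components)
{
    GLint loc = glGetAttribLocation(program, name);
    if (loc < 0)
        return; // attribute was optimized out (inactive), nothing to bind

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glEnableVertexAttribArray((GLuint)loc);
    glVertexAttribPointer((GLuint)loc, components, GL_FLOAT, GL_FALSE, 0, 0);
}

// Per-mesh setup, mirroring the original streams:
bindStream(glslContext, "a_vertex",  VBO_vertex[index],     3);
bindStream(glslContext, "a_normal",  VBO_normal[index],     3);
bindStream(glslContext, "a_weights", VBO_matWeights[index], 2);
bindStream(glslContext, "a_index",   VBO_matIndices[index], 2);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, VBO_index[index]);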
