Promit

Bizarre geometry corruption due to no-op read in VS


I have this vertex shader:
#version 110

uniform mat4 theMatrix[13];

attribute vec3 a_vertex;
attribute vec2 a_index;
attribute vec2 a_weights;

void main(void)
{
	//theMatrix[int(a_index.x)];
	gl_Position = gl_ModelViewProjectionMatrix * vec4(a_vertex.xyz, 1.0);
}


Note the commented line, which should by all rights be a no-op. As written, this renders correctly (flat color PS, wireframe). Now, I uncomment it: it's a little difficult to see, but basically all the polys have been rolled into a ball. A perfect sphere, in fact. All the polys seem to still be there, but they've all collapsed into this suspiciously convenient structure. What the hell?

System info: Windows 7 64-bit, Radeon HD 4830, latest official ATI drivers as of today.

Maybe, without that line, a_index goes unused, so it doesn't get a binding, and that's confusing your host program?
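
If that's it, it should be easy to confirm: GLSL implementations may optimize away attributes that don't contribute to the shader's output, and glGetAttribLocation returns -1 for attributes that aren't active in the linked program. A minimal sketch of the check, assuming a current GL context, standard C headers, and the glslContext program object from this thread:

GLint loc = glGetAttribLocation(glslContext, "a_index");
if (loc == -1)
{
    /* a_index was optimized out as inactive, so the location bound
       for it does not exist in the linked program. */
    fprintf(stderr, "a_index is inactive in the linked program\n");
}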

I think that must be related somehow. I've dropped the read and it still gets mucked up, but I'm still not sure exactly what's happening.

What's the code that matches up the vertex data streams? It may come down to data streams for inactive attributes being bound anyway, pushing the other streams out of sync. It really looks like some normalized vector attribute is being reinterpreted as a_vertex.
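
One way to test that theory is to enumerate what the linker actually kept. A minimal sketch, assuming standard C headers and the glslContext program object from this thread:

/* List every attribute that survived linking, with its final location. */
GLint count = 0;
glGetProgramiv(glslContext, GL_ACTIVE_ATTRIBUTES, &count);
for (GLint i = 0; i < count; ++i)
{
    char name[64];
    GLint size = 0;
    GLenum type = 0;
    glGetActiveAttrib(glslContext, (GLuint)i, sizeof(name), NULL, &size, &type, name);
    printf("%s -> location %d\n", name, glGetAttribLocation(glslContext, name));
}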

This is the binding code, and then the code that sets up the VBOs:

// Bind attribute names to fixed locations; applied at link time.
glBindAttribLocation(glslContext, 0, "a_vertex");
glBindAttribLocation(glslContext, 1, "a_normal");
glBindAttribLocation(glslContext, 2, "a_weights");
glBindAttribLocation(glslContext, 3, "a_index");
glLinkProgram(glslContext);

// Positions: 3 floats per vertex at location 0.
glBindBuffer(GL_ARRAY_BUFFER, VBO_vertex[index]);
glVertexAttribPointer((GLuint)0, 3, GL_FLOAT, GL_FALSE, 0, 0);

// Normals: 3 floats per vertex at location 1.
glBindBuffer(GL_ARRAY_BUFFER, VBO_normal[index]);
glVertexAttribPointer((GLuint)1, 3, GL_FLOAT, GL_FALSE, 0, 0);

// Matrix weights: 2 floats per vertex at location 2.
glBindBuffer(GL_ARRAY_BUFFER, VBO_matWeights[index]);
glVertexAttribPointer((GLuint)2, 2, GL_FLOAT, GL_FALSE, 0, 0);

// Matrix indices: 2 floats per vertex at location 3.
glBindBuffer(GL_ARRAY_BUFFER, VBO_matIndices[index]);
glVertexAttribPointer((GLuint)3, 2, GL_FLOAT, GL_FALSE, 0, 0);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, VBO_index[index]);

Quote:
It really looks like some normalized vector attribute is being reinterpreted as a_vertex.
Good point. Considering it's a perfect sphere, it looks like the normals are being read as positions. I must be misunderstanding something about how this whole binding thing works.
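
Worth noting (an editorial aside, not from the thread): glBindAttribLocation only takes effect at the next glLinkProgram call, and only on the program object it was called with, so it's worth verifying where each attribute actually landed after linking. A quick sketch using the names from this thread, assuming standard C headers:

/* Query the linked program for the locations it actually assigned. */
printf("a_vertex  -> %d\n", glGetAttribLocation(glslContext, "a_vertex"));
printf("a_normal  -> %d\n", glGetAttribLocation(glslContext, "a_normal"));
printf("a_weights -> %d\n", glGetAttribLocation(glslContext, "a_weights"));
printf("a_index   -> %d\n", glGetAttribLocation(glslContext, "a_index"));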

Found it, I think: the binding was happening on a bad program object. So I guess the data was just getting over to the shader by pure chance.
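
For anyone who hits the same symptom, a bad or unlinked program object can be caught early. A minimal sketch, assuming standard C headers; the error messages are illustrative:

if (!glIsProgram(glslContext))
{
    /* The handle doesn't name a program object at all. */
    fprintf(stderr, "glslContext is not a valid program object\n");
}

GLint linked = GL_FALSE;
glGetProgramiv(glslContext, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE)
{
    char log[1024];
    glGetProgramInfoLog(glslContext, sizeof(log), NULL, log);
    fprintf(stderr, "program link failed: %s\n", log);
}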
