OpenGL suddenly won't draw buffers?

Hello, I have been trying to solve this problem for the past two days and I really don't understand what's going on.

I have a Model class, which loads a list of vertices including their positions, texture coordinates, and normals (the normals are not used yet). The constructor for the class is as follows:


Model::Model(const Model::Vertex *vertices, const int count) : vertexCount(count)
{
	if (count == 0)
	{
		ApocFail("Empty model specified!");
	};

	if ((count % 3) != 0)
	{
		ApocFail("The specified model is not triangulated!");
	};

	GLint attrVertex, attrTexCoords, attrNormal;
	apocRenderHandler->getAttrLocations(attrVertex, attrTexCoords, attrNormal);

	glGenBuffers(1, &vbo);
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	glBufferData(GL_ARRAY_BUFFER, sizeof(Model::Vertex)*count, vertices, GL_DYNAMIC_DRAW);

	glGenVertexArrays(1, &vao);
	glBindVertexArray(vao);

	glEnableVertexAttribArray(attrVertex);
	glEnableVertexAttribArray(attrTexCoords);
	//glEnableVertexAttribArray(attrNormal);

	glVertexAttribPointer(attrVertex, 4, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*) offsetof(Model::Vertex, pos));
	glVertexAttribPointer(attrTexCoords, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*) offsetof(Model::Vertex, texCoords));
	//glVertexAttribPointer(attrNormal, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*) offsetof(Model::Vertex, normal));

	glBindBuffer(GL_ARRAY_BUFFER, 0);
	glBindVertexArray(0);
	cout << "GL error: " << glGetError() << endl;
};
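(The attribute offsets above assume Model::Vertex is laid out roughly like this; the actual struct isn't shown in this post, so take it as a guess matching the component counts passed to glVertexAttribPointer:)

struct Vertex
{
	float pos[4];       // -> vec4 inVertex
	float texCoords[2]; // -> vec2 inTexCoords
	float normal[3];    // -> vec3 inNormal
};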

It also has a draw() function, which looks like this:


void Model::draw()
{
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	glBindVertexArray(vao);
	glDrawArrays(GL_TRIANGLES, 0, vertexCount);
	glBindBuffer(GL_ARRAY_BUFFER, 0);
	glBindVertexArray(0);
};

When the class is constructed, "GL error" is GL_NO_ERROR, but if I uncomment the lines that load normals, it becomes GL_INVALID_VALUE (I don't know if that should happen). Drawing also generates no error. However, the model (which is a cube) does not appear on the screen. It worked before, but stopped after I tried implementing an entity system, and I'm not sure why.

draw() is called after a glClear() and a texture binding, and before a glFlush() followed by a window swap. The context is created with SDL, and extensions are loaded with GLEW.
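For context, the whole frame looks roughly like this (a sketch; program, texture, and window are placeholder names for my actual handles, and the swap is SDL2's SDL_GL_SwapWindow):

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glUseProgram(program);                 // the shader pair below
glBindTexture(GL_TEXTURE_2D, texture); // the cube's texture
model->draw();
glFlush();
SDL_GL_SwapWindow(window);             // SDL2 buffer swap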

The vertex shader:


#version 150
uniform mat4 uModelMatrix;
uniform mat4 uViewMatrix;
uniform mat4 uProjectionMatrix;

in vec4 inVertex;
in vec2 inTexCoords;
in vec3 inNormal;

out vec2 passTexCoords;
out vec3 passNormal;

void main()
{
	passTexCoords = inTexCoords;
	passNormal = inNormal;
	gl_Position = uProjectionMatrix * uViewMatrix * uModelMatrix * inVertex;
};

The fragment shader:


#version 150
uniform sampler2D uSampler;

in vec2 passTexCoords;
in vec3 passNormal;

out vec4 outColor;

void main()
{
	outColor = texture(uSampler, passTexCoords);
	//outColor = vec4(1.0, 0.0, 0.0, 1.0);
};

I am completely confused. All help appreciated :)

I think in your model constructor, you should bind your VBO after you have bound your VAO. glVertexAttribPointer records the buffer currently bound to GL_ARRAY_BUFFER into the currently bound VAO, so the usual pattern is to create and bind the VAO first, then bind the VBO and set up the attribute pointers.

glGenVertexArrays(1, &vao);
glBindVertexArray(vao);

glGenBuffers(1, &vbo);
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glBufferData(GL_ARRAY_BUFFER, sizeof(Model::Vertex)*count, vertices, GL_DYNAMIC_DRAW);

glEnableVertexAttribArray(attrVertex);
glEnableVertexAttribArray(attrTexCoords);

Something like that should work.

As an extra: once the attribute pointers are set, there's no need to unbind the VBO afterwards, or to rebind it when you bind the VAO again for drawing.
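Once the VAO holds the attribute state, draw() can also shrink to just the VAO bind (the same function, minus the redundant buffer binds):

void Model::draw()
{
	glBindVertexArray(vao);
	glDrawArrays(GL_TRIANGLES, 0, vertexCount);
	glBindVertexArray(0);
}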

I have changed it now and it still does not work. I'm sure the matrices are correct because it worked before.

My guess would be that the shader compiler is optimizing out the 'inNormal' attribute, so its location comes back as -1, and passing -1 to glEnableVertexAttribArray raises GL_INVALID_VALUE.
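If that's what's happening, a guard against -1 would avoid the error (a sketch, reusing the attrNormal variable from your constructor):

if (attrNormal != -1)
{
	glEnableVertexAttribArray(attrNormal);
	glVertexAttribPointer(attrNormal, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (void*) offsetof(Model::Vertex, normal));
}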

My guess would be that the shader compiler is optimizing out the 'inNormal' attribute, so its location comes back as -1, and passing -1 to glEnableVertexAttribArray raises GL_INVALID_VALUE.

I think so too, but that shouldn't stop the model from being rendered.
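One way to confirm the guess (a sketch; program stands in for whatever shader handle getAttrLocations queries internally):

const char *names[] = { "inVertex", "inTexCoords", "inNormal" };
for (const char *name : names)
{
	GLint loc = glGetAttribLocation(program, name);
	cout << name << " -> " << loc << endl; // -1 means the compiler dropped it
}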
