glDrawElements isn't drawing anything

Started by
7 comments, last by kennycastro007 9 years, 8 months ago

Whenever I try to draw a mesh using glDrawElements, my program gets no errors, but it doesn't draw anything. I'm pretty sure that the issue is with glDrawElements since if I comment glDrawElements out and replace it with glDrawArrays, it works perfectly. So, my question is, can anyone help me figure out why this is happening?


You can't just replace one with the other; they work differently. glDrawArrays only needs a vertex buffer (or more than one), while glDrawElements also needs an index buffer.

https://www.khronos.org/opengles/sdk/docs/man/xhtml/glDrawElements.xml

I'm aware that the same parameters can't be passed to both functions. By "replace" I mean that I commented out glDrawArrays and called glDrawElements instead:

glDrawArrays(GL_TRIANGLES, 0, 6);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, m_elements);
m_elements is just a GLuint pointer to the indices array. Also, thanks for taking the time to help me with this little "bug" of mine.
Also, glGetError returns 1280 on the very first frame, but 0 on every frame after that. It gets returned whether I call glDrawArrays or glDrawElements, and whether or not anything gets drawn to the screen.

1280 is GL_INVALID_ENUM.

"An unacceptable value is specified for an enumerated argument. The offending command is ignored and has no other side effect than to set the error flag."

https://www.opengl.org/sdk/docs/man/docbook4/xhtml/glGetError.xml

Debug which function actually emits this. By "the first frame" you mean it happens after the buffer initializations, right?

I'm not sure, but depending on how you use OpenGL, the last parameter of glDrawElements can be either a client-side pointer to the index data or a byte offset into a bound index buffer.

https://stackoverflow.com/questions/9431923/using-an-offset-with-vbos-in-opengl
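As a hedged sketch of that distinction (this needs a current GL context to actually run, and the buffer names `ebo` and `m_elements` here are illustrative), the two call forms look like this:

```cpp
// Form 1 - client-side array: no GL_ELEMENT_ARRAY_BUFFER is bound, so
// the last argument is a real pointer to index data in system memory.
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, m_elements);

// Form 2 - index buffer (EBO): the last argument is interpreted as a
// byte offset into the bound buffer, usually 0, cast to a pointer.
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, (void*)0);
```

Note that in a core profile with a VAO bound, only the second form is valid; mixing them up (offset passed with no EBO bound, or a real pointer passed while an EBO is bound) is a classic cause of "nothing draws".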

In any case, ...use Google a lot. It's faster than asking on a forum. If you still can't find the solution, consider showing the code.

Thanks for the information about the last parameter being either a buffer or a buffer offset, but even with that, the error still occurs.

void Mesh::Draw()
{
    glBindVertexArray(vao);
    //glDrawArrays(GL_TRIANGLES, 0, 6);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ebo);
    glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_SHORT, NULL);
    GLenum error = glGetError();
    std::cout << error << std::endl;
    glBindVertexArray(0);
}
and that gets called in the main loop, and I've already isolated the problem to glDrawElements, or at least everything works fine until it's called. Also, by "the first frame" I mean the first time this function gets called in the main loop. Since the error code is printed to the console, I could see that the 1280 error only came up on the first call of this function.

glDrawElements errors:

-GL_INVALID_ENUM is generated if mode is not an accepted value.
-GL_INVALID_ENUM is generated if type is not GL_UNSIGNED_BYTE or GL_UNSIGNED_SHORT.

....so your glDrawElements call looks OK. "Isolated it to be the problem" and "everything works fine" are rather vague statements. Fix your GL error first, and then see if the screen is still empty. In case you don't know, most GL functions can generate errors, and an error flag stays set until the next call to glGetError(). So if you only check for errors in your rendering function but not in your init function, an error generated during init stays set until the first call to glGetError; after that call it is cleared, and you don't see the error in the following frames.

Not that I mind it, but why did you downvote my first post?

Alright, thanks for telling me about errors staying set until glGetError is called; that helped me isolate the cause of the 1280 error, but it turns out fixing it still doesn't solve my problem. After fixing that error there are no more errors, at least none reported by glGetError, but the program still doesn't display anything on the screen. Also, I actually didn't vote your post down, I voted it up; when I checked, it had already been voted down, so I voted it back up :P

You're using GL_UNSIGNED_INT with m_elements, but it's common to use unsigned shorts (16-bit indices). Are you sure you didn't mean GL_UNSIGNED_SHORT?

I changed that code soon after I posted this; the updated code is already posted in the thread.

This topic is closed to new replies.
