Can't get glGenVertexArrays to work.

9 comments, last by sobeit 11 years ago

Hi all,

I've been learning GLSL recently and am trying to use a vertex array object, but I just can't get things right.

I use GLEW for OpenGL setup, and a vertex array object to bind vertex attributes. The compiler doesn't show any complaints, but the output doesn't show anything either. I tried glsldevil to debug my shader program, and it turns out that every time the program executes glGenVertexArrays(1, &m_vao), an error window pops up saying the program xxxx.exe has stopped working. So I assume I did something wrong with glGenVertexArrays.

Below is the relevant snippet of my code. How can I fix this? Thanks in advance.


    glGenBuffers(1, &m_vertBO);
    glGenBuffers(1, &m_indexBO);

    glGenVertexArrays(1, &m_vao);

    //vertex array=============================
    glBindVertexArray(m_vao);

    //pass in vertex data
    glBindBuffer(GL_ARRAY_BUFFER, m_vertBO);
    glBufferData(GL_ARRAY_BUFFER, sizeofvertdata, tempVertData, GL_STATIC_DRAW);

    //pass in index data
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, m_indexBO);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeofindexdata, tempIndexData, GL_STATIC_DRAW);

    //vertex data
    glEnableVertexAttribArray(vertLoc);
    glVertexAttribPointer(vertLoc, 3, GL_FLOAT, GL_FALSE, 0, 0);
    //texture data
    if (objmodel.hasTexCoord())
    {
        glEnableVertexAttribArray(texCoordLoc);
        glVertexAttribPointer(texCoordLoc, 2, GL_FLOAT, GL_FALSE, 0, (void *)(objmodel.texCoordOffset()));
    }
    //normal data
    if (objmodel.hasNormal())
    {
        glEnableVertexAttribArray(normLoc);
        glVertexAttribPointer(normLoc, 3, GL_FLOAT, GL_FALSE, 0, (void *)(objmodel.normalOffset()));
    }

    //unbind vertex array object
    glBindVertexArray(0);

    //unbind vertex and index buffer object
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);
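For reference, the VAO built above would be used at draw time roughly like this (a sketch; indexCount is a placeholder name for the number of indices uploaded):

    //sketch of the matching draw call; indexCount is a hypothetical
    //variable holding how many indices tempIndexData contained
    glBindVertexArray(m_vao);
    glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_INT, 0);
    glBindVertexArray(0);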
Is it possible your OpenGL does not support glGenVertexArrays? Is (glGenVertexArrays != nullptr) true? What kind of OpenGL context are you creating?

What does your debugger tell you?
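Both of the first two checks can be done in code just before the failing call; a sketch (assumes GL/glew.h and <iostream> are included, and a 3.2+ context for the profile query):

    //sketch: with GLEW, glGenVertexArrays is a function pointer filled in
    //by glewInit(), so a null test tells you whether it was loaded at all
    if (glGenVertexArrays == nullptr)
        std::cerr << "glGenVertexArrays entry point is null" << std::endl;

    //sketch: report what kind of context this actually is
    GLint major = 0, minor = 0, profile = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &profile);
    std::cout << "GL " << major << "." << minor
              << ((profile & GL_CONTEXT_CORE_PROFILE_BIT) ? " core" : " compatibility")
              << std::endl;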


What does your debugger tell you?

Do you mean glsldevil or the compiler (I'm using Visual Studio)? Visual Studio didn't show any errors, but when I used glsldevil to execute the OpenGL commands step by step, the program crashed as soon as glGenVertexArrays was executed.

Is it possible your OpenGL does not support glGenVertexArrays? Is (glGenVertexArrays != nullptr) true? What kind of OpenGL context are you creating?

I don't think so. I just checked (glGenVertexArrays != nullptr); it's true.

I don't know what you mean by "what kind of OpenGL context". My OpenGL version is 4.2.0, and I'm working on Windows with Visual Studio.

What are you using to load extensions?


What are you using to load extensions?

GLEW:


    GLenum glewErr = glewInit();
    if (GLEW_OK != glewErr)
    {
        std::cerr << "Failed to initialize GLEW." << std::endl;
        std::cerr << glewGetErrorString(glewErr) << std::endl;
    }
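One thing worth checking here: with a core-profile context, older GLEW versions need glewExperimental set before glewInit(), otherwise entry points such as glGenVertexArrays can come back null even when the extension is supported. A minimal sketch of that variant:

    //sketch: for a core-profile context, set this before glewInit();
    //GLEW's extension detection relies on glGetString(GL_EXTENSIONS),
    //which core profiles removed, so without this flag some function
    //pointers may be left null
    glewExperimental = GL_TRUE;
    GLenum glewErr = glewInit();
    if (GLEW_OK != glewErr)
    {
        std::cerr << "Failed to initialize GLEW." << std::endl;
        std::cerr << glewGetErrorString(glewErr) << std::endl;
    }
    //glewInit() can leave a stray GL_INVALID_ENUM on core profiles;
    //clear it so later glGetError() checks stay meaningful
    glGetError();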

OK, first question, and apologies in advance for this because it is the most common mistake people make with GLEW: can you confirm that you're calling glewInit() after creating your GL context? I don't just mean the temporary context you may be creating in order to access wglCreateContextAttribsARB; I mean the final, real context that you've created and made current.

Second question is: can you test "if (GLEW_ARB_vertex_array_object)" and see what it gives you?

Third question, and it's a long shot, but we've seen it recently and it comes to mind: you're not by any chance on a laptop with NVIDIA Optimus or AMD switchable graphics? If so, you may be falling back to running GL on the Intel GPU. Can you confirm what you get for glGetString(GL_VENDOR), glGetString(GL_RENDERER) and glGetString(GL_VERSION)?
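For the second and third questions, the checks could look roughly like this (a sketch; run it after glewInit() on the real context):

    //sketch: confirm VAO support and which GPU/driver the context is on
    if (GLEW_ARB_vertex_array_object)
        std::cout << "ARB_vertex_array_object is available" << std::endl;
    std::cout << "GL_VENDOR:   " << glGetString(GL_VENDOR)   << std::endl;
    std::cout << "GL_RENDERER: " << glGetString(GL_RENDERER) << std::endl;
    std::cout << "GL_VERSION:  " << glGetString(GL_VERSION)  << std::endl;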


OK, first question, and apologies in advance for this because it is the most common mistake people make with GLEW: can you confirm that you're calling glewInit() after creating your GL context? I don't just mean the temporary context you may be creating in order to access wglCreateContextAttribsARB; I mean the final, real context that you've created and made current.

Second question is: can you test "if (GLEW_ARB_vertex_array_object)" and see what it gives you?

Third question, and it's a long shot, but we've seen it recently and it comes to mind: you're not by any chance on a laptop with NVIDIA Optimus or AMD switchable graphics? If so, you may be falling back to running GL on the Intel GPU. Can you confirm what you get for glGetString(GL_VENDOR), glGetString(GL_RENDERER) and glGetString(GL_VERSION)?

Thank you for helping here.

Question 1: yes, I do call glewInit() after creating the GL context.

Question 2: I called glewIsSupported("GL_ARB_vertex_array_object") and it returns true. GLEW_ARB_vertex_array_object is also true.

Question 3: yes, my laptop has two GPUs, NVIDIA and Intel. glGetString(GL_VENDOR) gives "NVIDIA Corporation", glGetString(GL_RENDERER) gives "GeForce GT 635M/PCIe/SSE2", and glGetString(GL_VERSION) gives "4.2.0".

Do you test these just after creating your context, or just before calling glGenVertexArrays? If the former, I'd suggest moving the checks to the latter and re-checking; I suspect that your program is switching back to the Intel GPU at some point in its execution.
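Concretely, something like this immediately before the failing call would show whether the context has moved (a sketch):

    //sketch: re-check the renderer right at the crash site; on
    //Optimus/switchable setups the answer can differ from the one
    //you got at context-creation time
    std::cout << "Renderer at crash site: " << glGetString(GL_RENDERER) << std::endl;
    if (glGenVertexArrays != nullptr)
        glGenVertexArrays(1, &m_vao);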


This topic is closed to new replies.
