I am working on an OpenGL project and I have been running into access violation errors. I managed to isolate the error to the code below, but I do not understand why it errors. Any help would be appreciated.
BTW: I know that glGenBuffers and glDeleteBuffers are used in a dumb way here. The code below just replicates the error.
#include <GL/glew.h>
#include <GL/glut.h>
void draw()
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glEnableClientState(GL_VERTEX_ARRAY);
    float v[] = {
        0, 0, 0,
        1, 1, 1,
        2, 2, 2,
    };
    unsigned char i[] = {
        0, 1, 2
    };
    unsigned int a;
    unsigned int b;
    glGenBuffers(1, &a);
    glGenBuffers(1, &b);
    glBindBuffer(GL_ARRAY_BUFFER, a);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, b);
    glBufferData(GL_ARRAY_BUFFER, 36, v, GL_STATIC_DRAW);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, 3, i, GL_STATIC_DRAW);
    glVertexPointer(3, GL_FLOAT, 0, 0);
    glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_BYTE, 0);
    glDeleteBuffers(1, &a);
    glDeleteBuffers(1, &b);
    glBegin(GL_LINES);
    glVertex3f(0, 0, 0);
    glVertex3f(0, 0, 0);
    glEnd();
    glutSwapBuffers();
}
int window;
int width,height;
void changeSize(int w, int h)
{
    if (h == 0)
        h = 1;
    double ratio = 1.0 * w / h;
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glViewport(0, 0, w, h);
    gluPerspective(45, ratio, .01f, 1000);
    glMatrixMode(GL_MODELVIEW);
    width = w;
    height = h;
}
void initWindow()
{
    glEnable(GL_CULL_FACE);
    glDepthFunc(GL_LEQUAL);
    glEnable(GL_DEPTH_TEST);
    glClearColor(0.1f, 0.1f, 0.1f, 1);
    glEnableClientState(GL_VERTEX_ARRAY);
}
int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DEPTH | GLUT_DOUBLE | GLUT_RGBA);
    glutInitWindowSize(800, 600);
    window = glutCreateWindow("OpenGl");
    glewInit();
    initWindow();
    glutDisplayFunc(draw);
    glutReshapeFunc(changeSize);
    glutMainLoop();
    return 0;
}
I only had time for a quick glance, but it looks like you're assuming floats are 4 bytes and unsigned chars are 1 byte. Seems reasonable, but are you sure?
Also, the order in which you're calling glBindBuffer and glBufferData makes me uncomfortable. Maybe try:
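something along these lines, reusing your a and b handles:

```c
/* bind each buffer right before filling it */
glGenBuffers(1, &a);
glBindBuffer(GL_ARRAY_BUFFER, a);
glBufferData(GL_ARRAY_BUFFER, 36, v, GL_STATIC_DRAW);

glGenBuffers(1, &b);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, b);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, 3, i, GL_STATIC_DRAW);
```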
because I think you're supposed to bind a given buffer immediately before calling glBufferData on it. I don't know for sure, since one is an ELEMENT_ARRAY buffer and the other a plain ARRAY buffer, but it might be worth a shot.
I also think it's weird that you're creating the buffers, filling them, rendering from them, and then deleting them all in one function. I assume this is test code to get things working, but that kind of defeats the point of having VBOs. You should have one function, initVBOs(), that puts the data in the buffers and then leaves them there; your draw() function should just bind the buffers, set up the attribute pointers, and call glDrawElements. I think it would make debugging a bit easier.
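Roughly like this (just a sketch; the globals and the initVBOs name are my own suggestion):

```c
/* handles and data kept around between frames */
GLuint vbo, ibo;
float verts[] = {0, 0, 0, 1, 1, 1, 2, 2, 2};
unsigned char idx[] = {0, 1, 2};

void initVBOs(void)  /* call once, e.g. right after initWindow() */
{
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);

    glGenBuffers(1, &ibo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(idx), idx, GL_STATIC_DRAW);
}

void draw(void)  /* per-frame: bind, point, draw -- no create/delete */
{
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glLoadIdentity();
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glVertexPointer(3, GL_FLOAT, 0, 0);
    glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_BYTE, 0);
    glutSwapBuffers();
}
```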
> I only had time for a quick glance, but it looks like you're assuming floats are 4 bytes and unsigned chars are 1 byte. Seems reasonable, but are you sure?

Yes, I checked.
> Also, the order in which you're calling glBindBuffer and glBufferData makes me uncomfortable. Maybe try binding each buffer immediately before calling glBufferData on it.

Didn't work :[
> I also think it's weird that you're creating the buffers, filling them, rendering from them, and then deleting them all in one function. You should have one function, initVBOs(), that puts the data in the buffers and then leaves them there; your draw() function should just bind the buffers, set up the attribute pointers, and call glDrawElements.

Yeah, I'm doing that in the project code. But the error seems to be heap corruption somewhere in the code above.
The problem also goes away if I use shorts instead of chars for indices...?
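i.e. this version runs cleanly here, with only the index type and the byte count changed:

```c
unsigned short i[] = {
    0, 1, 2
};
/* 3 indices * 2 bytes each = 6 bytes now */
glBufferData(GL_ELEMENT_ARRAY_BUFFER, 6, i, GL_STATIC_DRAW);
glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_SHORT, 0);
```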
Tested the code here on Linux and it seems to run just fine.