glDrawElements
glDrawElements gives me an exception. All of the parameters point to accessible memory. I have no idea what to do or where to start.
Have you enabled any vertex attributes for which you have not set a pointer (glVertexAttribPointer), or maybe left one pointing to stale data from a previous draw call?
I have 64-byte vertex data (11 floats + 5 floats free):
0x071FB268 0.11587500 -0.54674762 3.4832630
0x071FB274 0.00000000 -0.50959808 0.86037791
0x071FB280 -0.070648283 -0.85824698 -0.50835121
0x071FB28C 0.86584002 -0.87152898 -4.3160208e+008
0x071FB298 -4.3160208e+008 -4.3160208e+008 -4.3160208e+008
0x071FB2A4 -4.3160208e+008
end of vertex data [810]
(address + 809*64):
0x07207CA8 0.19864389 0.00000000 -0.46415800
0x07207CB4 0.35193941 0.00000000 -0.93600267
0x07207CC0 0.37401798 0.89587212 0.23984091
0x07207CCC 0.73016000 -0.69472802 -4.3160208e+008
0x07207CD8 -4.3160208e+008 -4.3160208e+008 -4.3160208e+008
0x07207CE4 -4.3160208e+008
glGenBuffers(1,&BufferID);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER,BufferID);
glBufferData(GL_ELEMENT_ARRAY_BUFFER,sizeof(VertexData)*data_count,data,usage);
delete[] data;
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
glVertexPointer(3,GL_FLOAT,sizeof(VertexData),(void*)(sizeof(float)*0));
glNormalPointer(GL_FLOAT,sizeof(VertexData),(void*)(sizeof(float)*3));
glVertexAttribPointer(1,3,GL_FLOAT,0,sizeof(VertexData),(void*)(sizeof(float)*6));
glClientActiveTexture(GL_TEXTURE0);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer(2,GL_FLOAT,sizeof(VertexData),(void*)(sizeof(float)*9));
glEnableVertexAttribArray(1);
Then I draw it like this:
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, BufferID);
glDrawElements(GL_TRIANGLES,Index_Count,GL_UNSIGNED_SHORT,Indices);
Indices (in memory) :
0x00686FE8 0 1 2 0 2 3 4
0x00686FF6 5 1 1 0 4 6 0
...
0x0068894C 713 805 800 807 808 800 808
0x0068895A 801 791 789 807 791 807 809
None of indices is greater than 809.
Are you using GL_ARRAY_BUFFER for the vertex data and GL_ELEMENT_ARRAY_BUFFER for the index data? It looks to me like you're using GL_ELEMENT_ARRAY_BUFFER for both.
Edit: Also:
1) Why are you setting the data for glVertexAttribPointer() before enabling that attribute?
2) Are you binding an index buffer and then sending a pointer to the indices in your glDrawElements() call? That will probably cause a crash, because when you bind an IBO, any address you put in glDrawElements() is interpreted as an offset into the IBO, not into client memory.
3) Why are you using GL_UNSIGNED_SHORT? Are there only 256 vertices in the geometry you're trying to render in this call?
I'm pretty new to VBOs. I have a GL buffer with only the vertex data (GL_ELEMENT_ARRAY_BUFFER), and then I have "unsigned int *Indices". Could you please point me to a good tutorial?
P.S. I'm pretty sure that unsigned short is 0-65535
edit: I've found your tutorial ;)
But some things I don't really get... When I'm drawing:
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, IBO_ID);
glDrawElements(GL_TRIANGLES,Index_Count,GL_UNSIGNED_SHORT,NULL);
1) Why am I binding the IBO and not the VBO (and is it possible to discard the VBO id after generating the buffer)?
2) Why is const GLvoid *indices of glDrawElements set to NULL?
1) At least in my tutorial, both the VBO and IBO are being bound. Learn the OpenGL state machine well: when you bind something, any operation you perform afterwards uses the currently bound object.
2) Pasted directly from my VBO Tutorial
Now when we call glVertexPointer() to set our vertex data, it will get it from the memory of our VBO. Since we want it to get the vertex data from the start of the VBO, we give it an address of NULL (which is equal to 0).
This topic is closed to new replies.