Converting from immediate mode to VBOs with glDrawElements

Started by
6 comments, last by Clancy 11 years, 2 months ago

This draws a box.


    glBegin ( GL_LINE_STRIP );
        glVertex2i ( x0, y0 );
        glVertex2i ( x1, y0 );
        glVertex2i ( x1, y1 );
        glVertex2i ( x0, y1 );
        glVertex2i ( x0, y0 );
    glEnd ();
 

Now, to convert it to use VBO I have done


    GLint vertices[] = { x0, y0, x1, y0, x1, y1, x0, y1, x0, y0 };
    unsigned int indices[] = { 0, 1, 2, 3, 0 };

    glBindBuffer(GL_ARRAY_BUFFER, buffers[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_DYNAMIC_DRAW);

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_INT, 0, NULL);
    glDrawElements(GL_LINE_STRIP, 5, GL_UNSIGNED_INT, indices);
    glDisableClientState(GL_VERTEX_ARRAY);
 

I must be doing something stupid, since it isn't working. Instead of a box, it isn't drawing anything visible.

Not getting any GL errors either.

I can toggle the code above to go back to the old code, and it works as it should.

Where did I mess up?
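
For reference, the only part not shown above is the buffer creation. A minimal sketch of what the init code is assumed to do (just the usual glGenBuffers call that produces buffers[0]):


    // Sketch of the one-time setup the snippet above assumes (done once at init).
    GLuint buffers[1];
    glGenBuffers(1, buffers);    // generates the buffer object name used as buffers[0]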


vertices is a pointer which means that sizeof(vertices) will always be 4 or 8, what you want is something like 4*sizeof(GLint).

No, vertices is an array and sizeof(vertices) correctly returns its size.

If you do something like this: GLint vertices[] = {x0, y0, x1, y0, x1, y1, x0, y1, x0, y0};

the value of "vertices" will be the address of the first element, and the [index] operator does nothing more than pointer arithmetic to access the elements. You can access the first element as *vertices. vertices should have the type GLint*, and sizeof should return the corresponding size.

If I am wrong, I will of course accept that, but that would kinda destroy everything I thought I understood about programming...

Edit: Okay, I googled and tried it myself and actually sizeof returns the correct array size :)

Why not printf sizeof(vertices) and find out? But Brother Bob is right; if it were *vertices it would be different, of course.

The OP's code looks correct to me, assuming that buffers[0] is a valid buffer object name created via glGenBuffers (if that's not the case then there's the problem). An outside possibility is a driver bug causing problems with a GL_INT vertex type; maybe try changing that to floats and see if the problem reproduces? Another test might be to use:


glBegin (GL_LINE_STRIP);
glArrayElement (0);
glArrayElement (1);
glArrayElement (2);
glArrayElement (3);
glArrayElement (0);
glEnd ();

That's functionally equivalent to the glDrawElements call but - just using it as a test - may help to shed some light on what's happening here.
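
If you want to try the float route, here's a rough sketch of what I mean - only the array type and the type argument to glVertexPointer change; everything else (including buffers[0] and the indices) stays as in the original snippet:


    // Hypothetical float variant of the same box, as a test for the GL_INT theory.
    GLfloat vertices[] = { (GLfloat)x0, (GLfloat)y0,
                           (GLfloat)x1, (GLfloat)y0,
                           (GLfloat)x1, (GLfloat)y1,
                           (GLfloat)x0, (GLfloat)y1,
                           (GLfloat)x0, (GLfloat)y0 };
    unsigned int indices[] = { 0, 1, 2, 3, 0 };

    glBindBuffer(GL_ARRAY_BUFFER, buffers[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_DYNAMIC_DRAW);

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, NULL);    // GL_FLOAT instead of GL_INT
    glDrawElements(GL_LINE_STRIP, 5, GL_UNSIGNED_INT, indices);
    glDisableClientState(GL_VERTEX_ARRAY);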


vertices is the array itself, not a pointer to the first element. There are many places where the array can decay into a pointer, which is the primary source of confusion in situations like this. But in the end, an array is not a pointer, nor is a pointer an array. Both can be subscripted, and both can be used in pointer contexts (such as passing an array to a function that expects a pointer, in which case the pointer decay I mentioned takes place), but they are distinct types.

Try this for example:

#include <iostream>
#include <typeinfo>

int main()
{
    int foo1[] = {1, 2, 3};     // an array of three ints
    int *foo2 = foo1;           // foo1 decays to a pointer here

    std::cout << "type=" << typeid(foo1).name() << ", size=" << sizeof(foo1) << std::endl;
    std::cout << "type=" << typeid(foo2).name() << ", size=" << sizeof(foo2) << std::endl;
}

This is what VS2012 for x64 shows for me:

type=int [3], size=12
type=int * __ptr64, size=8

It shows that the array has the type of three integers and the size of three integers, and that the pointer is of pointer type and is the size of a pointer. The assignment of foo1 to foo2 demonstrates how the array decays into a pointer and is thus a valid assignment.
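
As a side note, that decay is also why sizeof stops telling you the array size the moment the array is passed to a function - a small sketch (the printSize helper is hypothetical, purely for illustration):

#include <iostream>

// Hypothetical helper: an array parameter is adjusted to a pointer,
// so sizeof inside the function measures the pointer, not the array.
void printSize(int arr[])    // effectively 'int *arr'
{
    std::cout << "inside the function: " << sizeof(arr) << std::endl;    // 4 or 8
}

int main()
{
    int foo[] = {1, 2, 3};
    std::cout << "at the definition:   " << sizeof(foo) << std::endl;    // 12
    printSize(foo);    // foo decays to a pointer at the call
}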

Thank you Brother Bob, I didn't know that :).

The glGenBuffers call I use is correct; not really sure how anyone could screw that up. ;)

However, using your little example with glArrayElement *does* work.

So, I guess I am chasing a driver issue of some kind... though I need to get back to my main machine to test, instead of this crappy laptop that has an Intel HD 3000 in it.
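
Another variant that might be worth testing on the main machine - just a sketch, and it assumes buffers[] has a second name from glGenBuffers reserved for the indices - is to keep the index data in a buffer object as well, so glDrawElements sources it from GL_ELEMENT_ARRAY_BUFFER instead of client memory:


    // Sketch only: same vertex setup, but with the indices uploaded to
    // a GL_ELEMENT_ARRAY_BUFFER (assumes buffers[1] exists via glGenBuffers).
    glBindBuffer(GL_ARRAY_BUFFER, buffers[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_DYNAMIC_DRAW);

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buffers[1]);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_INT, 0, NULL);
    // With an element buffer bound, the last argument is a byte offset, not a pointer.
    glDrawElements(GL_LINE_STRIP, 5, GL_UNSIGNED_INT, NULL);
    glDisableClientState(GL_VERTEX_ARRAY);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);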

