
Converting from immediate mode to VBOs with glDrawElements


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

7 replies to this topic

#1 Clancy   Members   -  Reputation: 124


Posted 14 February 2013 - 03:14 PM

This draws a box.

 

    glBegin ( GL_LINE_STRIP );
        glVertex2i ( x0, y0 );
        glVertex2i ( x1, y0 );
        glVertex2i ( x1, y1 );
        glVertex2i ( x0, y1 );
        glVertex2i ( x0, y0 );
    glEnd ();
 

Now, to convert it to use VBO I have done

 

 

    GLint vertices[] = {x0,y0,x1,y0,x1,y1,x0,y1,x0,y0 };
    unsigned int indices[] = {0, 1, 2, 3,0};

    glBindBuffer(GL_ARRAY_BUFFER, buffers[0]);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_DYNAMIC_DRAW);

    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(2, GL_INT, 0, NULL);
    glDrawElements(GL_LINE_STRIP, 5, GL_UNSIGNED_INT, indices);
    glDisableClientState(GL_VERTEX_ARRAY);
 

 

I must be doing something stupid, since it isn't working.  Instead of a box, nothing visible is drawn.

Not getting any GL errors either.

I can toggle the code above to go back to old code, and it works as it should.

 

 

 

Where did I mess up?


Edited by Clancy, 14 February 2013 - 03:59 PM.



#2 _Slin_   Members   -  Reputation: 202


Posted 14 February 2013 - 06:34 PM

vertices is a pointer, which means that sizeof(vertices) will always be 4 or 8; what you want is something like 4*sizeof(GLint).



#3 Brother Bob   Moderators   -  Reputation: 8429


Posted 14 February 2013 - 06:40 PM

vertices is a pointer, which means that sizeof(vertices) will always be 4 or 8; what you want is something like 4*sizeof(GLint).

No, vertices is an array and sizeof(vertices) correctly returns its size.



#4 _Slin_   Members   -  Reputation: 202


Posted 14 February 2013 - 07:02 PM

No, vertices is an array and sizeof(vertices) correctly returns its size.

If you do something like this: GLint vertices[] = {x0,y0,x1,y0,x1,y1,x0,y1,x0,y0};

the value of "vertices" will be the address of the first element, and the [index] operator does nothing but pointer arithmetic to access the elements. You can access the first element as *vertices. vertices should have the type GLint*, and sizeof should return the corresponding size.

If I am wrong, I will of course accept that, but that would kinda destroy everything I thought I understood about programming...

 

Edit: Okay, I googled and tried it myself and actually sizeof returns the correct array size :)


Edited by _Slin_, 14 February 2013 - 07:12 PM.


#5 mhagain   Crossbones+   -  Reputation: 8136


Posted 14 February 2013 - 07:16 PM

No, vertices is an array and sizeof(vertices) correctly returns its size.

If you do something like this: GLint vertices[] = {x0,y0,x1,y0,x1,y1,x0,y1,x0,y0};

the value of "vertices" will be the address of the first element, and the [index] operator does nothing but pointer arithmetic to access the elements. You can access the first element as *vertices. vertices should have the type GLint*, and sizeof should return the corresponding size.

If I am wrong, I will of course accept that, but that would kinda destroy everything I thought I understood about programming...

 

Why not printf sizeof (vertices) and find out?  But Brother Bob is right; if it were *vertices it would be different, of course.

 

The OP's code looks correct to me, assuming that buffers[0] is a valid buffer object name created via glGenBuffers (if that's not the case then there's the problem).  An outside possibility is a driver bug causing problems with a GL_INT vertex type; maybe try changing that to floats and see if the problem reproduces?  Another test might be to use:

 

glBegin (GL_LINE_STRIP);
glArrayElement (0);
glArrayElement (1);
glArrayElement (2);
glArrayElement (3);
glArrayElement (0);
glEnd ();

That's functionally equivalent to the glDrawElements call but - just using it as a test - may help to shed some light on what's happening here.


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#6 Brother Bob   Moderators   -  Reputation: 8429


Posted 14 February 2013 - 07:18 PM

vertices is the array itself, not a pointer to the first element. There are many places where the array can decay into a pointer, which is the primary source of confusion in situations like this. But in the end, an array is not a pointer, nor is a pointer an array. Both can be subscripted, both can be used in pointer contexts (such as passing an array to a function that expects a pointer, in which case the pointer decay I mentioned takes place), but they are distinct types.

 

Try this for example:

#include <iostream>
#include <typeinfo>

int main()
{
    int foo1[] = {1, 2, 3};
    int *foo2 = foo1;

    std::cout << "type=" << typeid(foo1).name() << ", size=" << sizeof(foo1) << std::endl;
    std::cout << "type=" << typeid(foo2).name() << ", size=" << sizeof(foo2) << std::endl;
}

This is what VS2012 for x64 shows for me:

type=int [3], size=12
type=int * __ptr64, size=8

It shows that the array has the type of three integers and the size of three integers, and that the pointer is of pointer type and is the size of a pointer. The assignment of foo1 to foo2 demonstrates how the array decays into a pointer and is thus a valid assignment.



#7 _Slin_   Members   -  Reputation: 202


Posted 14 February 2013 - 07:28 PM

Thank you Brother Bob, I didn't know that :).



#8 Clancy   Members   -  Reputation: 124


Posted 14 February 2013 - 07:58 PM

The OP's code looks correct to me, assuming that buffers[0] is a valid buffer object name created via glGenBuffers (if that's not the case then there's the problem). An outside possibility is a driver bug causing problems with a GL_INT vertex type; maybe try changing that to floats and see if the problem reproduces? Another test might be to use:



glBegin (GL_LINE_STRIP);
glArrayElement (0);
glArrayElement (1);
glArrayElement (2);
glArrayElement (3);
glArrayElement (0);
glEnd ();

That's functionally equivalent to the glDrawElements call but - just using it as a test - may help to shed some light on what's happening here.

The glGenBuffers setup I use is correct; not really sure how anyone can screw that up. ;)

 

However, using your little example with glArrayElement  *does* work.

 

So, I guess I am chasing a driver issue of some kind... though, I need to get back to my main machine to test, instead of this crappy laptop that has an Intel HD 3000 in it.
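One more difference between the two paths that might be worth ruling out: with only GL_ARRAY_BUFFER bound, glDrawElements reads the index array from client memory, while glArrayElement doesn't touch the indices at all. The usual end point of this conversion is to put the indices in a buffer object too. A minimal sketch, assuming buffers[1] is a second name returned by glGenBuffers (that name is an assumption, not something from the thread):

```c
/* Sketch: store the indices in a GL_ELEMENT_ARRAY_BUFFER as well.
   Assumes buffers[1] is a second buffer name from glGenBuffers. */
glBindBuffer(GL_ARRAY_BUFFER, buffers[0]);
glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_DYNAMIC_DRAW);

glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, buffers[1]);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, sizeof(indices), indices, GL_STATIC_DRAW);

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_INT, 0, NULL);
/* With an element array buffer bound, the last argument becomes a byte
   offset into that buffer rather than a client-memory pointer. */
glDrawElements(GL_LINE_STRIP, 5, GL_UNSIGNED_INT, NULL);
glDisableClientState(GL_VERTEX_ARRAY);
```

This is a fragment, not a complete program: it needs a current GL context and the same vertices/indices arrays as the snippet in the first post.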





