
ijc

Texture coord buffer using elements


Hi, I'm having problems specifying texture coordinate arrays for the element array rather than the vertex array. In all the tutorials I've studied, each vertex has one set of texture coordinates. I reference each vertex by ID to form quad or triangle elements, so I want to specify one set of tex coords per vertex reference. How do I go about doing this? Thanks, IJC

The best way is to consolidate the mesh, so that each vertex specifies a unique combination of attributes, which could be position, tex coord, colour and normal (whichever of those you're using), and store those attributes in each vertex.

For example, suppose you have a cube mesh with 8 vertices and 6 quads, where each quad specifies 4 point normals and 4 vertex IDs. To consolidate, you'd loop through each quad in the mesh, building a list of vertices with a unique combination of attributes. Since the cube isn't smooth shaded, all 4 vertices connected to each quad will be unique, because the normals aren't shared between the unconsolidated vertices. You should end up, in this case, with 24 vertices (4 per quad), each with their own position and normal data. That is the worst-case scenario, BTW.

If you want to do this, it's important to have a way of mapping the original vertex indices to the new vertex indices so that you can update the primitive indices. I use a hash table for this.
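Here's a rough sketch of that consolidation step in C++. The names (ConsolidatedVertex, CornerKey, ConsolidateCorner) and the exact attribute layout are made up for illustration, and std::map stands in for the hash table mentioned above:

#include <map>
#include <vector>

// One unique combination of attributes becomes one output vertex.
struct ConsolidatedVertex
{
    float pos[3];
    float normal[3];
    float texcoord[2];
};

// Key identifying a corner of a quad: original vertex ID plus the
// indices of the normal and tex coord used at that corner.
struct CornerKey
{
    int vertexID, normalID, texcoordID;
    bool operator< (const CornerKey &o) const
    {
        if (vertexID != o.vertexID) return vertexID < o.vertexID;
        if (normalID != o.normalID) return normalID < o.normalID;
        return texcoordID < o.texcoordID;
    }
};

// Called for each corner of each quad: returns the index of the
// consolidated vertex, creating it the first time a combination is seen.
unsigned int ConsolidateCorner (const CornerKey &key,
                                const ConsolidatedVertex &attribs,
                                std::map<CornerKey, unsigned int> &remap,
                                std::vector<ConsolidatedVertex> &outVertices)
{
    std::map<CornerKey, unsigned int>::iterator it = remap.find (key);
    if (it != remap.end ())
        return it->second;               // combination already emitted, reuse its index

    unsigned int newIndex = (unsigned int) outVertices.size ();
    outVertices.push_back (attribs);     // new unique combination, append it
    remap[key] = newIndex;
    return newIndex;
}

You'd call ConsolidateCorner for each of the 4 corners of every quad and collect the returned indices into your new element array, which is then ready for glDrawElements.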

Share this post


Link to post
Share on other sites
OK, but how do you then set the vertex buffer pointers using glVertexPointer(...) and glTexCoordPointer(...), and then render the buffer contents with glDrawElements(...)?

Having a non-zero stride value would allow the vertex IDs, normals, tex coords etc. to be consolidated into a struct or class, but glTexCoordPointer will look for the actual tex coords rather than an ID.

glClientActiveTexture (GL_TEXTURE0_ARB);
glEnableClientState (GL_TEXTURE_COORD_ARRAY);
glTexCoordPointer (2, GL_FLOAT, sizeof (MVertex), 0);

glEnableClientState (GL_COLOR_ARRAY);
glColorPointer (4, GL_FLOAT, sizeof (MVertex), OFFSET (2, float));

glEnableClientState (GL_NORMAL_ARRAY);
glNormalPointer (GL_FLOAT, sizeof (MVertex), OFFSET (6, float));

glEnableClientState (GL_VERTEX_ARRAY);
glVertexPointer (3, GL_FLOAT, sizeof (MVertex), OFFSET (9, float));


The stride is simply set to the size of your vertex struct/class. In my code, OFFSET is simply a macro for (sizeof(float) * i), which gives the relative address of the attribute within the vertex struct/class. The above code is for vertex buffer objects; for standard vertex arrays you can just add the address of your array of vertices to the offset.
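The MVertex definition isn't shown above, but a layout consistent with those offsets (tex coord at 0, colour at 2 floats, normal at 6 floats, position at 9 floats) would look something like this; the struct members and the OFFSET macro here are an assumed reconstruction, not the original code:

#include <GL/gl.h>

// Assumed interleaved layout matching the offsets used in the pointer calls:
// 2 floats tex coord, 4 floats colour, 3 floats normal, 3 floats position.
struct MVertex
{
    float texcoord[2];    // float offset 0
    float colour[4];      // float offset 2
    float normal[3];      // float offset 6
    float position[3];    // float offset 9
};

// Assumed macro: byte offset of the i-th element of the given type, cast to
// the pointer type the gl*Pointer calls expect when a VBO is bound.
#define OFFSET(i, type) ((const GLvoid *) (sizeof (type) * (i)))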

I'm not entirely sure what you're talking about with respect to vertex IDs... Vertices in vertex arrays or vertex buffer objects sorta automatically have IDs. When you use glDrawElements, element number 0 is the first vertex in the array, 1 is the second, etc., just like an array.
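For example (the index values and primitive type here are just placeholders, and this assumes standard vertex arrays with no element array buffer bound):

// Two triangles built from the first four vertices of the bound arrays.
GLuint indices[] = { 0, 1, 2,   2, 1, 3 };

// Each index is looked up in the currently bound vertex/tex coord/normal arrays,
// so index 0 pulls every attribute of the first MVertex, index 1 the second, etc.
glDrawElements (GL_TRIANGLES, 6, GL_UNSIGNED_INT, indices);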
