OpenGL texture coordinates problem


Hi everyone, this is my first post here; I think I am in the right area.

I am working on making my own OpenGL graphics engine, mostly to learn OpenGL.

To start off, I downloaded the Qt OBJ viewer a while ago from this link http://qt.gitorious....abs/modelviewer and have been modifying it to fit my needs. It is my first time using OpenGL. Right now I am working on texturing objects, and I cannot tell whether my OpenGL code is wrong or whether I am writing my UV coordinates incorrectly. As it stands, I write the UV coords in the same order as they appear in the OBJ file. It appears that part of my texture is correct. Here is an image, and below is my code.

As a side note, reading OBJ files with multiple objects and materials has worked just fine. The next step is texturing, which is giving me some trouble.

[Image: jMMqv.png]

As you can see, the UV coords go into a Point3d class; u_v_coords is a QVector<Point3d>.



...
else if (id == "vt") {
    Point3d p;
    for (int i = 0; i < 2; ++i) {
        ts >> p[i];
    }
    u_v_coords << p;
}
...



The OpenGL code for drawing a textured object is below. I am using glTexCoordPointer with a stride of sizeof(Point3d).




void Model::renderHelper(int i) const {
    if (texture != NULL) {
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, texture->texture);
        glTexCoordPointer(2, GL_FLOAT, sizeof(Point3d), u_v_coords.data());
        glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    }
    glMaterialfv(GL_FRONT, GL_AMBIENT, materials[objectMaterialInt].Ka);
    glMaterialfv(GL_FRONT, GL_DIFFUSE, materials[objectMaterialInt].Kd);
    glMaterialfv(GL_FRONT, GL_SPECULAR, materials[objectMaterialInt].Ks);
    glMaterialf(GL_FRONT, GL_SHININESS, materials[objectMaterialInt].Ns);
    glShadeModel(GL_SMOOTH);
    glPushMatrix();
    glTranslatef(position[0], position[1], position[2]);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, (float *)m_points.data());
    glNormalPointer(GL_FLOAT, 0, (float *)m_normals.data());
    glDrawElements(GL_TRIANGLES, objects.size(), GL_UNSIGNED_INT, objects.data());
    glPopMatrix();
    if (texture != NULL) {
        glDisable(GL_TEXTURE_2D);
        glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    }
}



Thanks for the help, let me know if you need more information.

I think the way you are using glDrawElements is wrong for Wavefront OBJ.

Let's take a look at some possible .obj data, for example:

vt 1.0 0.0 0.0
vt 0.0 0.0 0.0
(note: the vertex and texture coordinate arrays can have different sizes)
v 10.0 20.0 30.0
v 20.0 10.0 30.0
v 30.0 20.0 10.0
v 10.0 10.0 30.0
f 1/1 2/1 3/1
f 1/2 3/2 4/2
(the faces index the vertex array and the texture coordinate array independently, which glDrawElements, with its single index array, does not support)

So you need to expand (de-index) your mesh so that each face corner gets its own set of attributes, and finally draw with glDrawArrays...
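
For illustration, here is a rough sketch of that idea (the struct and function names are made up, not from the original loader): read the separate v/vt indices from each triangulated "f v/vt v/vt v/vt" line into two index lists, then expand the attribute arrays so they line up one-to-one.

#include <cstddef>
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

struct Vec3 { float x, y, z; };   // stand-ins for the loader's own point types
struct Vec2 { float u, v; };

// Parse one triangulated "f v/vt v/vt v/vt" line into two separate,
// 0-based index lists (OBJ indices are 1-based).
void parseFaceLine(const std::string &line,
                   std::vector<unsigned> &posIndices,
                   std::vector<unsigned> &uvIndices)
{
    std::istringstream ss(line);
    std::string tag;           // consumes the leading "f"
    ss >> tag;
    std::string corner;        // e.g. "1/1"
    while (ss >> corner) {
        unsigned v = 0, vt = 0;
        if (std::sscanf(corner.c_str(), "%u/%u", &v, &vt) == 2) {
            posIndices.push_back(v - 1);
            uvIndices.push_back(vt - 1);
        }
    }
}

// Expand ("de-index") the data so every face corner gets its own vertex and
// texture coordinate; the result can be drawn with glDrawArrays.
void expand(const std::vector<Vec3> &positions,
            const std::vector<Vec2> &texcoords,
            const std::vector<unsigned> &posIndices,
            const std::vector<unsigned> &uvIndices,
            std::vector<Vec3> &outPositions,
            std::vector<Vec2> &outTexcoords)
{
    for (std::size_t i = 0; i < posIndices.size(); ++i) {
        outPositions.push_back(positions[posIndices[i]]);
        outTexcoords.push_back(texcoords[uvIndices[i]]);
    }
}

After that, positions and texture coordinates share the same implicit index and can be passed straight to glVertexPointer / glTexCoordPointer.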
Best wishes, FXACE.

Just a small update: I tried two different Wavefront OBJ files; in one there are more UV coords than vertices, and in the other there are more vertices than UV coords. So I don't know if re-triangulating is the answer; maybe it is, but I am not totally sure how to do that.

Also, some of my texture coordinates are not normalized. Most of them are between 0 and 1, but some are as high as 20 and as low as -20. I tried normalizing them all, but it didn't fix the texturing issue.


Just a small update: I tried two different Wavefront OBJ files; in one there are more UV coords than vertices, and in the other there are more vertices than UV coords. So I don't know if re-triangulating is the answer; maybe it is, but I am not totally sure how to do that.



For example, you have this:

// a quad
vec3 vertices[] = {
    vec3(-10, 0, -10),
    vec3( 10, 0, -10),
    vec3( 10, 0,  10),
    vec3(-10, 0,  10)
};

vec2 texcoords[] = {
    vec2(0, 0),
    vec2(1, 0),
    vec2(1, 1),
    vec2(0, 1)
};

unsigned int indices[] = {
    0, 1, 2,
    0, 2, 3
};


Then you need to do this:

vec3 *linked_vertices = new vec3[6 /* # of indices your mesh holds */];
vec2 *linked_tc = new vec2[6 /* # of indices your mesh holds */];
for (unsigned int i = 0; i < 6; i += 3) // 6 = # of indices (see above); step of 3 = one triangle per iteration
{
    // copy one triangle's vertices and texture coordinates
    linked_vertices[i]     = vertices[indices[i]];
    linked_vertices[i + 1] = vertices[indices[i + 1]];
    linked_vertices[i + 2] = vertices[indices[i + 2]];

    linked_tc[i]     = texcoords[indices[i]];
    linked_tc[i + 1] = texcoords[indices[i + 1]];
    linked_tc[i + 2] = texcoords[indices[i + 2]];
}

// And finally
glVertexPointer(...);
glTexCoordPointer(...);

glDrawArrays(GL_TRIANGLES, 0, 6); // '6' - see above.





Also, some of my texture coordinates are not normalized. Most of them are between 0 and 1, but some are as high as 20 and as low as -20. I tried normalizing them all, but it didn't fix the texturing issue.


Texture coordinates are not vectors like normals or tangents, so they don't have to be normalized; if your mesh contains those values, it was planned that way. Did you know that UVW mapping has different wrap modes that control how the texture's pixels are read from the input texture coordinates (UVW): GL_CLAMP (GL_CLAMP_TO_BORDER), GL_REPEAT, GL_MIRRORED_REPEAT, GL_CLAMP_TO_EDGE? For meshes, GL_REPEAT is used by default.
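
If you ever want to control the wrapping yourself, it can be set per texture while the texture is bound, for example (just a sketch, reusing the texture->texture handle from the renderHelper code above):

glBindTexture(GL_TEXTURE_2D, texture->texture);
// Choose how coordinates outside [0, 1] are treated; GL_REPEAT is the default.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);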

Best wishes, FXACE.

Thank you very much, texturing works now.

Just to be clear, in case someone else reads this: linked_tc should be filled using the UV indices from the OBJ file, not the vertex indices.

Another thing that had me stumped for a bit was calculating normals after reorganizing the coordinates. Doing it afterwards means you can no longer use adjacent vertices to calculate the normals, so you only get the normal of the face, which stops GL_SMOOTH from working. I had to calculate the normals before the reorganization, using adjacent vertices, and then, when reorganizing the vertices with the indices from the OBJ file, reorder the normals using the same indices.
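
In case it helps anyone, a minimal sketch of that last step (smoothNormals and posIndices are placeholder names in the same style as the expansion example above, not the actual variables in my loader):

// Expand the smooth per-vertex normals with the same vertex indices used for
// the positions, so each de-indexed vertex keeps its averaged normal.
std::vector<Vec3> expandedNormals;
expandedNormals.reserve(posIndices.size());
for (std::size_t i = 0; i < posIndices.size(); ++i)
    expandedNormals.push_back(smoothNormals[posIndices[i]]);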
