Rendering faces
I'm loading and rendering ASE files. The only problem is that there are more texture coordinates than there are vertices listed in the mesh definition; there are as many texture coordinate entries as there are faces. So I'm trying to set up a class to encapsulate all that, but I can't directly match up all the vertices with texture coordinates unless I increase the number of vertices, which is something I don't want to do. Is there any way around this? Any ideas on how to dynamically change which texture coordinates belong to a vertex? Thanks for the help.
Another question about this: why would there be more texture coordinates listed than there are vertices?
There might be more texture coordinates than there are vertices because the textures on two faces might not meet at the same place on a shared vertex. In other words, two neighboring faces might not share the same texture coordinates at a common vertex.
When you set up your vertex buffer you are going to have to use something like:
struct Vertex
{
    D3DVECTOR p;   // position (x, y, z)
    D3DVECTOR n;   // vertex normal
    float u0, v0;  // texture coordinates
};
for every vertex, so you might as well read the data into something like that anyway.
You can't really use separate arrays for each attribute unless you want to mess with writing your own vertex shader, which I wouldn't advise for this situation.
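To make the idea concrete, here is a minimal sketch of the usual fix: duplicate a position only when a face corner pairs it with a texture coordinate it hasn't been paired with before, so shared vertices stay shared except across UV seams. The names (`Vec3`, `Vertex`, `BuildVertexBuffer`) and the `(position index, texcoord index)` corner layout are assumptions for illustration, not the ASE format's actual field names; `Vec3` stands in for `D3DVECTOR` to keep the sketch self-contained.

```cpp
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

// Stand-in for D3DVECTOR (assumption, to keep the sketch self-contained).
struct Vec3 { float x, y, z; };

struct Vertex {
    Vec3  p;    // position
    float u, v; // texture coordinates
};

// Each face corner in the file gives a position index and a separate
// texcoord index.  Build one interleaved vertex array by duplicating a
// position only when it appears with a texcoord pairing we haven't seen.
void BuildVertexBuffer(const std::vector<Vec3>& positions,
                       const std::vector<std::pair<float, float>>& texcoords,
                       const std::vector<std::pair<int, int>>& corners,
                       std::vector<Vertex>& outVerts,
                       std::vector<int>& outIndices)
{
    // (position index, texcoord index) -> index into outVerts
    std::map<std::pair<int, int>, int> seen;
    for (const auto& c : corners) {
        auto it = seen.find(c);
        if (it == seen.end()) {
            Vertex vtx;
            vtx.p = positions[c.first];
            vtx.u = texcoords[c.second].first;
            vtx.v = texcoords[c.second].second;
            it = seen.insert({c, (int)outVerts.size()}).first;
            outVerts.push_back(vtx);
        }
        outIndices.push_back(it->second);
    }
}
```

A vertex that carries the same UVs on every face that touches it is emitted once and reused through the index buffer; only seam vertices get split, so the vertex count grows far less than one-vertex-per-face-corner would.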