# UV problem with FBX and DX11

## Recommended Posts

Hello All!
Came by here today hoping somebody has some experience loading models into DX11 using the FBX SDK. I am having a weird problem when I load in my texture coordinates: some models render correctly, while others render with the "U" flipped. An example below:

The spaceship is rendered correctly, while the planet texture is flipped. Both are loaded with the same code; the only difference is the string I pass in telling it where to fetch the file from.

I load in the vertices from FBX in the following manner:

    int counter = 0;
    // Loop through the polygons
    for (int p = 0; p < fbxMesh->GetPolygonCount(); p++)
    {
        // Loop through the three vertices within each polygon
        for (int v = 0; v < fbxMesh->GetPolygonSize(p); v++)
        {
            const int controlPointIndex = fbxMesh->GetPolygonVertex(p, v);

            VertexStruct vS;
            vS.position = FBXD3DVector3(fbxMesh->GetControlPointAt(controlPointIndex));

            FbxVector2 tex;
            bool isMapped;
            fbxMesh->GetPolygonVertexUV(p, v, lUVSetName, tex, isMapped);
            vS.uv = D3DXVECTOR2(tex[0], -tex[1]);

            FbxVector4 norm;
            fbxMesh->GetPolygonVertexNormal(p, v, norm);
            vS.normal = D3DXVECTOR3(-norm[0], -norm[1], -norm[2]);

            m_Vertices.push_back(vS);
            m_Indices.push_back(static_cast<unsigned int>(counter));

            counter++;
        }
    }



Tried looking on the FBX forums but had no luck. I could hard-code the fix for each model individually, but I was hoping to build a system where all my resources load the same way. Any help would be greatly appreciated. Thanks!

##### Share on other sites

This is typical when using content from DCC packages in D3D. D3D specifies that V = 0 is the top of the texture and OpenGL specifies that V = 0 is the bottom, so you usually have to flip the V coordinates (and the bitangents!) when generating meshes for D3D.
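
For example, the conversion is usually just a flip when you read the UV back. A minimal sketch (`ToD3DUV` is a made-up helper name; the D3DX and FBX types match the code in the first post):

    // Convert a UV from the DCC/OpenGL convention (V = 0 at the bottom
    // of the texture) to the D3D convention (V = 0 at the top).
    inline D3DXVECTOR2 ToD3DUV(const FbxVector2& tex)
    {
        // 1 - v and -v land on the same texel under a WRAP address mode,
        // since they differ by exactly one texture repeat.
        return D3DXVECTOR2((FLOAT)tex[0], 1.0f - (FLOAT)tex[1]);
    }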

##### Share on other sites

Thanks for the reply. Yes, I understand that; however, my problem seems to be with the 'U' coordinates. The planet is flipped (so you have Asia in the western hemisphere and the Americas in the east). The mesh was just a simple sphere that I exported from 3ds Max: I applied a mesh modifier, applied a texture to it, and unwrapped it. The other models I downloaded from TurboSquid; I tried two different artists and both loaded correctly. The same thing happened when I tried it on a cube.

Edited by JacobM

##### Share on other sites

Is your coordinate system handedness correct? 3ds Max is right-handed, while DX is left-handed.

If you export it as RHS, then in DX all of your triangles will be facing inward, causing back-face culling to cull the wrong triangles. That means you are seeing the inside of the sphere, which creates the appearance of U-coordinate mirroring.

You can test this theory by rotating the sphere; if I'm correct, it will appear to rotate in the wrong direction.
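
If that turns out to be the case, one quick fix (instead of re-exporting) is to tell D3D that counter-clockwise triangles are front-facing. A minimal sketch using the standard D3D11 rasterizer state, where `device` and `context` stand in for your ID3D11Device and ID3D11DeviceContext:

    // Treat counter-clockwise triangles as front-facing, which matches
    // geometry exported from a right-handed package without re-winding it.
    D3D11_RASTERIZER_DESC rd = {};
    rd.FillMode              = D3D11_FILL_SOLID;
    rd.CullMode              = D3D11_CULL_BACK;
    rd.FrontCounterClockwise = TRUE;  // D3D's default is FALSE (clockwise = front)
    rd.DepthClipEnable       = TRUE;

    ID3D11RasterizerState* rasterState = nullptr;
    if (SUCCEEDED(device->CreateRasterizerState(&rd, &rasterState)))
        context->RSSetState(rasterState);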

##### Share on other sites

Check your export settings. The problem is probably in the UVW mapping. Use the UVW Map modifier (if you're using 3ds Max) and specify the mapping type. If you just drag and drop the texture, the mapping generated by default will be OpenGL-style (not sure; I always use the UVW Map modifier). You can also tell 3ds Max to generate flipped UVs.


##### Share on other sites

Satanir:

This is what I originally thought as well. However, culling was turned on and it was culling the correct triangles. I inverted the mesh in 3ds Max and that did fix the texture, but then I was seeing the inside of the sphere. Also, if that were the case, I'd assume the spaceship's textures would have been inverted as well, but they come out just fine.
I am sure that the sphere is being drawn correctly, so I don't understand why I am getting such mixed results.

imoogiBG:
I have used the UV unwrap and the UV Xform; I even tried the UVW Map modifier as you suggested. And you're right, I could just flip the texture backwards in 3ds Max so that it comes out correctly when I load it, but I would really like to get to the source of this, since I'm not sure what other problems I may run into down the line. I just find it weird that complex objects come out correctly while standard primitives like cubes and spheres come out like this; I think that may play some kind of role.

##### Share on other sites

Satanir, I guess you were right; I feel like a dunce now.

After a little reading, I found that all I had to do was switch the position Y and position Z of the verts (and likewise for the normals), like so:

    FbxVector4 position = fbxMesh->GetControlPointAt(controlPointIndex);
    // Swap Y and Z to convert from 3ds Max's right-handed, Z-up system
    // to D3D's left-handed, Y-up system.
    vS.position = D3DXVECTOR3(position[0], position[2], position[1]);

    FbxVector2 tex;
    bool isMapped;
    fbxMesh->GetPolygonVertexUV(p, v, lUVSetName, tex, isMapped);
    // Flip V for D3D's top-left texture origin.
    vS.uv = D3DXVECTOR2(tex[0], -tex[1]);

    FbxVector4 norm;
    fbxMesh->GetPolygonVertexNormal(p, v, norm);
    // Same Y/Z swap for the normal.
    vS.normal = D3DXVECTOR3(-norm[0], -norm[2], -norm[1]);


Thanks for the help. Hopefully this will also help someone else with the same issue.
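
In case it helps anyone following along, here is the same fix gathered into one place. A minimal sketch: `ConvertFbxVertex` is a made-up helper name, and `VertexStruct` is assumed to be the struct from the first post.

    // Convert one FBX vertex from 3ds Max's right-handed, Z-up system to
    // D3D's left-handed, Y-up system, using the same swaps as above.
    inline VertexStruct ConvertFbxVertex(const FbxVector4& position,
                                         const FbxVector2& tex,
                                         const FbxVector4& norm)
    {
        VertexStruct vS;
        vS.position = D3DXVECTOR3((FLOAT)position[0], (FLOAT)position[2], (FLOAT)position[1]);
        vS.uv       = D3DXVECTOR2((FLOAT)tex[0], -(FLOAT)tex[1]);   // flip V for D3D
        // The sign flip on the normals is what worked with this exporter;
        // depending on your export settings you may not need it.
        vS.normal   = D3DXVECTOR3(-(FLOAT)norm[0], -(FLOAT)norm[2], -(FLOAT)norm[1]);
        return vS;
    }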
