
UV problem with FBX and DX11


Jacob Mnasin    160

Hello All!
Came by here today hoping somebody has experience loading models into DX11 using the FBX SDK. I am having a weird problem when I load my texture coordinates: some models render correctly, while others render with the "U" flipped. An example is below:

[Image: uvissue.jpg]

The spaceship is rendered correctly, while the planet texture is flipped. Both use the same loading code; the only difference is the file-path string I pass in for each model.

 

I load in the vertices from FBX in the following manner:

	int counter = 0;
	// Loop through the polygons
	for (int p = 0; p < fbxMesh->GetPolygonCount(); p++)
	{
		// Loop through the vertices within each polygon (3 per polygon for a triangulated mesh)
		for (int v = 0; v < fbxMesh->GetPolygonSize(p); v++)
		{
			const int controlPointIndex = fbxMesh->GetPolygonVertex(p, v);

			VertexStruct vS;
			vS.position = FBXD3DVector3(fbxMesh->GetControlPointAt(controlPointIndex));

			// UVs: negate V to convert to the D3D convention (V = 0 at the top)
			FbxVector2 tex;
			bool isMapped;
			fbxMesh->GetPolygonVertexUV(p, v, lUVSetName, tex, isMapped);
			vS.uv = D3DXVECTOR2(static_cast<float>(tex[0]), static_cast<float>(-tex[1]));

			FbxVector4 norm;
			fbxMesh->GetPolygonVertexNormal(p, v, norm);
			vS.normal = D3DXVECTOR3(static_cast<float>(-norm[0]),
			                        static_cast<float>(-norm[1]),
			                        static_cast<float>(-norm[2]));

			m_Vertices.push_back(vS);

			// One index per emitted vertex (vertices are not shared)
			m_Indices.push_back(static_cast<unsigned int>(counter));
			counter++;
		}
	}

I tried looking on the FBX forums but had no luck. I could hard-code a fix for each model individually, but I'm trying to build a system where all my resources are loaded the same way. Any help would be greatly appreciated. Thanks!

MJP    19786

This is typical when using content from DCC packages in D3D. D3D specifies that V = 0 is the top of the texture and OpenGL specifies that V = 0 is the bottom, so you usually have to flip the V coordinates (and the bitangents!) when generating meshes for D3D.
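For reference, here is a minimal sketch of that V flip. The helper name ToD3DUV is made up for illustration (it is not part of the FBX SDK or D3DX), and it assumes the FbxVector2/D3DXVECTOR2 types already used in this thread:

#include <fbxsdk.h>
// D3DXVECTOR2 comes from whichever D3DX math header the project already includes.

// Illustrative helper: convert an exporter-style UV (V = 0 at the bottom of the
// texture) to the D3D convention (V = 0 at the top).
inline D3DXVECTOR2 ToD3DUV(const FbxVector2& uv)
{
    // With wrap addressing, -v and 1 - v sample the same texels,
    // but 1 - v keeps the coordinate inside [0, 1].
    return D3DXVECTOR2(static_cast<float>(uv[0]),
                       1.0f - static_cast<float>(uv[1]));
}

// If tangent frames are imported as well, flip the bitangent sign to match the flipped V.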

Jacob Mnasin    160

Thanks for the reply. Yes, I understand that; however, my problem seems to be with the 'U' coordinates. The planet is mirrored (so you have Asia in the western hemisphere and the Americas in the east). The mesh was just a simple sphere that I exported from 3ds Max: I applied a mesh modifier, applied a texture to it, and unwrapped it. The other models I downloaded from TurboSquid; I tried models from two different artists and both loaded correctly. The same flipping happened when I tried it with a cube.


satanir    1452

Is your coordinate-system handedness correct? 3ds Max is right-handed, while DX is left-handed.

If you export it as RHS, then in DX all of your triangles will face inward, causing back-face culling to cull the wrong triangles. That means you are seeing the inside of the sphere, which creates the appearance of the U coordinate being mirrored.

 

You can test this theory by rotating the sphere: if I'm correct, it will appear to rotate in the wrong direction.
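For what it's worth, here is a rough sketch of the usual right-handed to left-handed fix-up satanir is describing: mirror one axis on the positions and normals, then reverse the triangle winding so back-face culling still keeps the outward-facing triangles. The function name is made up, and it assumes the VertexStruct/m_Vertices/m_Indices layout from the first post; it is not the poster's code.

#include <vector>
#include <utility> // std::swap

// Illustrative only: convert a right-handed mesh to a left-handed one by
// negating Z on positions and normals, then reversing the index winding of
// each triangle so the faces still point outward after the reflection.
void ConvertRHToLH(std::vector<VertexStruct>& vertices,
                   std::vector<unsigned int>& indices)
{
    for (VertexStruct& vert : vertices)
    {
        vert.position.z = -vert.position.z;
        vert.normal.z   = -vert.normal.z;
    }
    for (size_t i = 0; i + 2 < indices.size(); i += 3)
    {
        std::swap(indices[i + 1], indices[i + 2]); // flip winding order
    }
}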

ongamex92    3256

Check your export settings. The problem is probably in the UVW mapping. Use the UVW Map modifier (if you're using 3ds Max) and specify the mapping type explicitly. If you just drag and drop the texture, the default generated mapping will be OpenGL-style (not sure, I always use the UVW Map modifier). You can also tell 3ds Max to generate flipped UVs.

MJP    19786

Quote (Jacob Mnasin): "Yes, I understand that; however, my problem seems to be with the 'U' coordinates. The planet is mirrored..."

Whoops, I misread your post. Sorry about that.

Jacob Mnasin    160

Satanir:

This is what I originally thought as well. However, culling was turned on and it was culling the correct triangles. I inverted the mesh in 3ds Max and that did indeed fix the texture, but then I was seeing the inside of the sphere. Also, if that were the case, I would assume the spaceship's textures would have been inverted as well, but they come out just fine.
I am sure the sphere is being drawn correctly, so I don't understand why I am getting such mixed results.

imoogiBG:
I have used UV Unwrap and UV Xform, and I even tried the UVW Map modifier as you suggested. You're right that I could just flip the texture backwards in 3ds Max so that it comes out correctly when I load it, but I would really like to get to the source of this; I'm not sure what other problems I might run into down the line. I just find it weird that complex objects come out correctly while standard primitives like cubes and spheres come out like this. I think that may play some kind of role.

Jacob Mnasin    160

Satanir, I guess you were right; I feel like a dunce now.

After a little reading, I found that all I had to do was swap the Y and Z components of the vertex positions (and normals), which converts 3ds Max's Z-up right-handed coordinates into D3D's Y-up left-handed convention, like so:

FbxVector4 position = fbxMesh->GetControlPointAt(controlPointIndex);
// Swap Y and Z: 3ds Max's Z-up right-handed space becomes D3D's Y-up left-handed space
vS.position = D3DXVECTOR3(static_cast<float>(position[0]),
                          static_cast<float>(position[2]),
                          static_cast<float>(position[1]));

FbxVector2 tex;
bool isMapped;
fbxMesh->GetPolygonVertexUV(p, v, lUVSetName, tex, isMapped);
// Negate V for the D3D texture-coordinate convention
vS.uv = D3DXVECTOR2(static_cast<float>(tex[0]), static_cast<float>(-tex[1]));

FbxVector4 norm;
fbxMesh->GetPolygonVertexNormal(p, v, norm);
// Apply the same Y/Z swap to the normals
vS.normal = D3DXVECTOR3(static_cast<float>(-norm[0]),
                        static_cast<float>(-norm[2]),
                        static_cast<float>(-norm[1]));

Thanks for the help. Hopefully this will also help someone else with the same issue.

