Jacob Mnasin

DX11 UV problem with FBX and DX11


Hello All!
Came by here today hoping somebody has experience loading models into DX11 using the FBX SDK. I'm having a weird problem when I load my texture coordinates: some models render correctly, while others render with the U coordinate flipped. An example below:
[attached image: uvissue.jpg]

The spaceship is rendered correctly, while the planet texture is flipped. Both are loaded with the same code; the only difference is the file path string I pass in.

 

I load in the vertices from FBX in the following manner:

	int counter = 0;
	//Loop through the polygons
	for (int p = 0; p < fbxMesh->GetPolygonCount(); p++)
	{
		//Loop through the vertices of each polygon (3 if the mesh is triangulated)
		for (int v = 0; v < fbxMesh->GetPolygonSize(p); v++)
		{
			const int controlPointIndex = fbxMesh->GetPolygonVertex(p, v);

			VertexStruct vS;
			vS.position = FBXD3DVector3(fbxMesh->GetControlPointAt(controlPointIndex));

			//lUVSetName is the name of the UV set queried from the mesh earlier
			FbxVector2 tex;
			bool isMapped;
			fbxMesh->GetPolygonVertexUV(p, v, lUVSetName, tex, isMapped);
			vS.uv = D3DXVECTOR2(tex[0], -tex[1]);

			FbxVector4 norm;
			fbxMesh->GetPolygonVertexNormal(p, v, norm);
			vS.normal = D3DXVECTOR3(-norm[0], -norm[1], -norm[2]);

			m_Vertices.push_back(vS);
			m_Indices.push_back(static_cast<unsigned int>(counter));
			counter++;
		}
	}

Tried looking on the FBX forums but had no luck. I could hard-code a fix for each model individually, but I'm trying to build a system where all my resources load generically. Any help would be greatly appreciated. Thanks!


This is typical when using content from DCC packages in D3D. D3D specifies that V = 0 is the top of the texture and OpenGL specifies that V = 0 is the bottom, so you usually have to flip the V coordinates (and the bitangents!) when generating meshes for D3D.
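The V flip described above is what the posted loader's `-tex[1]` term does. It can be isolated into a tiny helper; here is a minimal sketch with a plain struct standing in for D3DXVECTOR2 (the struct and function names are illustrative, not from the original code):

```cpp
#include <cassert>

// Plain stand-in for D3DXVECTOR2 (illustrative only).
struct Vec2 { float u, v; };

// Convert a UV pair authored with OpenGL conventions (V = 0 at the
// bottom of the texture) to Direct3D conventions (V = 0 at the top).
Vec2 FlipVForD3D(Vec2 uv)
{
    return Vec2{ uv.u, 1.0f - uv.v };
}
```

Note that negating V (as the posted code does) instead of using `1 - V` gives the same visual result only when the sampler uses a WRAP addressing mode.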


Thanks for the reply. Yes, I understand that... however, my problem seems to be with the U coordinates. The planet is mirrored (Asia is in the western hemisphere and the Americas are in the east). The mesh was just a simple sphere I exported from 3ds Max: I applied a mesh modifier, applied a texture, and unwrapped it. The ship models I downloaded from TurboSquid; I tried two different artists' models and both loaded correctly. The same thing happened when I tried a cube.

Edited by JacobM


Is your coordinate system handedness correct? 3ds Max is right-handed, while DX is left-handed.

If you export it as RHS, then in DX all of your triangles will face inward, causing back-face culling to cull the wrong triangles. That means you'd be seeing the inside of the sphere, creating the appearance of mirrored U coordinates.

 

You can test this theory by rotating the sphere: if I'm correct, it will appear to rotate in the wrong direction.
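If the handedness mismatch is the cause, one loader-side fix is to reverse the triangle winding rather than change the cull mode. A sketch, assuming a fully triangulated index list (the function name is hypothetical):

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// Reverse the winding of every triangle in an index list, so that
// faces exported from a right-handed package (e.g. 3ds Max) face
// outward under D3D's left-handed, clockwise-front convention.
// Assumes the mesh is fully triangulated (index count divisible by 3).
void ReverseWinding(std::vector<unsigned int>& indices)
{
    for (std::size_t i = 0; i + 2 < indices.size(); i += 3)
        std::swap(indices[i + 1], indices[i + 2]);
}
```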


Check your export settings. The problem is probably in the UVW mapping. Use the UVW Map modifier (if you're using 3ds Max) and specify the mapping type explicitly. If you just drag and drop the texture, the default generated mapping will be OpenGL-style (not sure, I always use the UVW modifier). You can also tell 3ds Max to generate flipped UVs.



Whoops, I mis-read your post. Sorry about that.


Satanir:

This is what I originally thought as well. However, culling was turned on and it was culling the correct triangles. I inverted the mesh in 3ds Max and it did fix the texture, but then I was seeing the inside of the sphere. Also, if that were the case, I'd assume the spaceship's textures would be inverted too, but they come out just fine. I'm sure the sphere is being drawn correctly, so I don't understand why I'm getting such mixed results.

imoogiBG:
I have used UV Unwrap and UVW XForm; I even tried the UVW Map modifier as you suggested. You're right that I could just flip the texture in 3ds Max so it comes out correctly when loaded, but I'd really like to get to the source of this; I'm not sure what other problems I may run into down the line. I just find it weird that complex objects come out correctly while standard primitives like cubes and spheres don't. I think that plays some kind of role in this.


Satanir, I guess you were right; I feel like a dunce now.

After a little reading, I found all I had to do was swap the position Y and position Z of the verts (and likewise for the normals), like so:

FbxVector4 position = fbxMesh->GetControlPointAt(controlPointIndex);
vS.position = D3DXVECTOR3(position[0], position[2], position[1]);

FbxVector2 tex;
bool isMapped;
fbxMesh->GetPolygonVertexUV(p, v, lUVSetName, tex, isMapped);
vS.uv = D3DXVECTOR2(tex[0], -tex[1]);

FbxVector4 norm;
fbxMesh->GetPolygonVertexNormal(p, v, norm);
vS.normal = D3DXVECTOR3(-norm[0], -norm[2], -norm[1]);

Thanks for the help. Hopefully this will also help someone else with the same issue.
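The axis swap in the fix above can be captured in a pair of helpers. A condensed sketch with a plain struct standing in for the D3DX/FBX vector types (names are illustrative); the negation on the normals mirrors the posted code and depends on how the exporter oriented the mesh:

```cpp
#include <cassert>

// Plain stand-in for D3DXVECTOR3 / FbxVector4 (illustrative only).
struct Vec3 { float x, y, z; };

// Swap Y and Z to take a right-handed (3ds Max) position into
// D3D's left-handed, Y-up space.
Vec3 ToLeftHanded(Vec3 p)
{
    return Vec3{ p.x, p.z, p.y };
}

// Normals get the same axis swap; the negation matches the thread's
// final code and may not be needed for every exporter.
Vec3 NormalToLeftHanded(Vec3 n)
{
    return Vec3{ -n.x, -n.z, -n.y };
}
```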


