Anonymous Noob

  1. Anonymous Noob

    Doubt with textures

    I just installed Blender 2.7 to check it out and unfortunately it still doesn't export material textures correctly to FBX -- or indeed, at all. Perhaps someone in the Blender community has made an alternate FBX exporter that works correctly.
  2. Anonymous Noob

    Doubt with textures

     I've just done this myself for the first time, so I'll share my experience.

     If materials were assigned to the mesh in the 3D program, that information should carry over into the FBX representation. When you're traversing the FBX file you'll find a mapping of geometry to material ID. This mapping can be done in different ways, so examine the FBX SDK sample "ImportScene" to figure out what you need to do to extract that data under different scenarios. Perhaps the best thing to do is run that sample on one of your meshes and pipe its output to a text file. From there it's easy enough to trace the various bits of output back to the right parts of the sample source, and then duplicate that in your own loader/converter.

     In any case, the point is that the materials set in the 3D program should be mapped to the mesh data by the exporter. For example, if the materials are mapped per polygon, then each polygon in the FBX will have a material ID associated with it. Your submeshes, then, should consist of a subset of the polygons per material. If polys 1,2,3,4,5 are material 0 and 6,7,8,9 are material 1, then those are your two submeshes.

     I create a single index and vertex buffer for the overall model, and the submeshes contain a count of indices, an offset into the index buffer to start from, and a material ID. Then my draw code goes something like this:

         SetVertexBuffer( model->vbuffer );
         SetIndexBuffer( model->ibuffer );
         foreach submesh in model:
             SendMaterialVariablesToConstantBuffer( model->materials[submesh->materialID]->psvariables );
             SendTexturesToPixelShader( model->materials[submesh->materialID]->textures );
             DrawIndexed( submesh->index_count, submesh->index_offset, 0 );

     Note that this code is representative rather than literal, and it's not the ideal way to set things up for batch-rendering a bunch of the same thing. But it gets the point across if, for example, you just want to display the model in a viewer.

     As for extracting the materials from the FBX, you'll find that in the sample too. Be aware, though, that the Blender FBX exporter doesn't work properly when it comes to textures: AFAIK the texture it exports is the one set in the UV mapper, not the texture(s) set in the material's properties. That may have changed since I last looked at Blender, but it's something to watch for.

     Either way, you'll have to create a representation of the material information and store that as part of your model data. The exact form that data takes is entirely up to you.
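The submesh scheme described above -- runs of consecutive polygons sharing a material, each recorded as an offset and count into one shared index buffer -- can be sketched in a few lines of C++. Everything here (the `Submesh` struct, `BuildSubmeshes`, and the assumption of triangulated polygons with one material ID each) is illustrative, not part of the FBX SDK:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical submesh record: a contiguous run of indices sharing one material.
struct Submesh {
    uint32_t indexOffset;  // first index in the shared index buffer
    uint32_t indexCount;   // number of indices in this run
    uint32_t materialID;   // index into the model's material array
};

// Given one material ID per triangle (e.g. as extracted from an FBX
// per-polygon material mapping), emit submeshes as runs of consecutive
// triangles that share a material. Assumes the index buffer lists
// triangles in the same order as polyMaterials (3 indices per triangle).
std::vector<Submesh> BuildSubmeshes(const std::vector<uint32_t>& polyMaterials)
{
    std::vector<Submesh> out;
    for (uint32_t poly = 0; poly < polyMaterials.size(); ++poly) {
        if (!out.empty() && out.back().materialID == polyMaterials[poly]) {
            out.back().indexCount += 3;  // extend the current run
        } else {
            // start a new run at this triangle's first index
            out.push_back({ poly * 3, 3, polyMaterials[poly] });
        }
    }
    return out;
}
```

With the post's example (five triangles of material 0 followed by four of material 1) this yields two submeshes, exactly what the draw loop above iterates over. If the exporter interleaves materials, you would sort or bucket the triangles by material ID first so each material still forms one contiguous run.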
  3. Solved. After all that, it turned out to be the order of operations in my handler for WM_SIZE. There's a very good reason it was acting like there was no depth buffer: the call to OMSetRenderTargets came BEFORE I recreated the depth view. Oh, deary me. Obvious, really, once I saw it. And of course OMSetRenderTargets has to be a void return; an HRESULT_STUPID_USER return code might have saved me the time.

     Mona, you were spot on. Thanks for pointing me in the right direction.

     ** edit

     In retrospect, you know what else should have been a dead giveaway? The line stream coming out of my geometry shader, back when I thought it might have been poly winding. The lines weren't depth-clipped at all. Bloody hell, I must have taken a stupid pill this morning.
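For anyone hitting the same wall: the fix above amounts to getting the WM_SIZE steps in the right order, with OMSetRenderTargets called only after the new depth view exists. A minimal C++ sketch of that ordering follows; the member names (m_device, m_context, m_swapChain, m_rtv, m_dsv) are assumptions for illustration, error handling is elided, and this is not the poster's actual code:

```cpp
void OnResize( UINT width, UINT height )
{
    // 1. Unbind and release the old views before touching the swap chain.
    m_context->OMSetRenderTargets( 0, nullptr, nullptr );
    m_rtv->Release();  m_rtv = nullptr;
    m_dsv->Release();  m_dsv = nullptr;

    // 2. Resize the swap chain's back buffers.
    m_swapChain->ResizeBuffers( 0, width, height, DXGI_FORMAT_UNKNOWN, 0 );

    // 3. Recreate the render target view from the resized back buffer.
    ID3D11Texture2D* backBuffer = nullptr;
    m_swapChain->GetBuffer( 0, __uuidof( ID3D11Texture2D ), (void**)&backBuffer );
    m_device->CreateRenderTargetView( backBuffer, nullptr, &m_rtv );
    backBuffer->Release();

    // 4. Recreate the depth texture and its view at the new size
    //    (CreateTexture2D + CreateDepthStencilView, elided here).

    // 5. Only now bind both views together -- binding before step 4
    //    leaves you rendering with no depth buffer at all.
    m_context->OMSetRenderTargets( 1, &m_rtv, m_dsv );
}
```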
  4. That would be nice, but if that were the case, wouldn't everything be drawing incorrectly? In an effort to nail down the problem I've been running a very simplified shader pair on this. Vertex shader:

         cbuffer mat_camera : register( b0 )
         {
             matrix gView;
             matrix gProjection;
             float3 gCamPosition;
             float3 gCamTarget;
         };

         cbuffer mat_object : register( b1 )
         {
             matrix gWorld;
         };

         struct VS_IN
         {
             float3 position : POSITION;
             float3 normal   : NORMAL;
             float3 tangent  : TANGENT;
             float3 binormal : BINORMAL;
             float2 uv       : TEXCOORD0;
         };

         struct VS_OUT
         {
             float4 position : SV_POSITION;
             float3 normal   : NORMAL;
             float3 tangent  : TANGENT;
             float3 binormal : BINORMAL;
             float2 uv       : TEXCOORD0;
         };

         VS_OUT main( VS_IN IN )
         {
             VS_OUT OUT;
             OUT.position = float4( IN.position.xyz, 1.0 );
             OUT.position = mul( OUT.position, gWorld );
             OUT.position = mul( OUT.position, gView );
             OUT.position = mul( OUT.position, gProjection );
             OUT.normal   = normalize( mul( IN.normal, ( float3x3 )gWorld ) );
             OUT.tangent  = IN.tangent;
             OUT.binormal = IN.binormal;
             OUT.uv       = IN.uv;
             return OUT;
         }

     Pixel shader:

         Texture2D    gTex;
         SamplerState gSampler;

         struct PS_IN
         {
             float4 position : SV_POSITION;
             float3 normal   : NORMAL;
             float3 tangent  : TANGENT;
             float3 binormal : BINORMAL;
             float2 uv       : TEXCOORD0;
         };

         float4 main( PS_IN IN ) : SV_TARGET
         {
             float4 pixel = gTex.Sample( gSampler, IN.uv );
             return pixel;
         }
  5. Apologies if this is in the wrong forum, as perhaps it's a general problem, but I am specifically using the DirectX 11 runtime. I'm sure my problem is some basic matter that I just don't understand yet, and I'm hopeful someone can help. I'm new to graphics, and sometimes I don't know what it is that I don't yet know.

     Here's my situation. I have a dirt-simple DX11 renderer into which I'm loading mesh data that was parsed in from FBX (created in Maya and Blender). As long as the mesh is a simple primitive like a cube or a sphere, everything works wonderfully: the mesh is recreated faithfully on my screen, and in fact I was getting to work on pixel shaders when I noticed something odd. If the mesh is less simple -- say, if I grabbed a face in Maya and moved or extruded it a bit -- then that extruded geometry does not clip properly when I render the mesh.

     Exhibit A: As you can see, the modified geometry doesn't cull correctly. I didn't know what to make of that, so I made a couple of geometry shaders to help me debug; that's what you see in the video. The lines are the calculated face normals, and the vertex colors are coded R, G, B in the order received by the geometry shader. I had originally thought that the offending polys might be getting wound CW rather than CCW as a result of operations in Maya, but they all appear to be CCW. The calculated normals also appear to be pointing the right way. I've looked at my depth/stencil states and they appear fine, but I expected that, since I can draw spheres and cubes and such correctly all day. My near clip plane isn't set to zero; I thought of that. Frankly, I don't know what else to check. This has been driving me batty all day. Please send help.

     ** edit

     Just wanted to add that I do have the debug layers going and I'm checking my HRESULTs, and nothing is amiss as far as that goes. And I'm drawing with an opaque blend state.
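For reference, the depth/stencil state being double-checked in this post looks roughly like the following in D3D11. This is a sketch with assumed variable names (device, context), not the poster's code; it shows a configuration that should give correct depth clipping when a non-null depth-stencil view is also bound:

```cpp
// Standard depth testing: enabled, writes on, nearer fragments win.
D3D11_DEPTH_STENCIL_DESC desc = {};
desc.DepthEnable    = TRUE;
desc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ALL;
desc.DepthFunc      = D3D11_COMPARISON_LESS;

ID3D11DepthStencilState* state = nullptr;
device->CreateDepthStencilState( &desc, &state );
context->OMSetDepthStencilState( state, 0 );

// A correct state is not enough on its own: the depth-stencil view
// must actually be bound alongside the render target, i.e. the dsv
// argument here must be non-null at draw time.
// context->OMSetRenderTargets( 1, &rtv, dsv );
```

As it turns out in the solved follow-up, the state itself was fine; the depth view simply wasn't bound yet when OMSetRenderTargets ran.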
