

Topics I've Started

Animation debugging nightmare...

09 February 2014 - 07:48 PM



I'm trying to do skinning animation with Assimp and D3D11 in VS2012. I've managed to get things looking OK, but I have a problem, as seen in this image:




No, that is not his long luscious hair. It appears some triangles are completely wrong. I have checked the bones and the vertex information that is retrieved from the file and everything seems right. I also tried recreating the vertex shader on the CPU to see if I could replicate the problem, but there was no problem there either. So I tried debugging a pixel (on his "hair") and I got these values:




This shows that the bone weights are completely wrong (because they should add up to 1.0). So this must mean that the vertex I sent to the GPU is wrong, right? Going back to the CPU side, these are the exact values I sent (for vertex 301):




The highlighted stuff shows that the bone weights do indeed add up to 1.0. So when I check on the GPU side, I see this:




The red part shows that the bone weights are there on the GPU. The interesting thing is that the highlighted green values are what the bone weights actually receive. It's as if the vertex buffer has gone out of sync or something (but I'm sure this is a user error). I have made sure that I'm sending the correct vertex buffer size and vertex struct size.


Sorry, I know this is a complicated question, but I'm not sure what else to try. I'm not sure what code to give you, so just ask and ye shall receive. I'm also not sure if this is a D3D problem or an animation problem or whatnot. So does anyone have any idea what the problem could be or things that I could try?
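Symptoms like this (weights correct on the CPU, garbage on the GPU) often come down to a mismatch between the vertex struct and the input layout, so a cheap CPU-side sanity check before uploading can narrow things down. This is a minimal sketch with a hypothetical `SkinnedVertex` layout (names are made up; adjust to the real struct) — if weights validate here but still look wrong in the shader, compare `sizeof` the vertex struct against the input layout's `AlignedByteOffset` values and the stride passed to `IASetVertexBuffers`:

```cpp
#include <cmath>
#include <vector>

// Hypothetical vertex layout -- adjust to match the actual struct.
struct SkinnedVertex {
    float position[3];
    unsigned char boneIndices[4];
    float boneWeights[4];
};

// Returns true if every vertex's bone weights sum to ~1.0.
bool ValidateBoneWeights(const std::vector<SkinnedVertex>& vertices,
                         float tolerance = 1e-3f)
{
    for (const SkinnedVertex& v : vertices) {
        float sum = v.boneWeights[0] + v.boneWeights[1]
                  + v.boneWeights[2] + v.boneWeights[3];
        if (std::fabs(sum - 1.0f) > tolerance)
            return false;
    }
    return true;
}
```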



[Assimp] Getting bind pose bone positions

04 February 2014 - 02:19 AM

I'm trying to get my head around animation and Assimp. For now I just want the bone positions in bind pose (no animation or anything like that). According to various sources, I should be able to use aiBone::mOffsetMatrix. The way I am doing it is (pseudo-code):

MthVector4 v(0.0f, 0.0f, 0.0f, 1.0f);
MthMatrix invOffset = mOffsetMatrix;
invOffset.Transform(v, v);
pos = (MthVector3&)v / v.w;

pos should give the position in model space, right? I've spent a while fiddling around with this but nothing looks right at all. So what is the correct way to do this?
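For reference, one common reading of the documentation: aiBone::mOffsetMatrix transforms mesh (model) space into bone space, so transforming the origin by the matrix itself gives a bone-space point — the model-space bind position is the translation part of the matrix's *inverse*. A minimal sketch, assuming row-major matrices with translation in the fourth column (Assimp's convention) and a rigid rotation+translation transform, so the inverse is cheap (`Mat4`, `InverseRigid`, etc. are made-up names):

```cpp
#include <cmath>

// Minimal row-major 4x4, translation in the fourth column.
struct Mat4 { float m[4][4]; };

// Inverts a rigid transform: R^-1 = R^T, t^-1 = -R^T * t.
Mat4 InverseRigid(const Mat4& a)
{
    Mat4 r{};
    for (int i = 0; i < 3; ++i)
        for (int j = 0; j < 3; ++j)
            r.m[i][j] = a.m[j][i];                 // transpose the rotation
    for (int i = 0; i < 3; ++i)
        r.m[i][3] = -(r.m[i][0] * a.m[0][3] +
                      r.m[i][1] * a.m[1][3] +
                      r.m[i][2] * a.m[2][3]);      // -R^T * t
    r.m[3][3] = 1.0f;
    return r;
}

// Bind-pose bone position in model space = translation column of the
// inverted offset matrix (no divide by w needed for a rigid transform).
void BindPosePosition(const Mat4& offset, float out[3])
{
    Mat4 inv = InverseRigid(offset);
    out[0] = inv.m[0][3];
    out[1] = inv.m[1][3];
    out[2] = inv.m[2][3];
}
```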



Programs going way too slow since moving to a new computer...

16 December 2013 - 08:25 PM

I recently moved over to a new laptop and started a new project in Visual Studio 2012, but everything runs slower than it should. I tried profiling with the VS profiler and it says that 99.38% of the time is spent in d3d11ref.dll. I have reinstalled the DirectX SDK. I tried an older project that ran fine on another computer, and it was also slow, so I don't think it has anything to do with my code. The laptop is quite good (i7 processor). The program I'm currently testing is just a basic game loop that opens a blank window and does nothing with it; it does not interact with Direct3D (D3D is included in the project, but no D3D code is run).


So is it the laptop, or VS, or have I set up D3D wrong, or something else?


(I posted this in General Programming because I don't think it is a D3D-specific problem)
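One note on the profile itself: d3d11ref.dll is the D3D11 reference rasterizer, a software implementation intended for correctness testing, not speed. Time spent there usually means the device was created with D3D_DRIVER_TYPE_REFERENCE, or fell back to it because hardware device creation failed. A hedged sketch of requesting the hardware driver explicitly (standard D3D11 device creation, not runnable outside Windows):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Ask explicitly for the hardware driver; if this call fails, the GPU
// driver (not the app) is the thing to investigate, rather than silently
// running on the reference rasterizer.
ID3D11Device* device = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D_FEATURE_LEVEL level;
HRESULT hr = D3D11CreateDevice(
    nullptr,                   // default adapter
    D3D_DRIVER_TYPE_HARDWARE,  // not D3D_DRIVER_TYPE_REFERENCE
    nullptr, 0,
    nullptr, 0,                // default feature levels
    D3D11_SDK_VERSION, &device, &level, &context);
```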



Depth buffer isn't linear?

23 November 2013 - 10:40 PM

I've got a large environment that looks a bit like this:




I'm trying to see the depth buffer by using a pixel shader like this:

struct VS_OUTPUT
{
    float4 position : SV_Position;
    float2 texcoord : TEXCOORD;
    float3 normal : NORMAL;
    float4 worldposition : POSITION;
};

float4 main( in VS_OUTPUT input ) : SV_TARGET
{
    float shade = input.position.z / input.position.w;
    return float4(shade, shade, shade, 1.0f);
}
The result I get is this:





The depth appears to be very much non-linear. I was expecting something more along the lines of this:




I assume I'm calculating the depth wrong but every source I find says to do it this way.


Btw, I'm using D3D11 with VS2012.


On a related note: Is there a way to see an image of the depth buffer within VS2012's graphics debugger (diagnostics)?
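For the record, that hyperbolic falloff is expected: with a standard perspective projection, post-projection depth packs most of the [0, 1] range near the near plane, which is why the whole scene reads as near-white. (Note also that by the pixel-shader stage SV_Position.z is already the post-divide depth in [0, 1].) A small sketch of mapping stored depth back to linear view-space depth, assuming a standard D3D-style projection with near/far plane distances n and f (remap to [0, 1] for display by dividing by f):

```cpp
#include <cmath>

// With a D3D perspective projection, stored depth d relates to view-space
// z as  d = f/(f-n) - f*n/((f-n) * z),  which is hyperbolic. Inverting:
float LinearizeDepth(float d, float n, float f)
{
    return (n * f) / (f - d * (f - n));  // view-space z, in [n, f]
}
```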




Instancing woes...

08 August 2013 - 11:42 PM

I'm attempting to do instancing with D3D11, but I'm getting this error:
D3D11 WARNING: ID3D11DeviceContext::DrawInstanced: Vertex Buffer at the input vertex slot 0 is not big enough for what the Draw*() call expects to traverse. This is OK, as reading off the end of the Buffer is defined to return 0. However the developer probably did not intend to make use of this behavior.  [ EXECUTION WARNING #356: DEVICE_DRAW_VERTEX_BUFFER_TOO_SMALL]
Input layout:

Layout[0].SemanticName = "POSITION";
Layout[0].SemanticIndex = 0;
Layout[0].Format = DXGI_FORMAT_R32G32B32_FLOAT;
Layout[0].InputSlot = 0;
Layout[0].AlignedByteOffset = 0;
Layout[0].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA;
Layout[0].InstanceDataStepRate = 0;

Layout[1].SemanticName = "TEXCOORD";
Layout[1].SemanticIndex = 0;
Layout[1].Format = DXGI_FORMAT_R32G32_FLOAT;
Layout[1].InputSlot = 0;
Layout[1].AlignedByteOffset = 12;
Layout[1].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA;
Layout[1].InstanceDataStepRate = 0;

Layout[2].SemanticName = "INSTANCEPOS";
Layout[2].SemanticIndex = 0;
Layout[2].Format = DXGI_FORMAT_R32G32B32_FLOAT;
Layout[2].InputSlot = 1;
Layout[2].AlignedByteOffset = 0;
Layout[2].InputSlotClass = D3D11_INPUT_PER_INSTANCE_DATA;
Layout[2].InstanceDataStepRate = 1;

Setting up the instance buffer:

// Set up the description of the instance buffer.
D3D11_BUFFER_DESC InstanceBufferDesc;
ZeroMemory(&InstanceBufferDesc, sizeof(InstanceBufferDesc));
InstanceBufferDesc.Usage = D3D11_USAGE_DEFAULT;
InstanceBufferDesc.ByteWidth = 60; // 5 instances
InstanceBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
InstanceBufferDesc.CPUAccessFlags = 0;
InstanceBufferDesc.MiscFlags = 0;
InstanceBufferDesc.StructureByteStride = 0;

// Create the instance buffer.
ID3D11Buffer* instancebuffer = nullptr;
HRESULT result = D3DDevice->CreateBuffer(&InstanceBufferDesc, nullptr, &instancebuffer);

Setting the mesh/buffers:

//instancesize = 12;
//instancedatasize = 60;
//D3DXVECTOR3 instancedata[5]; // lots of goodness in here

// set the vertex and index buffers
unsigned int strides[2] = {D3DMesh->VertexSize, instancesize};
unsigned int offsets[2] = {0, 0};
ID3D11Buffer* bufferPointers[2] = {D3DMesh->VertexBuffer, InstanceBuffer};

D3DContext->IASetVertexBuffers( 0, 2, bufferPointers, strides, offsets );
D3DContext->IASetIndexBuffer( D3DMesh->IndexBuffer, DXGI_FORMAT_R16_UINT, 0 );


//instancecount = 5
D3DContext->DrawInstanced(D3DMesh->IndexCount, instancecount, 0, 0);

Vertex shader input:

struct VS_INPUT
{
    float3 position : POSITION;
    float2 texcoord : TEXCOORD;
    float3 objposition : INSTANCEPOS;
};
When I look at the instance buffer in the graphics debugger, it says that all of the values are 0 (I assume it's not updating properly). My mesh is just a square made up of 2 triangles, but the graphics debugger says that only a single triangle with a line poking out of it is being drawn (I can show a picture if it helps).

Any ideas as to what's causing this?
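Two details in the snippets above would, under the stated assumptions, produce exactly these symptoms (all-zero instance data, and a mangled triangle plus warning #356). A hedged sketch of the likely fixes, using the post's own names:

```cpp
// 1) CreateBuffer was passed nullptr for initial data, so a DEFAULT-usage
//    buffer with no CPU access holds zeros until something writes to it.
//    Either supply the data at creation...
D3D11_SUBRESOURCE_DATA InitData = {};
InitData.pSysMem = instancedata;  // the 5 D3DXVECTOR3 instance positions
HRESULT result = D3DDevice->CreateBuffer(&InstanceBufferDesc, &InitData,
                                         &instancebuffer);
//    ...or write it afterwards:
// D3DContext->UpdateSubresource(instancebuffer, 0, nullptr,
//                               instancedata, 0, 0);

// 2) DrawInstanced ignores the bound index buffer and treats its first
//    argument as a *vertex* count, so passing IndexCount over-reads
//    slot 0 (warning #356) and draws the wrong topology. For indexed
//    geometry, the indexed variant is the one to call:
D3DContext->DrawIndexedInstanced(D3DMesh->IndexCount, instancecount,
                                 0, 0, 0);
```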