[DX9/SlimDX] Hardware instancing problems

Omnicrash

Alright, I've been at this for DAYS now, and I cannot for the life of me figure out what is wrong with my code. This is a port from XNA, where it worked perfectly.
I'm trying to do hardware instancing, but nothing ever shows up. I've been comparing my code against working examples over and over. Regular drawing works fine, but if I try to render even a single instance using the exact same world matrix, nothing shows up.

This is based on a port of an XNA sample.

Drawing code:
/// <summary>
/// Draws the mesh multiple times using the instancing transforms and the specified camera matrices.
/// </summary>
public void Draw(ref Matrix[] world, int instanceCount, Matrix view, Matrix projection)
{
    Device device = Globals.GraphicsDevice;

    /*
    if (!instanced)
        throw new InvalidOperationException("The model '" + filename + "' has not been prepared for instancing.");
    */
    //TODO: test and fix
    /*
    if (world.Length > maxInstances)
        throw new DeviceNotSupportedException("Too many instances to draw in mesh.");
    */

    // Get effect & set global (model) parameters
    Effects.MeshEffect effect = Globals.Render.MeshEffect;
    effect.SetTechnique(Effects.MeshEffect.Techniques.HardwareInstancing);
    effect.View = view;
    effect.Projection = projection;

    if (Material == null)
    {
        effect.TextureEnabled = false;
    }
    else
    {
        effect.TextureEnabled = true;
        effect.Texture = Material.DiffuseMap;
    }

    // Make sure our instance data vertex buffer is big enough.
    int transformDataSize = 64 * instanceCount; // Using the size of a Matrix

    if (instanceTransformStreamSize < transformDataSize)
    {
        if (instanceTransformStream != null)
            instanceTransformStream.Dispose();

        instanceTransformStream = new VertexBuffer(device, transformDataSize,
            Usage.WriteOnly | Usage.Dynamic, VertexFormat.Position,
            Pool.Default);
        instanceTransformStreamSize = transformDataSize;
    }

    // Upload transform matrices to the instance data vertex buffer.
    instanceTransformStream.Lock(0, 0, LockFlags.Discard).WriteRange(world, 0, instanceCount);
    instanceTransformStream.Unlock();
    /*
    instanceTransformStream.SetData(
        world, 0,
        instanceCount,
        SetDataOptions.Discard);
    */
    device.VertexDeclaration = vertexDeclarationInstancing;

    // Mesh
    device.SetStreamSource(0, VertexBuffer, 0, VertexStride);
    device.SetStreamSourceFrequency(0, instanceCount, StreamSource.IndexedData);

    // Transformation matrices
    device.SetStreamSource(1, instanceTransformStream, 0, 64); // Size of a Matrix
    device.SetStreamSourceFrequency(1, 1, StreamSource.InstanceData);

    device.Indices = IndexBuffer;

    effect.Begin();

    // Draw all the instances in a single batch.
    device.DrawIndexedPrimitives(
        PrimitiveType,
        0, 0, VertexCount,
        0, PrimitiveCount);

    effect.End();

    // Reset the instancing streams.
    device.SetStreamSource(0, null, 0, 0);
    device.SetStreamSource(1, null, 0, 0);
}
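
For what it's worth, the single-instance test is invoked roughly like this (paraphrased; 'mesh', 'worldMatrix' and 'camera' are just placeholders for my actual objects):

// Single-instance test call, using the exact same world matrix as regular drawing.
Matrix[] transforms = new Matrix[] { worldMatrix };
mesh.Draw(ref transforms, 1, camera.View, camera.Projection);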



Shader code:
// Used when not using instancing
float4x4 World;

// Camera settings.
float4x4 View;
float4x4 Projection;

// This sample uses a simple Lambert lighting model.
float3 LightDirection = normalize(float3(1.0, 1.0, 1.0));
float3 DiffuseLight = 1.25;
float3 AmbientLight = 0.25;

bool TextureEnabled = false;
texture Texture;
sampler Sampler = sampler_state
{
    Texture = (Texture);

    MinFilter = None;
    MagFilter = None;
    MipFilter = None;

    AddressU = Wrap;
    AddressV = Wrap;
};


struct VertexShaderInput
{
    float4 Position : POSITION0;
    float3 Normal : NORMAL0;
    float2 TextureCoordinate : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
    float4 Color : COLOR0;
    float2 TextureCoordinate : TEXCOORD0;
};


VertexShaderOutput VertexShaderCommon(VertexShaderInput input,
                                      float4x4 instanceTransform)
{
    VertexShaderOutput output;

    // Apply the world and camera matrices to compute the output position.
    float4 worldPosition = mul(input.Position, instanceTransform);
    float4 viewPosition = mul(worldPosition, View);
    output.Position = mul(viewPosition, Projection);

    // Compute lighting, using a simple Lambert model.
    float3 worldNormal = mul(input.Normal, instanceTransform);

    float diffuseAmount = max(-dot(worldNormal, LightDirection), 0);

    float3 lightingResult = saturate(diffuseAmount * DiffuseLight + AmbientLight);

    output.Color = float4(lightingResult, 1);

    // Copy across the input texture coordinate.
    output.TextureCoordinate = input.TextureCoordinate;

    return output;
}

VertexShaderOutput NoInstancingVertexShader(VertexShaderInput input)
{
    return VertexShaderCommon(input, World);
}

VertexShaderOutput HardwareInstancingVertexShader(VertexShaderInput input,
                                                  float4x4 instanceTransform : TEXCOORD1)
{
    return VertexShaderCommon(input, instanceTransform);
}


float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
    if (TextureEnabled)
        return tex2D(Sampler, input.TextureCoordinate) * input.Color;
    else
        return input.Color;
}


technique NoInstancing
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 NoInstancingVertexShader();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}

technique HardwareInstancing
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 HardwareInstancingVertexShader();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}



Vertex declaration (extension) code:
protected virtual VertexDeclaration CreateInstancingDeclaration(VertexDeclaration vertexDeclaration)
{
    // When using hardware instancing, the instance transform matrix is
    // specified using a second vertex stream that provides 4x4 matrices
    // in texture coordinate channels 1 to 4. We must modify our vertex
    // declaration to include these channels.
    VertexElement[] extraElements = new VertexElement[4];

    short offset = 0;
    byte usageIndex = 1;
    short stream = 1;

    for (int i = 0; i < 4; i++)
    {
        extraElements[i] = new VertexElement(stream, offset,
            DeclarationType.Float4,
            DeclarationMethod.Default,
            DeclarationUsage.TextureCoordinate,
            usageIndex);

        offset += 16; // Vector4 size
        usageIndex++;
    }

    // Extend vertex declaration
    VertexElement[] vertexElements = vertexDeclaration.Elements;

    // Append the new elements to the original format.
    int length = vertexElements.Length + extraElements.Length;
    VertexElement[] vertexElementsInstanced = new VertexElement[length];
    vertexElements.CopyTo(vertexElementsInstanced, 0);
    extraElements.CopyTo(vertexElementsInstanced, vertexElements.Length);

    // Create a new vertex declaration.
    return new VertexDeclaration(Globals.GraphicsDevice, vertexElementsInstanced);
}



BeginScene and EndScene are called elsewhere in the project; effect.Begin() and effect.End() call into a custom class that begins/ends the current technique and its only pass.
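
For reference, that wrapper boils down to something like this (a rough sketch, not the exact class; it assumes the technique has already been selected via SetTechnique):

// Rough sketch of the custom effect wrapper described above.
// 'effect' is the underlying SlimDX Effect; names are paraphrased.
public void Begin()
{
    effect.Begin();      // returns the pass count, ignored here since there is only one pass
    effect.BeginPass(0); // start the technique's only pass
}

public void End()
{
    effect.EndPass();
    effect.End();
}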

Also attached are 2 screenshots of PIX debug captures.

One of the original working XNA project:
[attachment=6370:xna_working.png]


...and one of the current, non-working SlimDX version:
[attachment=6371:slimdx_not_working.png]


Any help is very much appreciated.

From the PIX output, it looks like whatever's going wrong is happening during the vertex shader. The instance transform parameter doesn't have a semantic applied; perhaps that's the issue?

I assume you are talking about the common vertex shader; the semantic is applied in this line, though:
VertexShaderOutput HardwareInstancingVertexShader(VertexShaderInput input, float4x4 instanceTransform : TEXCOORD1)

Unless I'm missing something?

Oh, I see. I saw your first vertex shader function and assumed that was the actual shader. In that case, nothing seems off to me. I'd try stepping through the inputs and outputs of the vertex shader in PIX and see where things go wrong.

I can only get the assembly view to work for debugging a shader. I'll keep searching to find out whether HLSL source-level debugging is possible for C# applications.
But the strangest thing is that in the PreVS window the extra data isn't shown, while in the working XNA implementation it is.

EDIT: As far as I can tell, it seems that none of the extra (instanced) data is taken into account when processing. Any ideas on why this might be?

I think your CreateInstancingDeclaration function is the issue. vertexDeclaration.Elements will also return the end marker (VertexElement.VertexDeclarationEnd, which corresponds to D3DDECL_END), which you then copy into your new declaration as well. So you end up with this end marker right in the middle of your declaration, and the later elements are ignored.
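
Something along these lines should do it (untested, variable names taken from your snippet):

// Keep every original element except the trailing end marker, append the four
// instancing elements, then make sure the end marker is the very last entry again.
VertexElement[] originalElements = vertexDeclaration.Elements;
int originalCount = originalElements.Length - 1; // drop VertexElement.VertexDeclarationEnd

VertexElement[] instancedElements = new VertexElement[originalCount + extraElements.Length + 1];
Array.Copy(originalElements, instancedElements, originalCount);
extraElements.CopyTo(instancedElements, originalCount);
instancedElements[instancedElements.Length - 1] = VertexElement.VertexDeclarationEnd;

return new VertexDeclaration(Globals.GraphicsDevice, instancedElements);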


I think your CreateInstancingDeclaration function is the issue. vertexDeclaration.Elements will also return the end marker (VertexElement.VertexDeclarationEnd, which corresponds to D3DDECL_END), which you then copy into your new declaration as well. So you end up with this end marker right in the middle of your declaration, and the later elements are ignored.


Yes! That fixed it!

Thank you so much; I probably never would have figured this one out on my own.
I didn't even know there was a VertexDeclarationEnd element that was automatically added.
