After calling .Apply() on the pass, the vertex and pixel shaders are still null.
If I set them directly from the pass information, they are still null.
I've gone through about a dozen different Google searches and found nothing that helps.
{
if (defaultEffect.GetTechniqueByName("Shader").GetPassByIndex(0).Apply(context).IsFailure)
throw new Exception("Failure on pass apply.");
// Sets them directly from the pass information.
// Notably, these pixel and vertex shader variables are also null.
// There is only one technique, so it shouldn't matter how I access it.
// context.PixelShader.Set(defaultEffect.GetTechniqueByIndex(0).GetPassByIndex(0).PixelShaderDescription.Variable.GetPixelShader(0));
// context.VertexShader.Set(defaultEffect.GetTechniqueByIndex(0).GetPassByIndex(0).VertexShaderDescription.Variable.GetVertexShader(0));
if (object.Equals(context.PixelShader.Get(), null))
throw new NullReferenceException(); // Errors out here.
if (object.Equals(context.VertexShader.Get(), null))
throw new NullReferenceException(); // Also errors out here, if I put it first.
// defaultEffect.GetTechniqueByName("Shader").GetPassByIndex(0).Description.
context.InputAssembler.InputLayout = inputLayout;
context.InputAssembler.PrimitiveTopology = topology;
context.InputAssembler.SetIndexBuffer(indexBuffer, SlimDX.DXGI.Format.R16_SInt, 0);
context.InputAssembler.SetVertexBuffers(0, new SlimDX.Direct3D11.VertexBufferBinding(vertexBuffer, vertexStride, 0));
// Complains about trying to access protected memory, if I don't use the null checks.
// Also, my try-catch-finally block doesn't catch that access-violation exception, which is part of the reason I'm doing null checks -
// the NullReferenceExceptions *are* caught by my try-catch-finally block.
context.DrawIndexed(indexBuffer.Description.SizeInBytes / indexStride, 0, 0);
}
Does this happen with a trivial shader? You need to reduce the code to the smallest possible repro to debug it. Can we see the rest of the code (i.e. how you create your Effect and your index/vertex buffers)?
Also, I'm pretty sure R16_SInt isn't a valid index format, but that's not the issue (yet).
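For reference, D3D11 only accepts unsigned index formats (R16_UInt or R32_UInt). Assuming your indices really are 16-bit (indexStride == 2), the call would be:

```csharp
// DXGI index buffer formats must be unsigned: R16_UInt or R32_UInt.
context.InputAssembler.SetIndexBuffer(indexBuffer, SlimDX.DXGI.Format.R16_UInt, 0);
```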
“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”
The effect is loaded here:
using (
var byteCode = SlimDX.D3DCompiler.ShaderBytecode.Compile(
Properties.Resources.Shader11,
"fx_5_0",
SlimDX.D3DCompiler.ShaderFlags.Debug,
SlimDX.D3DCompiler.EffectFlags.None
)
)
defaultEffect = new SlimDX.Direct3D11.Effect(dxDevice, byteCode);
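One thing worth adding right after this: a technique or pass that wasn't found (or failed to build) doesn't throw, it just hands back an invalid object whose Apply() binds null shaders. A hedged sketch of a sanity check (SlimDX exposes an IsValid property on these wrapper types, though exact member names may differ between releases):

```csharp
// Sketch only: verify the compiled effect actually contains the technique
// and pass before trying to Apply() it.
if (!defaultEffect.IsValid)
    throw new Exception("Effect failed to initialize.");

var technique = defaultEffect.GetTechniqueByName("Shader");
if (!technique.IsValid)
    throw new Exception("Technique 'Shader' not found in the effect.");

var pass = technique.GetPassByIndex(0);
if (!pass.IsValid)
    throw new Exception("Pass 0 of technique 'Shader' is invalid.");
```

If the technique check fails here, the null shaders after Apply() are explained: an invalid pass has nothing to bind.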
Vertex and index buffers in the Mesh.Reload function:
this.Dispose();
bool success = false;
try
{
vertexLayout = new SlimDX.Direct3D11.InputLayout(
(SlimDX.Direct3D11.Device)dxDevice,
dxDevice.DefaultShaderSignature,
Vertex.D3D11Elements
);
success = true;
}
catch (Exception e)
{
Logging.DefaultLog.Log(e);
// throw e;
}
finally
{
if (!success)
{
// SafeDispose just checks that the argument isn't null before calling .Dispose()
Utility.SafeDispose(vertexLayout);
Utility.SafeDispose(indexBuffer);
Utility.SafeDispose(vertexBuffer);
}
}
The Mesh.Dispose() function body might be useful too, just to reassure you that nothing is being disposed without being re-created:
Utility.SafeDispose(vertexLayout);
Utility.SafeDispose(indexBuffer);
Utility.SafeDispose(vertexBuffer);
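Based on the comment above, SafeDispose is presumably something like the following (hypothetical sketch, since the actual helper isn't shown):

```csharp
// Minimal SafeDispose: guard against null before disposing.
public static void SafeDispose<T>(T resource) where T : class, System.IDisposable
{
    if (resource != null)
        resource.Dispose();
}
```

One caveat with this pattern: after Dispose() the field still points at a dead object unless the caller also nulls it out, so a Reload that partially fails can leave disposed-but-non-null references behind.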
I can't see anything obvious. My best guess is that the shader file you have in Properties.Resources.Shader11 is not in the format expected by the shader bytecode compiler (what format is it? An ASCII string?). Download DebugView and turn on the debug runtime - it should tell you exactly what's wrong, very verbosely. Download the DirectX SDK as well if you haven't, since you need it for the debug runtime.
Went into the DirectX control panel, turned Debug "Force on" for the debug .exe of my test program, opened up DebugView, ran it both from VS and from Windows - No output whatsoever.
Oh yeah, shader file format - I save it as ANSI in Notepad; Unicode, Unicode big endian, and UTF-8 all glitch on the first character in ShaderByteCode.Compile(). Visual Studio's built-in resource handling turns it into the byte array that the compiler actually references.
Is there an additional step or steps I need?
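Side note on the encoding glitch: the bad first character with the Unicode/UTF-8 saves is almost certainly the byte-order mark (BOM) Notepad prepends, which the HLSL compiler doesn't skip when fed raw bytes. A hedged sketch of decoding the resource bytes to a string first, which consumes the BOM (assumes Shader11 is stored as a byte[] resource, as described above):

```csharp
// Decode the resource bytes explicitly; detectEncodingFromByteOrderMarks
// strips a BOM if one is present.
byte[] raw = Properties.Resources.Shader11;
string source;
using (var reader = new System.IO.StreamReader(
    new System.IO.MemoryStream(raw),
    System.Text.Encoding.UTF8,
    true)) // detectEncodingFromByteOrderMarks
{
    source = reader.ReadToEnd();
}
// Then pass 'source' (a string) to ShaderBytecode.Compile instead of the raw bytes.
```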
Enable all output streams in DebugView (i.e. Capture -> Capture Win32, Capture Global Win32, and Capture Kernel). Create your device with the debug flag (that shouldn't be necessary if the DX control panel is set to force debug, but I've had problems with it in the past). Even if your program were working correctly it would print something, so get that working first. It's useful in every situation and well worth the time spent setting it up - it helps you spot subtle errors (a wrong stride, that kind of thing) you would otherwise take a long time to discover.
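For reference, enabling the debug layer at device creation looks roughly like this in SlimDX (sketch only; namespace and enum placement may differ slightly between SlimDX releases):

```csharp
// Create the D3D11 device with the debug layer enabled so the runtime
// emits validation messages to DebugView / the VS output window.
var dxDevice = new SlimDX.Direct3D11.Device(
    SlimDX.Direct3D11.DriverType.Hardware,
    SlimDX.Direct3D11.DeviceCreationFlags.Debug);
```

Note this requires the DirectX SDK's debug runtime to be installed, per the earlier post.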
RtlVolumeDeviceToDosName might have something to do with it, or does it happen randomly? It may actually be coming from the IDE doing some housekeeping, as well.
Can you try putting your shader into a file, and using the CompileFromFile() method instead? Then we can compare what happens, to gain some insight into the problem. (if it works, then it's a problem with the byte array or with the Compile() method, and if it doesn't work we need to keep looking).
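A sketch of that file-based path, assuming the HLSL source is saved as "Shader11.fx" next to the executable (overload shape per the SlimDX D3DCompiler API; verify against your SlimDX version):

```csharp
// Compile the effect from a file on disk instead of an embedded resource.
using (var byteCode = SlimDX.D3DCompiler.ShaderBytecode.CompileFromFile(
    "Shader11.fx",
    "fx_5_0",
    SlimDX.D3DCompiler.ShaderFlags.Debug,
    SlimDX.D3DCompiler.EffectFlags.None))
{
    defaultEffect = new SlimDX.Direct3D11.Effect(dxDevice, byteCode);
}
```

This also sidesteps any encoding problems introduced by the resource pipeline, since the compiler reads the file itself.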