The lifelong project marches on through yet more barren wastelands of absent knowledge where knowledge should be present. Hey ho, at least I'm alive, and none of this will matter in 100 years anyway, because I'll be dead - like the billions who've gone before me, whom I'm really no different from or better than whatsoever.
So without taking anything too seriously (you just can't with computers - or anything else that interfaces reality with dreams) I'll do the usual: some descriptive yet disorganised stuff to set the scene, followed by a numbered list of questions summing up what I'm trying to ask. OK, go:
My engine. Not mine really - someone else's I had the sense to buy and start learning from, along with the huge and nasty (for the intermediate level, anyway) book that came with it. Got all the way through the stuff that encapsulates the API and also got it to run. Wow, never knew I had it in me. But it uses an old method of shader preparation, with old assembly-style code in the shader file itself. Here's a demo:
vs.1.1
dcl_position0 v0
dcl_normal0 v3
dcl_texcoord0 v6
m4x4 oPos, v0, c0
mov oD0, c4
mov oT0, v6
Basic shader. Pretty obsolete method by now, I understand. It also uses quite a complex way of loading and assembling the shader from a file, which I won't go into here. Anyway, I was looking through a DirectX tutorial online and worked my way through the HLSL shader stuff there. Much easier to use: it simply calls a D3DX function named D3DXCreateEffectFromFile(....). You just give it the path to an effect file, and it hands back a pointer to the compiled effect.
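For reference, here's roughly how I understand that loading call goes - a sketch, not the tutorial's exact code, and g_pDevice is my own placeholder name for the IDirect3DDevice9* created elsewhere:

```cpp
// Sketch of compiling/loading an effect file with D3DX.
// g_pDevice is a hypothetical IDirect3DDevice9* created at startup.
ID3DXEffect* pEffect = NULL;
LPD3DXBUFFER pErrors = NULL;

HRESULT hr = D3DXCreateEffectFromFile(
    g_pDevice,     // the D3D device
    "shader.fx",   // path to the effect file
    NULL,          // no #defines
    NULL,          // no custom #include handler
    0,             // compile flags
    NULL,          // no effect pool (not sharing parameters)
    &pEffect,      // receives the compiled effect
    &pErrors);     // receives compiler error text on failure

if (FAILED(hr) && pErrors)
{
    // The buffer holds the HLSL compiler's error message as plain text.
    OutputDebugStringA((char*)pErrors->GetBufferPointer());
    pErrors->Release();
}
```

So the file itself is compiled at load time, and everything after that goes through the returned ID3DXEffect pointer.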
So from there I'm thinking, yeah, OK, that should be easy enough to change. Just go to the parts of the engine that do the loading and change them. Then go to the parts of the engine that load the CPU-calculated WorldViewProjection matrix into the shader and change them to match the current protocol. Obviously take care of some class-based scoping stuff to make sure everything can get access to everything. Fine.
But, as always here's the catch - the method by which the DX tutorial writer handles vertex defines and the way my engine's author handles them are different.
The engine's author uses FVF for the fixed function pipeline code path, and vertex declarations for the shader code path. Ok fine.
The DX tutorial author does not declare any kind of vertex format ahead of firing up the shader - neither vertex declarations nor FVFs. Looking at the tutorial's HLSL code, it appears to me that the vertex layout is somehow handled when the vertices arrive at the GPU. I don't like this method, and it would screw up a lot of how my choice of engine works too. I'd either need a nasty hack to get around it or a complete re-write of God alone knows how many inter-related functions. Bad.
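For comparison, the vertex-declaration path I'd rather keep looks something like this - a sketch from memory, with the element layout assumed to match the old asm shader's inputs (position, normal, one texcoord) packed in that order:

```cpp
// Sketch of a D3D9 vertex declaration matching the asm shader's inputs.
// Offsets assume a tightly packed struct: float3 pos, float3 normal, float2 uv.
D3DVERTEXELEMENT9 elements[] =
{
    { 0,  0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0 },
    { 0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
    D3DDECL_END()
};

IDirect3DVertexDeclaration9* pDecl = NULL;
g_pDevice->CreateVertexDeclaration(elements, &pDecl);

// Before drawing, tell the pipeline the layout explicitly:
g_pDevice->SetVertexDeclaration(pDecl);
```

(g_pDevice is my placeholder name for the device pointer.) This is the explicit pre-informing I'd like to hang on to.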
In the tutorial code I also noticed that DirectX is fired up in the following way, with these parameters:
d3d->CreateDevice(D3DADAPTER_DEFAULT,
D3DDEVTYPE_HAL,
hWnd,
D3DCREATE_SOFTWARE_VERTEXPROCESSING, // this bit is bothering me
&d3dpp,
&d3ddev);
I don't like the behaviour flag D3DCREATE_SOFTWARE_VERTEXPROCESSING. This looks bad - I thought anything emulated in software was bad; surely that's why the graphics card is there, to avoid this with hardware. Out of interest, here is the HLSL demo shader:
float4x4 World;
float4x4 View;
float4x4 Projection;
struct VertexOut
{
float4 Pos : POSITION;
float4 Color : COLOR;
};
VertexOut VShader(float4 Pos : POSITION)
{
VertexOut Vert = (VertexOut)0;
float4x4 Transform;
Transform = mul(World, View);
Transform = mul(Transform, Projection);
Vert.Pos = mul(Pos, Transform);
Vert.Color = float4(1, 1, 1, 1);
return Vert;
}
technique FirstTechnique
{
pass FirstPass
{
Lighting = FALSE;
ZEnable = TRUE;
VertexShader = compile vs_2_0 VShader();
}
}
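On the software-vertex-processing worry above, my tentative plan would be to check the device caps first and only fall back to software - a sketch, assuming the same d3d / hWnd / d3dpp / d3ddev variables as in the tutorial snippet:

```cpp
// Sketch: query the adapter's caps and prefer hardware vertex processing
// when the card supports hardware transform & lighting.
D3DCAPS9 caps;
d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

DWORD behaviour = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                : D3DCREATE_SOFTWARE_VERTEXPROCESSING;  // fallback only

d3d->CreateDevice(D3DADAPTER_DEFAULT,
                  D3DDEVTYPE_HAL,
                  hWnd,
                  behaviour,
                  &d3dpp,
                  &d3ddev);
```

But I don't know whether the tutorial used the software flag deliberately (for compatibility) or just for simplicity.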
I think if you can follow this thread this far you can see where this is going, so I'll summarise with questions below:
1) What exactly do I need to do to inform the hardware of the type of vertex I am using when using HLSL shaders? Can I use an FVF format, or is it vertex declarations only? Can I still use the function SetVertexDeclaration(....) to inform the hardware of what type of vertex I'm using?
2) Following on from 1), is it possible to run HLSL stuff without defining a vertex format first through either FVFs or vertex declarations? Not that I'd want to anyway - I assume this on-the-fly idea is slower than pre-informing.
3) Is the HLSL method of using an effect pointer and the D3DXCreateEffectFromFile(....) function still the most flexible and widely accepted way of doing this? Is it OK for me to continue with it?
Other than that I'm OK with what's going on. I understand all the other basic shader stuff fine. I just need to use the effect's SetMatrix(....) function instead of the SetVertexShaderConstant(....) function for sending stuff from the application to the shader, which I assume lives in VRAM on the card.
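To check my understanding of that last bit, here's how I picture the per-frame effect usage - a sketch, assuming the parameter and technique names from the .fx file above, and that matWorld etc. are D3DXMATRIX values built on the CPU:

```cpp
// Sketch of feeding matrices to the effect and rendering a technique.
// pEffect is the ID3DXEffect* returned by D3DXCreateEffectFromFile.
pEffect->SetMatrix("World",      &matWorld);
pEffect->SetMatrix("View",       &matView);
pEffect->SetMatrix("Projection", &matProjection);

pEffect->SetTechnique("FirstTechnique");

UINT numPasses = 0;
pEffect->Begin(&numPasses, 0);       // start the technique
for (UINT i = 0; i < numPasses; ++i)
{
    pEffect->BeginPass(i);
    // ... SetStreamSource / DrawPrimitive calls go here ...
    pEffect->EndPass();
}
pEffect->End();
```

If that's roughly right, the SetMatrix calls replace the old SetVertexShaderConstant plumbing entirely.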
That's it for now. Pretty long topic, sorry.
I'd be very grateful for any replies. In fact, for anyone who takes the time to write a good reply, or just gives me some invaluable information, I can offer remuneration in the form of small fishing tackle items. Our family sells stuff like this online: swivels, hooks, plastic products, pretty much whatever you want... for free.
Thanks so much