WebsiteWill

VertexFormats


When using vertex shaders, is there still a required order to how my vertex structure should be set up? I am using this:

public struct TransformedColoredMultiTextured
{
    public float x;
    public float y;
    public float z;
    public float nx;
    public float ny;
    public float nz;
    public float tu0;
    public float tv0;
    public static readonly VertexFormats Format =
        VertexFormats.Position | VertexFormats.Normal | VertexFormats.Texture0;
}

but nothing is lit in my scene. I'm not sure if my normals are calculated incorrectly, if my light is pointing in the wrong direction, or if it's something to do with the ordering of this structure. I just added code to calculate true vertex normals for the mesh, added the nx, ny, nz fields, and set up my VertexElement array like this:

VertexElement[] elements = new VertexElement[]
{
    new VertexElement(0, 0, DeclarationType.Float3, DeclarationMethod.Default, DeclarationUsage.Position, 0),
    new VertexElement(0, 12, DeclarationType.Float3, DeclarationMethod.Default, DeclarationUsage.Normal, 0),
    new VertexElement(0, 24, DeclarationType.Float2, DeclarationMethod.Default, DeclarationUsage.TextureCoordinate, 0),
    VertexElement.VertexDeclarationEnd
};

If there is a necessary ordering to position, normal, texture, etc., does someone know where this information is? I found a site a long time ago that showed it for the fixed-function pipeline but can't seem to locate it now...

My scene is just a small terrain located around the origin, with X and Y being length and width and Z being up (in world coordinates). My light direction is Vector4(1.0f, 1.0f, 1.0f, 1.0f), which should be pointing down on the terrain (I've also tried -1 for the z value). My pixel shader is simply doing what is described in "Managed DirectX 9 Graphics and Game Programming".

Thanks for any help,
Webby

I also based a bunch of code on the examples in that book, and I've had success.

Are you using the Effect class to do your shaders?

If you are, then the order of data in each vertex is totally controlled by you and can be stored in any order as long as your C#-side struct, your VertexElement array, and your HLSL-side struct all agree.
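
For what it's worth, here's a minimal sketch of wiring that up on the C# side. The elements array and byte offsets (0, 12, 24) are the ones from your post; the device variable and the surrounding draw code are just placeholders:

// Sketch only: assumes a Device named 'device' and the VertexElement[] 'elements' from the original post.
// The offsets must match the C# struct layout, and the usages (Position, Normal, TextureCoordinate)
// must match the semantics in the vertex shader's input signature.
VertexDeclaration decl = new VertexDeclaration(device, elements);
device.VertexDeclaration = decl;
// ...SetStreamSource / DrawPrimitives as usual...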

Even if you're not, you're using the same order that a fixed function system uses, AND you're specifying the VertexElement array properly, so I don't think that's your problem.

I'm rendering a Mesh object (yes, I'm a slacker), so my stuff is slightly different than yours.

Hmm, I just opened my project and a bunch of stuff doesn't compile. I think I have a different version of the Managed DirectX SDK installed than the one I originally wrote my program for.

As for data order... If you use an FVF, the data must be in the order FVFs have always wanted. If you use a declaration, your data must be in the order and at the offsets specified in your declaration.

As for the shader, the order and number of inputs don't matter, as long as each element the shader needs is specified somewhere in the declaration (or FVF). The reason you're missing lighting is that YOU are responsible for lighting when using a vertex shader.

A vertex shader replaces the fixed-pipeline functionality of:
lighting, normalizenormals, localviewer, colorvertex, diffusematerialsource, etc. (specularenable still operates)
vertex fog
texcoordindex (index selection AND texgen (ie: TSS_TCI_CAMERASPACENORMAL))
world, view, projection transforms
texture transforms
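
Since the fixed pipeline is out of the picture, the app has to hand the shader everything lighting needs. A rough C#-side sketch (the Effect variable and parameter names are placeholders, not anything from the poster's code):

// Sketch only: assumes an Effect named 'effect' with a float4 light-direction parameter
// and a combined world-view-projection matrix parameter.
Vector3 toLight = new Vector3(0.0f, 0.0f, 1.0f);   // vector pointing TOWARD an overhead light in a Z-up world
toLight.Normalize();                                // keep it unit length for a plain N dot L term
effect.SetValue("LightDirection", new Vector4(toLight.X, toLight.Y, toLight.Z, 0.0f));
effect.SetValue("WorldViewProj", worldMatrix * viewMatrix * projectionMatrix);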

Also make sure you're clearing the background with some strange color like bright blue or pink, so that you can tell whether your triangles are on the screen at all. It's impossible to tell an empty black scene from a black scene with a black triangle painted in it.
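
For example, something like this (a one-liner sketch; 'device' being whatever Device you render with):

// Clear to an obvious colour so "nothing drawn" and "drawn but black" look different.
device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, Color.Magenta, 1.0f, 0);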

If nothing's being drawn, then you probably aren't doing the vertex transformation stuff either.


--------
ARGH. Version-breaking change in the Managed DirectX SDK and no one told me.

That is C#, right? Have you considered just using a CustomVertex format? That looks like "CustomVertex.PositionNormalTextured". Since it is part of Managed Dx, you don't have to worry about setting it up yourself, ordering, etc. One less thing to worry about messing up.
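
For reference, a minimal sketch of that route (the field values here are just placeholders):

// CustomVertex.PositionNormalTextured = position + normal + one set of texture coordinates,
// with the layout and FVF flags already defined by Managed DirectX.
CustomVertex.PositionNormalTextured v =
    new CustomVertex.PositionNormalTextured(x, y, z, nx, ny, nz, tu, tv);
device.VertexFormat = CustomVertex.PositionNormalTextured.Format;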

In regard to some of the questions:
I am using effects to do my shaders.
I am clearing to a blue color and can clearly see the actual mesh in both filled and wireframe mode, so I know it's rendering. However, it's completely black no matter where I point my light from.
There aren't CustomVertex types for the formats I will be using.

Here are the vertex and pixel shaders


//The world, view and projection matrices
float4x4 WorldMatrix : WORLD;
float4 DiffuseDirection;
float4x4 WorldViewProj : WORLDVIEWPROJECTION;

Texture texture1;
Texture texture2;

sampler Sampler1 = sampler_state
{
    texture = <texture1>;
    MipFilter = LINEAR;
    MinFilter = LINEAR;
    MagFilter = LINEAR;
    AddressU = Wrap;
    AddressV = Wrap;
    AddressW = Wrap;
    MaxAnisotropy = 16;
};

sampler Sampler2 = sampler_state
{
    texture = <texture2>;
    MipFilter = LINEAR;
    MinFilter = LINEAR;
    MagFilter = LINEAR;
    AddressU = Wrap;
    AddressV = Wrap;
    AddressW = Wrap;
    MaxAnisotropy = 16;
};

struct VS_OUTPUT
{
    float4 Pos      : POSITION;
    float2 TexCoord : TEXCOORD0;
    float3 Light    : TEXCOORD1;
    float3 Normal   : TEXCOORD2;
};

//Transform our position into clip space
VS_OUTPUT Transform(
    in float4 inputPosition : POSITION,
    in float3 inputNormal   : NORMAL,
    in float2 inputTexCoord : TEXCOORD0
    )
{
    //Declare our output structure
    VS_OUTPUT Out = (VS_OUTPUT)0;
    //Transform our position
    Out.Pos = mul(inputPosition, WorldViewProj);
    //Set our texture coordinates
    Out.TexCoord = inputTexCoord;
    //Store our light direction
    Out.Light = DiffuseDirection;
    //Normals are already in world space. Do I still need to transform them?
    Out.Normal = inputNormal;

    return Out;
}

float4 LightTextureColor(
    float2 textureCoords : TEXCOORD0,
    float3 lightDirection : TEXCOORD1,
    float3 normal : TEXCOORD2) : COLOR0
{
    //Get the texture color
    float4 textureColor = tex2D(Sampler1, textureCoords);
    //Make our diffuse color purple for now
    float4 diffuseColor = {1.0f, 0.0f, 1.0f, 1.0f};

    //Return the combined color after calculating the effect of
    //the diffuse directional light
    return textureColor * (diffuseColor * saturate(dot(lightDirection, normal)));
}




Thanks,
Webby

Hrm.
It works fine up until this line of the pixel shader
return textureColor * (diffuseColor * saturate(dot(lightDirection, normal)));
then it goes black.

If I change this line to
return textureColor;
then I see my terrain with the texture evenly applied across it, so I know my texture, texcoords, and mesh are working properly. It's got to be the way I am converting my normals or something.


public void CalculateNormals()
{
    CalculateFaceNormals();
    CalculateVertexNormalsUsage();
}

public void CalculateVertexNormalsUsage()
{
    int i;
    for (i = 0; i < _terrainVertexes.Count - 1; i++)
    {
        TerrainVertex tv = _terrainVertexes[i];
        tv._incident = 0;
        tv._polyIndex = new System.Collections.Generic.List<int>();
        foreach (TerrainFace face in _terrainFaces)
        {
            if (face._0 == i)
            {
                tv._polyIndex.Add(face._0);
                tv._incident++;
            }
            if (face._1 == i)
            {
                tv._polyIndex.Add(face._1);
                tv._incident++;
            }
            if (face._2 == i)
            {
                tv._polyIndex.Add(face._2);
                tv._incident++;
            }
        }
        _terrainVertexes[i] = tv;
    }
}

private void CalculateVertexNormals()
{
    int vertexNum;
    for (vertexNum = 0; vertexNum < _terrainVertexes.Count - 1; vertexNum++)
    {
        TerrainVertex tv = _terrainVertexes[vertexNum];
        tv._vertex.nx = 0.0f;
        tv._vertex.ny = 0.0f;
        tv._vertex.nz = 0.0f;
        Vector3 sumVector = new Vector3();
        int i;
        for (i = 0; i < tv._incident - 1; i++)
        {
            sumVector.Add(new Vector3(_terrainFaces[tv._polyIndex[i]]._nx, _terrainFaces[tv._polyIndex[i]]._ny, _terrainFaces[tv._polyIndex[i]]._nz));
        }
        sumVector.X /= tv._incident;
        sumVector.Y /= tv._incident;
        sumVector.Z /= tv._incident;
        sumVector.Normalize();
        tv._vertex.nx = sumVector.X;
        tv._vertex.ny = sumVector.Y;
        tv._vertex.nz = sumVector.Z;
        _terrainVertexes[vertexNum] = tv;
    }
}

public void CalculateFaceNormals()
{
    foreach (TerrainFace tf in _terrainFaces)
    {
        Vector3 tv0, tv1, tv2, dist1, dist2, result;
        tv0 = new Vector3(_terrainVertexes[tf._0]._vertex.x, _terrainVertexes[tf._0]._vertex.y, _terrainVertexes[tf._0]._vertex.z);
        tv1 = new Vector3(_terrainVertexes[tf._1]._vertex.x, _terrainVertexes[tf._1]._vertex.y, _terrainVertexes[tf._1]._vertex.z);
        tv2 = new Vector3(_terrainVertexes[tf._2]._vertex.x, _terrainVertexes[tf._2]._vertex.y, _terrainVertexes[tf._2]._vertex.z);

        dist1 = tv1 - tv0;
        dist2 = tv2 - tv1;
        result = Vector3.Normalize(Vector3.Cross(dist1, dist2));
        tf._nx = result.X;
        tf._ny = result.Y;
        tf._nz = result.Z;
    }
}



Thanks for having a look,
Webby

Quote:
It works fine up until this line of the pixel shader
return textureColor * (diffuseColor * saturate(dot(lightDirection, normal)));
then it goes black.

Well, mathematically speaking, I'd have to assume that a term in your statement resolves to 0. Anything multiplied by 0 results in 0... so it'll follow through to returning 0 (which is black)...

The obvious part to check is what your saturate(dot(lightdirection,normal)) statement resolves to... if your normals are incorrect then you might well be always getting a "0" returned.

Are you sure your normals/lightdirection are actually unit length?
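
As a quick sanity check of that term (just arithmetic; the vectors are made up):

// Directional term for a flat, up-facing normal (0, 0, 1) in a Z-up world:
//   light vector (1, 1, 1) un-normalized:   dot = 1*0 + 1*0 + 1*1 = 1.0
//   light vector (1, 1, 1) normalized:      dot ~= 0.577
//   light vector pointing away from the surface: dot goes negative and saturate() clamps it to 0 (black)
float d = Vector3.Dot(Vector3.Normalize(new Vector3(1, 1, 1)), new Vector3(0, 0, 1));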

hth
Jack

I don't suppose there is a way to trace through the shader code, is there?
I suppose I can place some sort of conditional in there to simply output a certain color per condition, so that
if saturate(dot(lightDirection, normal)) == 0
then make everything bright red.

Is there an easier way, though, to step through the actual shader? Possibly some third-party application?

Thanks,
Webby

Your normals are likely in model space, not world space, and need some transforming, but we can ignore that for now. (If you want to know: you should multiply by a 3x3 matrix that's the transpose of the inverse of your world transform. You can usually get away with a 3x3 matrix of just the world transform, or with your 4x4 matrix and (normal.x, normal.y, normal.z, 0).)

If your shader is vs_1_1 and ps_1_1, outputs/inputs are clamped between 0 and 1 unless used as texture coordinates. A normal of 0,0,-1 will become 0,0,0. If you're using 1.1, you need to scale and bias your normals, like this

in vs
OUT.normal = (worldnormal*0.5)+0.5;

in ps
float3 normal = (IN.normal * 2) - 1;

You'll also need this for your light vector.

Also, all shader constants for the pixel shader are clamped between 0 and 1 too. If your light vector was a constant to your pixel shader, you'd need to do the scale/bias in C++/C# before programming the constant, and again unpack it in the pixel shader as shown above.
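
A rough C#-side sketch of both points (the parameter names are placeholders, and I'm assuming MDX's Matrix.Invert/Matrix.TransposeMatrix helpers; check them against your SDK version):

// Normal transform: transpose of the inverse of the world matrix.
Matrix worldInvTranspose = Matrix.TransposeMatrix(Matrix.Invert(worldMatrix));   // assumed MDX helpers
effect.SetValue("WorldInverseTranspose", worldInvTranspose);                      // hypothetical parameter name

// ps_1_1 only: scale/bias a unit direction into 0..1 before setting the constant, then unpack in the shader.
Vector4 packedLight = new Vector4(lightDir.X * 0.5f + 0.5f,
                                  lightDir.Y * 0.5f + 0.5f,
                                  lightDir.Z * 0.5f + 0.5f,
                                  0.0f);
effect.SetValue("PackedLightDirection", packedLight);                             // hypothetical parameter name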

Using VS and PS version 2.0 on an ATI 9800 pro so I don't think the scale and bias is the issue. It could be a model space issue.

The vertex coordinates, AFAIK, are in world space, and I am calculating my normals from those values, which leads me to believe that the normals are in world space, unless I am missing a transformation.

Thanks,
Webby

Quote:
Original post by WebsiteWill
I don't suppose there is a way to trace through the shader code, is there?
I suppose I can place some sort of conditional in there to simply output a certain color per condition, so that
if saturate(dot(lightDirection, normal)) == 0
then make everything bright red.

Is there an easier way, though, to step through the actual shader? Possibly some third-party application?

If you have Visual Studio .NET 2003, Windows XP Pro, and have chosen to install the DirectX Extensions for Visual Studio during SDK installation, you can debug shaders like normal applications (place breakpoints, inspect values, ...etc)

Hrm. Actually using Visual Studio 2005 Beta 2 :)
I've been waiting a looooong time for generics :)

I think I did install the extensions. Haven't actually tried a breakpoint in the .fx file. Didn't think it would even have a shot of stopping there.

Cool, something else to try when I get home!
Hurry up 4:30!

Webby

<self defeating rant on>You know, after going through the trouble of writing a function to calculate vertex normals from the face normals of a mesh, you would think it wise to ACTUALLY CALL THAT FUNCTION! <self defeating rant off>

Most of the problem is now solved. Now there is a small issue somewhere with my actual calculations as the terrain is not lit properly. Random triangles are shadowed. Can anyone see any blatant mistakes with this code? Or even have pointers for efficiency?

public void CalculateVertexNormalsUsage()
{
    int i;
    for (i = 0; i < _terrainVertexes.Count - 1; i++)
    {
        TerrainVertex tv = _terrainVertexes[i];
        tv._incident = 0;
        tv._polyIndex = new System.Collections.Generic.List<int>();
        foreach (TerrainFace face in _terrainFaces)
        {
            if (face._0 == i)
            {
                tv._polyIndex.Add(face._0);
                tv._incident++;
            }
            if (face._1 == i)
            {
                tv._polyIndex.Add(face._1);
                tv._incident++;
            }
            if (face._2 == i)
            {
                tv._polyIndex.Add(face._2);
                tv._incident++;
            }
        }
        _terrainVertexes[i] = tv;
    }
}

private void CalculateVertexNormals()
{
    int vertexNum;
    for (vertexNum = 0; vertexNum < _terrainVertexes.Count - 1; vertexNum++)
    {
        TerrainVertex tv = _terrainVertexes[vertexNum];
        tv._vertex.nx = 0.0f;
        tv._vertex.ny = 0.0f;
        tv._vertex.nz = 0.0f;

        Vector3 sumVector = new Vector3();
        int i;
        for (i = 0; i < tv._incident - 1; i++)
        {
            sumVector.Add(new Vector3(_terrainFaces[tv._polyIndex[i]]._nx, _terrainFaces[tv._polyIndex[i]]._ny, _terrainFaces[tv._polyIndex[i]]._nz));
        }
        sumVector.X /= tv._incident;
        sumVector.Y /= tv._incident;
        sumVector.Z /= tv._incident;
        sumVector.Normalize();
        tv._vertex.nx = sumVector.X;
        tv._vertex.ny = sumVector.Y;
        tv._vertex.nz = sumVector.Z;

        _terrainVertexes[vertexNum] = tv;
    }
}

public void CalculateFaceNormals()
{
    foreach (TerrainFace tf in _terrainFaces)
    {
        Vector3 tv0, tv1, tv2, dist1, dist2, result;
        tv0 = new Vector3(_terrainVertexes[tf._0 - 1]._vertex.x, _terrainVertexes[tf._0 - 1]._vertex.y, _terrainVertexes[tf._0 - 1]._vertex.z);
        tv1 = new Vector3(_terrainVertexes[tf._1 - 1]._vertex.x, _terrainVertexes[tf._1 - 1]._vertex.y, _terrainVertexes[tf._1 - 1]._vertex.z);
        tv2 = new Vector3(_terrainVertexes[tf._2 - 1]._vertex.x, _terrainVertexes[tf._2 - 1]._vertex.y, _terrainVertexes[tf._2 - 1]._vertex.z);

        dist1 = tv1 - tv0;
        dist2 = tv2 - tv1;
        result = Vector3.Normalize(Vector3.Cross(dist1, dist2));
        tf._nx = result.X;
        tf._ny = result.Y;
        tf._nz = result.Z;
    }
}



Thanks,
Webby

