Archived

This topic is now archived and is closed to further replies.

pawelp

VertexTweening

How can I do VertexTweening? I have two 3D models (or better, two versions of the same model) and want to simulate movement. How can I tell DirectX to tween between them? (I use Delphi, but I'd also be happy with some C code, as long as it stays simple...)

I have some BCB code I can post when I get home if that's OK......

Neil

PS it uses Vertex Shaders if you're comfortable with that....

WHATCHA GONNA DO WHEN THE LARGEST ARMS IN THE WORLD RUN WILD ON YOU?!?!

[edited by - thedo on March 21, 2002 8:46:19 AM]

If you could just add some comments, that would be great!

My knowledge of Vertex Shaders isn't the best, but I hope it will be enough.

Here's the vertex shader (the hardest part):

//Vertex shader for tweening between 2 models
;-----------------------------------------------------------------------------
; Note that my MD2 models have no normals so need no lighting calculations
; v0 Vertex1
; v1 Vertex2
; v3 color1
; v4 color2
; v6 texturecoord1
; v7 texturecoord2
;
; r0 temp vertex 1
; r1 temp vertex 2
;
; c0 tween value
; c1-c4 world matrix
; c6-c9 projection-view matrix
;
; oPos Output vertex
; oTn texture out
;
;-----------------------------------------------------------------------------
vs.1.1

//subtract the first vertex from the second vertex
mov r0,v0
add r1,v1,-r0

//Multiply the result by the tween factor
mul r2,c0,r1

//Add the result to the first vertex
add r0,v0,r2

//multiply result by matrix to get final output
dp4 r1.x, r0, c1
dp4 r1.y, r0, c2
dp4 r1.z, r0, c3
dp4 r1.w, r0, c4


dp4 oPos.x, r1, c6
dp4 oPos.y, r1, c7
dp4 oPos.z, r1, c8
dp4 oPos.w, r1, c9

//copy texture coords and color (oD0 is the diffuse output register)
mov oT0,v6
mov oD0,v3
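For anyone less comfortable with the assembler: the mov/add/mul/add sequence above is just a linear interpolation. A minimal C++ sketch of the same per-vertex arithmetic (the Vec3 type and tween function are illustrative, not part of the posted code):

```cpp
#include <cassert>

// The same math the shader performs per vertex:
// result = v0 + t * (v1 - v0), with t being the tween value in c0.
struct Vec3 { float x, y, z; };

Vec3 tween(const Vec3& a, const Vec3& b, float t)
{
    return Vec3{ a.x + t * (b.x - a.x),
                 a.y + t * (b.y - a.y),
                 a.z + t * (b.z - a.z) };
}
```

At t = 0 you get the first keyframe, at t = 1 the second, and values in between blend smoothly.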


This is the Shader Declaration

DWORD decl[] =
{
    D3DVSD_STREAM( 0 ),
    D3DVSD_REG( 0, D3DVSDT_FLOAT3 ),   // Position of first mesh
    D3DVSD_REG( 3, D3DVSDT_D3DCOLOR ), // Color
    D3DVSD_REG( 6, D3DVSDT_FLOAT2 ),   // Texture coordinates
    D3DVSD_STREAM( 1 ),
    D3DVSD_REG( 1, D3DVSDT_FLOAT3 ),   // Position of second mesh
    D3DVSD_REG( 4, D3DVSDT_D3DCOLOR ),
    D3DVSD_REG( 7, D3DVSDT_FLOAT2 ),
    D3DVSD_END()
};

Here's the rendering code:

D3DXMATRIX mat;
D3DXMatrixMultiply( &mat, &view, &proj );
D3DXMatrixTranspose( &mat, &mat );
float v[4] = { part, part, part, 1 };
Device->SetVertexShaderConstant( 0, &v, 1 );
Device->SetVertexShaderConstant( 1, &matWorldTranspose, 4 );
Device->SetVertexShaderConstant( 6, &mat, 4 );
Device->SetVertexShader( handle );
Device->SetStreamSource( 0, Frames[(int)pFrame], sizeof(MODELVERTEX) );
Device->SetStreamSource( 1, Frames[(int)pFrame+1], sizeof(MODELVERTEX) );
Device->DrawPrimitive( D3DPT_TRIANGLELIST, 0, m_triangles );

And finally, the FVF I'm using is:

// The vertex we use for D3D
struct MODELVERTEX
{
    D3DXVECTOR3 m_vecPos;    // Position
    D3DCOLOR    m_dwDiffuse; // Color
    D3DXVECTOR2 m_vecTex;    // Texture coordinates
};

//This is the definition for the vertex declared above (FVF=flexible vertex format)
#define D3DFVF_MODELVERTEX ( D3DFVF_XYZ | D3DFVF_DIFFUSE | D3DFVF_TEX1 )

Any questions? Post them or e-mail me and I'll see what I can do.

Neil

PS I can't vouch that this is the BEST way of doing it, but it works pretty well.


ehm... yes, there are some questions.

1.
First, the code above the shader declaration ("DWORD decl[] =") looks a bit like assembler (mov, add...). Is this something that I have to write myself, or is it just a description of what is done by DirectX?

2.
The Shader Declaration:
I'm not so familiar with C, so I have to interpret the decl first; correct me if I'm wrong.
decl is the name of an array of type DWORD, right?
And where is it used?

3.
Render Code:
//Why do you multiply the View and the Projection Matrix?
D3DXMatrixMultiply( & mat, & view, & proj );

//Why Transpose?
D3DXMatrixTranspose( & mat, & mat );

//what is part?
float v[4]={part,part,part,1};

//What does it do?
Device->SetVertexShaderConstant( 0, &v, 1 );
Device->SetVertexShaderConstant( 1, &matWorldTranspose, 4 );
Device->SetVertexShaderConstant( 6, &mat,4);

//handle will be the FVF format definition, right?
Device->SetVertexShader (handle);

// "Frames[(int)pFrame]" will be an array with your vertex buffers?
// So we set here the two models to tween between, but how does DirectX know how far it shall tween (maybe via SetVertexShaderConstant)?
Device->SetStreamSource(0,Frames[(int)pFrame],sizeof(MODELVERTEX));
Device->SetStreamSource(1,Frames[(int)pFrame+1],sizeof(MODELVERTEX));

Device->DrawPrimitive(D3DPT_TRIANGLELIST,0,(m_triangles));



I hope the questions don't sound too stupid.
Thanks for the code.

1: It is D3D vertex shader assembler. Look in the DX SDK to see what each instruction does.

2: Yes. Partly my fault; I missed a bit of code. It is an array.
When you assemble your VS code (from an external file) like this:

D3DXAssembleShaderFromFile( "c:\\Tween.vsh", 0, NULL,&vsFunction, NULL );
Device->CreateVertexShader( decl, (DWORD*)vsFunction->GetBufferPointer(), &handle, D3DUSAGE_SOFTWAREPROCESSING );
you create a handle to the shader (as used in Question 3).

3: The SDK docs explain it all. The VS only uses transposed, combined view and projection matrices. Don't know WHY. Just do it.

How does it know how far to tween? Well, that is the part variable, passed to the VS in c0.
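To make the role of part concrete, here is a sketch of how one might derive pFrame and part from an animation clock. The frameAndPart function is hypothetical; the posted code doesn't show this step.

```cpp
#include <cassert>
#include <cmath>

// Split an animation time (measured in keyframes) into the keyframe
// index and the 0..1 blend factor that gets uploaded to c0 as 'part'.
void frameAndPart(float time, int frameCount, int& pFrame, float& part)
{
    float wrapped = std::fmod(time, (float)frameCount); // loop the animation
    pFrame = (int)wrapped;                              // first keyframe of the pair
    part   = wrapped - (float)pFrame;                   // blend toward the next one
}
```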

Have a look at the docs and see what you can get. I figured it out from that and from using this place. Post back when you have had a look at the docs. VS code is not trivial, BUT it's not too hard either. The SDK teaches it better than I can.

I will post a Complete SIMPLE VS later for you to look at.

Neil


[edited by - thedo on March 21, 2002 12:43:20 PM]

Okay, I've now read a bit about vertex shaders (I spent half of last night on it...), and I think I'm starting to understand... can I have the promised sample?

Oops, yeah, almost forgot. I'm at uni right now so I will post it later. PROMISE...

Neil


OK, first things first. Create a text file and name it basic.vsh.

copy this into it.

vs.1.1


//transform the vertex by the world matrix
dp4 r1.x, v0, c1
dp4 r1.y, v0, c2
dp4 r1.z, v0, c3
dp4 r1.w, v0, c4

dp4 oPos.x, r1, c6
dp4 oPos.y, r1, c7
dp4 oPos.z, r1, c8
dp4 oPos.w, r1, c9

This is just the most basic VS I can think of - it acts just like the standard T&L pipeline (but without any lighting).

In your code

DWORD handle;
LPD3DXBUFFER vsFunction;

DWORD decl[] =
{
    D3DVSD_STREAM( 0 ),
    D3DVSD_REG( 0, D3DVSDT_FLOAT3 ), // Pass only the vertex position to the shader
    D3DVSD_END()
};

D3DXAssembleShaderFromFile( "basic.vsh", 0, NULL, &vsFunction, NULL );
Device->CreateVertexShader( decl, (DWORD*)vsFunction->GetBufferPointer(), &handle, D3DUSAGE_SOFTWAREPROCESSING );

D3DXMATRIX mat;
D3DXMatrixMultiply( &mat, &view, &proj );
D3DXMatrixTranspose( &mat, &mat );
Device->SetVertexShaderConstant( 1, &matWorldTranspose, 4 );
Device->SetVertexShaderConstant( 6, &mat, 4 );
Device->SetVertexShader( handle );
Device->SetStreamSource( 0, Frames[(int)pFrame], sizeof(MODELVERTEX) );
Device->DrawPrimitive( D3DPT_TRIANGLELIST, 0, m_triangles );

This is the FVF

struct MODELVERTEX
{
    D3DXVECTOR3 m_vecPos; // Position
};

//This is the definition for the vertex declared above (FVF=flexible vertex format)
#define D3DFVF_MODELVERTEX ( D3DFVF_XYZ )

Hope this helps

Neil

Is it necessary to call D3DXAssembleShaderFromFile each frame? Or is it enough to call it once, create a handle to it and then use it in the render procedure?

At the weekend I got an example where the whole tweening translation is done manually, like:

TriangleToRender.x := Triangle[FirstFrame].x + TimeKey * (Triangle[NextFrame].x - Triangle[FirstFrame].x);

This method has the great disadvantage that you have to go through all the triangles in a model, and through x, y, z, the normals and the texture coordinates of all of them...
It isn't really slow, but I'm asking myself if the vertex shaders are faster?
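For comparison, the per-vertex CPU approach above can be sketched in C++ like this (names and struct layout are illustrative, not from the posted code):

```cpp
#include <cassert>
#include <cstddef>

// CPU-side tweening: every position and normal component of every
// vertex is interpolated each frame before the draw call.
struct Vertex { float x, y, z, nx, ny, nz, tu, tv; };

static float lerp(float a, float b, float t) { return a + t * (b - a); }

void tweenFrame(const Vertex* a, const Vertex* b, Vertex* out,
                std::size_t count, float t)
{
    for (std::size_t i = 0; i < count; ++i) {
        out[i].x  = lerp(a[i].x,  b[i].x,  t);
        out[i].y  = lerp(a[i].y,  b[i].y,  t);
        out[i].z  = lerp(a[i].z,  b[i].z,  t);
        out[i].nx = lerp(a[i].nx, b[i].nx, t);
        out[i].ny = lerp(a[i].ny, b[i].ny, t);
        out[i].nz = lerp(a[i].nz, b[i].nz, t);
        out[i].tu = a[i].tu; // texture coordinates don't need tweening
        out[i].tv = a[i].tv;
    }
}
```

The cost is a pass over every vertex on the CPU each frame, which is exactly the work a vertex shader moves onto the GPU (or into SSE-optimized runtime code).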

No - D3DXAssembleShaderFromFile is called once, at load time.

Vertex shaders **should** give you a speed boost. Why? Well, I assume you have a Duron/Athlon/P3/P4. The Delphi compiler only compiles to Pentium 1 standard (possibly MMX too). The vertex shaders will either run in hardware (GeForce 3/4, Radeon 8500) or use the 3DNow!/SSE/SSE2 instruction sets for optimization. Each line of VS code is roughly one processor cycle (aside from rcp); if you debug your code in Delphi you will probably see your code is much larger than the VS code.

Neil

PS Any more Q's, you know where to find me.

Oh, and why 3 keyframes? It prevents distortion of the mesh. If it only used 2 meshes the output would look distorted between the keyframes. In my example I use MD2 models; the keyframes are close enough together that you don't really see any distortion.



Okay, I think I'm slowly getting it, thanks.
My Quake models also have normals, and I tried to implement that in the vertex shader, but it's not going as well as I thought. How should the vertex shader file and the vertex declaration look if my model is made of vertices of the following type:

TD3DVertex = record
  x: Single;  (* Homogeneous coordinates *)
  y: Single;
  z: Single;
  nx: Single; (* Normal *)
  ny: Single;
  nz: Single;
  tu: Single; (* Texture coordinates *)
  tv: Single;
end;
?

Why did you decide not to use normals for your Quake models?

BTW, do you know what the glCommands in the Quake MD2 files are?

I didn't use normals for simplicity's sake, for now. However...

I wouldn't bother interpolating those mid-frame anyway. The effect would be negligible (notice how I don't interpolate my texture coordinates?) and it would run slower.

The decl

DWORD decl[] =
{
    D3DVSD_STREAM( 0 ),
    D3DVSD_REG( 0, D3DVSDT_FLOAT3 ), // Position of first mesh
    D3DVSD_REG( 3, D3DVSDT_FLOAT3 ), // Normal
    D3DVSD_REG( 6, D3DVSDT_FLOAT2 ), // Texture coordinates
    D3DVSD_STREAM( 1 ),
    D3DVSD_REG( 1, D3DVSDT_FLOAT3 ), // Position of second mesh
    D3DVSD_REG( 4, D3DVSDT_FLOAT3 ), // Normal
    D3DVSD_REG( 7, D3DVSDT_FLOAT2 ), // Texture coordinates
    D3DVSD_END()
};

I might attempt some VS code for you later (but I am quite busy with assignment deadlines, so if I don't do it tonight I haven't forgotten).

Neil


