Continuing from this topic (I don't want to necropost, so I opened a new thread just in case): I tried to apply what you guys (thank you very much) told me is the best way to render a *3D rectangle billboarding towards the camera while only rotating on its length axis*, the axis being defined by two fixed points in space. This way I'd be able to draw 3D sprites, particles, beams and the like with ease.

From what I understood, I had to:

- take the direction vector (A) between the two points (I took p2 - p1)

- take a direction vector (B) that goes from the line the points form to the camera (I took camera - p1)

- build three vectors: X = normalized A, Z = normalized cross product (A x B), Y = normalized cross product (Z x B); then rebuild the second vector as the cross product (X x Y), which should still give me Z

- the 3x3 matrix I get out of these three vectors (X, Z, Y) is the orientation matrix I need to apply to the rectangle in order for it to face the camera

- since DirectX matrices are 4x4, I fill the values in manually: the orientation matrix becomes the upper-left 3x3 of the 4x4 one, and the rectangle's world position goes in the last row - just to be safe, I first initialize the 4x4 matrix as an identity
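The steps above can be sketched without D3DX (Vec3 and its helpers here are hypothetical stand-ins for D3DXVECTOR3 and the D3DXVec3* functions). One detail worth noting: if Y is taken as Z cross X rather than Z cross B, the three axes come out mutually perpendicular by construction and no rebuild of Z is needed:

```cpp
#include <cmath>

// Hypothetical minimal stand-in for D3DXVECTOR3.
struct Vec3 { float x, y, z; };

float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

Vec3 Cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(Dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// A = p2 - p1 (the fixed rotation axis), B = camera - p1 (toward the viewer).
// Produces the three axes that fill the rows of the orientation matrix.
void BillboardAxes(Vec3 A, Vec3 B, Vec3& X, Vec3& Y, Vec3& Z) {
    X = Normalize(A);            // length axis of the quad
    Z = Normalize(Cross(A, B));  // perpendicular to both the axis and the view vector
    Y = Cross(Z, X);             // already unit length, since Z and X are unit and perpendicular
}
```

(One degenerate case to guard against either way: when A and B are parallel, A cross B is the zero vector and Normalize divides by zero.)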

Summed up, I wrote this function:

D3DXMATRIX OrientationMatrix(D3DXVECTOR3 A, D3DXVECTOR3 B, D3DXVECTOR3 pos)
{
    D3DXMATRIX m;
    D3DXVECTOR3 X, Y, Z;
    D3DXMatrixIdentity(&m);
    D3DXVec3Normalize(&X, &A);                            // x
    D3DXVec3Cross(&Z, &A, &B); D3DXVec3Normalize(&Z, &Z); // z
    D3DXVec3Cross(&Y, &Z, &B); D3DXVec3Normalize(&Y, &Y); // y
    D3DXVec3Cross(&Z, &X, &Y); D3DXVec3Normalize(&Z, &Z); // z, rebuilt
    m._11 = X.x;   m._12 = X.y;   m._13 = X.z;
    m._21 = Z.x;   m._22 = Z.y;   m._23 = Z.z;
    m._31 = Y.x;   m._32 = Y.y;   m._33 = Y.z;
    m._41 = pos.x; m._42 = pos.y; m._43 = pos.z;
    return m;
}

And then I build and render the rectangle this way:

void BBStrip(D3DXVECTOR3 p1, D3DXVECTOR3 p2, D3DXVECTOR3 camera,
             float thickness, LPDIRECT3DTEXTURE9 texture, LPDIRECT3DDEVICE9 device)
{
    device->SetTexture(0, texture);
    device->CreateVertexBuffer(4 * sizeof(TVERTEX), 0, TVERTEX::FVF,
                               D3DPOOL_DEFAULT, &pVBuffer, NULL);
    device->SetStreamSource(0, pVBuffer, 0, sizeof(TVERTEX));
    device->SetFVF(TVERTEX::FVF);

    D3DXMATRIX w;
    w = OrientationMatrix(p2 - p1, camera - p1, p1);
    //D3DXMatrixIdentity(&w);

    pVBuffer->Lock(0, 0, (void**)&vert, 0);
    vert[0] = TVERTEX(abs(p2.x - p1.x),  thickness * 0.5f, 0, 0, 0);
    vert[1] = TVERTEX(abs(p2.x - p1.x), -thickness * 0.5f, 0, 1, 0);
    vert[2] = TVERTEX(0, -thickness * 0.5f, 0, 1, 1);
    vert[3] = TVERTEX(0,  thickness * 0.5f, 0, 0, 1);
    pVBuffer->Unlock();

    device->SetRenderState(D3DRS_LIGHTING, FALSE);
    device->SetTransform(D3DTS_WORLD, &w);
    device->DrawPrimitive(D3DPT_TRIANGLEFAN, 0, 2);
    device->SetRenderState(D3DRS_LIGHTING, TRUE);
    pVBuffer->Release();
}

However, strange things happen. (If you've already caught some basic math errors or typos, you're probably thinking "duh" at this point.)

To try it out, I generated a bunch of random points around the world origin and drew particles between it and them.

At first it seems fine, but the rectangles turn out stretched when the points are aligned along the world Z axis.

The more they are aligned with the Z axis, the shorter the particles get. They also apparently stretch towards the first of the passed points, so swapping the points makes the particles stretch either towards the random points or towards the origin.
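For what it's worth, the shrinking with Z alignment can be reproduced with plain numbers (a quick check using a hypothetical Vec3 stand-in, no D3DX): the quad's x-extent in BBStrip is abs(p2.x - p1.x), which is only the x component of the segment, so it collapses as the segment aligns with the Z axis even though the full 3D distance stays constant:

```cpp
#include <cmath>

// Hypothetical minimal stand-in for D3DXVECTOR3, just for this check.
struct Vec3 { float x, y, z; };

// Full 3D length of the segment p1 -> p2 (what the quad's x-extent arguably should be).
float SegmentLength(Vec3 p1, Vec3 p2) {
    float dx = p2.x - p1.x, dy = p2.y - p1.y, dz = p2.z - p1.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// The extent the posted BBStrip actually uses: only the x component.
float XExtent(Vec3 p1, Vec3 p2) {
    return std::fabs(p2.x - p1.x);
}
```

For a segment along X, say p2 = (5, 0, 0), both give 5; rotate the same segment onto the Z axis, p2 = (0, 0, 5), and XExtent drops to 0 while SegmentLength is still 5.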

This is what it looks like:

(point 1 is random, point 2 is 0,0,0)

So... What the flying Jupiter am I doing wrong here? XP

EDIT: Oh, and if I skip the rebuilding of Z, it apparently gives me better results.
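That observation can be checked numerically (hypothetical Vec3 stand-ins again, no D3DX): with Y = normalize(Z x B), as in OrientationMatrix, X and Y are not perpendicular in general - the triple-product identity gives X . Y = minus the sine of the angle between A and B - and rebuilding Z from X x Y can't repair that, because the skew sits between X and Y themselves:

```cpp
#include <cmath>

// Hypothetical minimal stand-in for D3DXVECTOR3.
struct Vec3 { float x, y, z; };

float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 Cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}
Vec3 Normalize(Vec3 v) {
    float len = std::sqrt(Dot(v, v));
    return { v.x / len, v.y / len, v.z / len };
}

// X . Y for the basis built the way OrientationMatrix builds it
// (Y crossed with B, not with X). Zero would mean no skew.
float BasisSkew(Vec3 A, Vec3 B) {
    Vec3 X = Normalize(A);
    Vec3 Z = Normalize(Cross(A, B));
    Vec3 Y = Normalize(Cross(Z, B)); // the step under suspicion
    return Dot(X, Y);
}
```

For example, with A = (0, 0, 5) and B = (3, 1, 2) the skew is about -0.85, i.e. the two "perpendicular" axes are roughly 32 degrees off.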