

Boulougou

Member Since 05 Jan 2005
Offline Last Active May 05 2014 03:07 PM
-----

Topics I've Started

Glicko-2 rating deviation during periods of inactivity

18 April 2013 - 03:48 AM

Hello,

 

I am trying to use the Glicko-2 rating system in my game. I have implemented it using the formulas described in the document. However, I am not sure I correctly understand how a player's rating deviation φ is increased after periods of inactivity. The document says that when a player does not compete during a "rating period", the deviation φ is updated using this formula:

 

φ' = sqrt(φ^2 + σ^2) (could not embed the formula; it seems that image extensions have been disabled)

 

where σ is the rating volatility.

 

But using that formula, φ seems to increase at a very slow rate. For example, if a "rating period" is one month and a player has φ = 50 and σ = 0.06 (the recommended initial value for volatility), then after 10 years (120 "rating periods") of inactivity φ' = sqrt(50^2 + 120 * 0.06^2) ≈ 50.0043. Is that the expected behavior or am I missing something? Is there any way to adjust how "quickly" φ increases during player inactivity?
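
For reference, here is a minimal C++ sketch of that computation (the function name is mine, just for illustration; nothing here is Glicko-specific beyond the formula above):

#include <cmath>
#include <cstdio>

// Rating deviation after 'periods' idle rating periods, applying
// phi' = sqrt(phi^2 + sigma^2) once per period. Each step adds sigma^2
// to phi^2, so this collapses to sqrt(phi^2 + periods * sigma^2).
double DeviationAfterInactivity( double phi, double sigma, int periods )
{
	return std::sqrt( phi * phi + periods * sigma * sigma );
}

int main()
{
	// 10 years of monthly rating periods with the values above: ~50.0043.
	std::printf( "%f\n", DeviationAfterInactivity( 50.0, 0.06, 120 ) );
	return 0;
}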

 

Let's hope there is someone on the forums who has used this algorithm.

Thanks in advance!


.X file template bug on 64-bit.

18 July 2008 - 10:37 AM

I am programming an application for loading skinned models from .X files. I am using Visual Studio 2005, Windows Vista 64-bit and DirectX 9. Because my animations play too fast, I am using the AnimTicksPerSecond template to scale the speed. The description of this template in the March 2008 SDK is:

---------------
AnimTicksPerSecond

Describes an animation scale factor.

template AnimTicksPerSecond
{
	< 9E415A43-7BA6-4a73-8743-B73D47E88476 >
	DWORD AnimTicksPerSecond;
}

Where:
AnimTicksPerSecond - A scale factor to convert animation time stamps to global time. Its value divides the animation time by either 30, 60, 24 or 25.
---------------

So I placed the following line in my .X files to slow down the animation:

AnimTicksPerSecond { 24; }

When I compile the application for 32-bit machines the animation is scaled and plays fine. When I compile for 64-bit machines the animation still plays too fast, as if DirectX ignores the existence of the AnimTicksPerSecond template in my .X files.

So I installed the SkinnedMesh sample from the DirectX SDK and tried it with one of my .X files. Same results as with my application: the animation played fine with the 32-bit executable but too fast with the 64-bit executable. Any ideas about what is going wrong?

---------------------
UPDATE: I also tested it with the tiny.x model from the DirectX SDK samples. I added the AnimTicksPerSecond template to the tiny.x file to scale the animation speed. It worked with the 32-bit executable of the SkinnedMesh sample, but the 64-bit executable ignored the template.
---------------------
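If the 64-bit loader really is dropping the template, one possible workaround (just a sketch with a hand-tuned scale factor, not a confirmed fix) is to apply the scaling manually to the time delta fed to the animation controller:

#include <d3dx9.h>

// Hypothetical workaround: apply the speed scaling yourself instead of
// relying on the AnimTicksPerSecond template. 'speedScale' is tuned by
// hand (e.g. 24.0 / 30.0 if the loader assumed 30 ticks per second).
void AdvanceAnimation( ID3DXAnimationController* pController,
                       double elapsedSeconds, double speedScale )
{
	pController->AdvanceTime( elapsedSeconds * speedScale, NULL );
}

ID3DXAnimationController::SetTrackSpeed could achieve a similar effect on a per-track basis.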

Normal transformation in vertex shaders

03 September 2006 - 02:17 AM

Hi, I see in many shaders that vertex normals are transformed with the world transformation (including scale) and then normalized to regain unit length. The code usually looks like this:

float4x4 m_worldViewProj;
float4x4 m_world;

VS_OUTPUT vs_main( VS_INPUT In )
{
	VS_OUTPUT Out = (VS_OUTPUT)0;
	Out.Pos = mul( In.Pos, m_worldViewProj );
	float3 world_normal = normalize( mul( In.Norm, (float3x3)m_world ) );
	// etc...
}

Wouldn't it be better to pass a 3x3 rotation matrix (only the world rotation, without the scale) and transform the normal without normalizing it? Or is the speed gained from the fewer instructions (no normalize() call) lost to the cost of passing the extra parameter (the additional 3x3 rotation matrix)?

float4x4 m_worldViewProj;
float4x4 m_world;
float3x3 m_worldRotation;

VS_OUTPUT vs_main( VS_INPUT In )
{
	VS_OUTPUT Out = (VS_OUTPUT)0;
	Out.Pos = mul( In.Pos, m_worldViewProj );
	float3 world_normal = mul( In.Norm, m_worldRotation );
	// etc...
}
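As a side note, when the world matrix contains scale, the mathematically correct matrix for normals is the inverse-transpose of the world matrix. A sketch of building it once per object on the CPU with D3DX, so the shader receives it as a constant:

#include <d3dx9.h>

// Build the matrix to transform normals with, once per object on the CPU.
// For a pure rotation this equals the world matrix itself; with scale
// (uniform or not) the inverse-transpose is the correct choice.
D3DXMATRIX MakeNormalMatrix( const D3DXMATRIX& world )
{
	D3DXMATRIX inverse, normalMatrix;
	D3DXMatrixInverse( &inverse, NULL, &world );
	D3DXMatrixTranspose( &normalMatrix, &inverse );
	return normalMatrix; // upload the upper 3x3 to the shader
}

Note that unless the matrix is a pure rotation, the transformed normal still changes length, so the normalize() call can only be dropped safely in the rotation-only case.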

Retrieving triangles from a vertex buffer. Help needed.

11 October 2005 - 11:07 PM

I'm using the index buffer to get the vertices of the triangles from the vertex buffer, and then I build the triangles from those vertices. The triangles are stored in the buffers as a D3DPT_TRIANGLELIST. I suspect there is something wrong with the way I am retrieving the vertices from the vertex buffer. Is there any other way to retrieve the vertex positions from a vertex buffer?


// Get the vertex declaration of the mesh.
D3DVERTEXELEMENT9 vertexDeclaration[MAX_FVF_DECL_SIZE];
pMesh->GetDeclaration( vertexDeclaration );

// Find the byte offset of the position element within a vertex.
DWORD offset = 0;

// Scan the vertex declaration array.
int i = 0;
while( vertexDeclaration[i].Type != D3DDECLTYPE_UNUSED )
{
	if( vertexDeclaration[i].Usage == D3DDECLUSAGE_POSITION )
	{
		offset = vertexDeclaration[i].Offset;
		break;
	}

	i++;
}

// Lock the vertex and index buffers of the mesh (assumes 16-bit indices).
BYTE* pVertexData = 0;
WORD* pIndexData = 0;
pMesh->LockVertexBuffer( D3DLOCK_READONLY, (void**)&pVertexData );
pMesh->LockIndexBuffer( D3DLOCK_READONLY, (void**)&pIndexData );

// Vertex size in bytes, needed to step from one vertex to the next.
DWORD stride = pMesh->GetNumBytesPerVertex();

// Scan the triangles.
D3DXVECTOR3 vertexPosition[3];
Triangle3 tri;
for( DWORD i = 0, iTriangle = 0; iTriangle < pMesh->GetNumFaces(); i += 3, iTriangle++ )
{	
	// Get each corner's position: index * stride bytes into the vertex
	// buffer, plus the byte offset of the position element.
	memcpy( &vertexPosition[0], pVertexData + pIndexData[i]     * stride + offset, sizeof(D3DXVECTOR3) );
	memcpy( &vertexPosition[1], pVertexData + pIndexData[i + 1] * stride + offset, sizeof(D3DXVECTOR3) );
	memcpy( &vertexPosition[2], pVertexData + pIndexData[i + 2] * stride + offset, sizeof(D3DXVECTOR3) );

	// Build the triangle.
	tri.origin() = vertexPosition[0];
	tri.edge0() = vertexPosition[1] - vertexPosition[0];
	tri.edge1() = vertexPosition[2] - vertexPosition[0];

	// Store it in the triangle list.
	memcpy( pTriangleList[iTriangle].triangle, &tri, sizeof(Triangle3) ); 
}

pMesh->UnlockVertexBuffer();
pMesh->UnlockIndexBuffer();
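
One more thing worth double-checking (an assumption on my part about how the mesh was created): the code above treats the index buffer as 16-bit. If the mesh was created with the D3DXMESH_32BIT option, the locked index buffer holds DWORDs, not WORDs, so this check belongs before the loop:

// Check the index format before casting the locked index buffer to WORD*.
bool uses32BitIndices = ( pMesh->GetOptions() & D3DXMESH_32BIT ) != 0;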

Using Vertex Shaders and keeping Bounding Volumes up to date

29 July 2005 - 09:25 PM

I'm using a vertex shader that makes meshes appear bigger by translating every vertex along the direction of its normal. The problem is that I cannot keep the bounding volumes of the meshes up to date, because every vertex has its own unique transformation, and that cannot be applied to the bounding volume as a whole.

The same happens when I animate a skinned mesh with my animation shader. This shader is very similar to the one used in the MultiAnimation sample of the DirectX SDK: it transforms each vertex by the bones' offset matrices and then applies the bone matrices. The bounding volume doesn't end up in the right place because the vertices have been transformed by the shader, and again I cannot apply a single transformation to the bounding volume, since each vertex is transformed in its own way, influenced by many bones.

Is there any solution to that?
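For the inflate-along-normals shader specifically, a conservative fix exists (a sketch under my assumption that a bounding sphere is used; it does not cover the skinning case): if every vertex moves at most d units along a unit-length normal, no point can end up more than d outside the original bounds, so the radius can simply grow by d:

#include <d3dx9.h>

// Hypothetical bounding-sphere type; substitute whatever structure you use.
struct BoundingSphere
{
	D3DXVECTOR3 center;
	float radius;
};

// Conservatively enlarge the sphere for a shader that pushes each vertex
// up to 'maxDisplacement' units along its unit-length normal.
BoundingSphere InflateBounds( const BoundingSphere& bounds, float maxDisplacement )
{
	BoundingSphere result = bounds;
	result.radius += maxDisplacement; // no vertex can escape farther than this
	return result;
}

For the skinned case, a common conservative approach is to transform each bone's rest-pose bounding volume by that bone's matrix and merge the results, trading tightness for speed.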
