Spritesheet 3D Transformation Issues

8 comments, last by kamal7 10 years ago

Hello!

I've been having trouble with 3D-space transformations in my 2D game in DirectX 9. I've tried countless fixes and researched everywhere, but nothing I attempt resolves the issue, so I decided to come here for direct help. Any help is greatly appreciated!

Here is a visual of what is happening in the game currently with the following code:

Untitled.png

Here is what it is supposed to look like (I'm using the DX9 sprite interface to render this, but I don't want to use it in further development, which is why I want my own sprite rendering system with vertex/pixel shaders):

Untitled2.png

As you can see, the square tiles in the first image are not being scaled/framed properly compared to the second. The tiles are read from a spritesheet. The trees and mouse sprites look fine because they each come from a single-image spritesheet, so no sub-rect selection is needed. I have tried to scale the tiles properly; although that improves the rendering, it still isn't right, because large images become a little too small and small images a little too big. I have tried many different calculations to fix the scaling and positioning of the sprites within the spritesheet, but failed to resolve the issue.

I have also tried running the game in fullscreen mode, but nothing changes, so I assume the window size doesn't affect the scaling. What I do know is that the resolution set in DX9 does affect the scaling, but I cannot derive a calculation to handle that. I have searched everywhere and tried many methods, to no avail.

EDIT: actually the trees aren't scaled properly either, as you can see when comparing the two images. Since the trees are individual spritesheet images, it seems everything is out of proportion. Also, to be clear: the square tiles are showing the whole spritesheet squeezed into a single sprite-frame size (32x32 pixels), so you can see it's not positioning and scaling to the location of the sprite frame within the spritesheet.

Here is the code I use for the first image (I'll try to post the key code and not all of it):


// Here I set my render and sampler states once for safety/efficiency, although I'm not sure whether this causes problems when rendering the sprites
	g_pD3DDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, true);
	g_pD3DDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
	g_pD3DDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);

	g_pD3DDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE);
	g_pD3DDevice->SetRenderState(D3DRS_LIGHTING, FALSE);
	g_pD3DDevice->SetRenderState(D3DRS_ZENABLE, FALSE);
	g_pD3DDevice->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
	
	g_pD3DDevice->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_WRAP);
	g_pD3DDevice->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_WRAP);
	g_pD3DDevice->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT); // D3DTEXF_NONE is only valid for the mip filter
	g_pD3DDevice->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
	g_pD3DDevice->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);

// Here are my vertex buffer definitions

typedef struct _CUSTOM2DVERTEX
{
	D3DXVECTOR3 pos; // D3DFVF_XYZ (x,y,z)
	//D3DCOLOR color;
	D3DXVECTOR2 texCoord; // D3DFVF_TEX1 (u,v)
} CUSTOM2DVERTEX, *LPCUSTOM2DVERTEX;

#define D3DFVFCUSTOM2DVERTEX (D3DFVF_XYZ | D3DFVF_TEX2/* | D3DFVF_DIFFUSE*/)

LPDIRECT3DVERTEXBUFFER9 g_pVertexBuffer;

// Here is my vertex buffer creation code

	if (FAILED(hr = g_pD3DDevice->CreateVertexBuffer(4*sizeof(CUSTOM2DVERTEX), usageProcessing | D3DUSAGE_DYNAMIC | D3DUSAGE_WRITEONLY, D3DFVFCUSTOM2DVERTEX, D3DPOOL_DEFAULT, &g_pVertexBuffer, NULL))) {
		MessageBox(g_hWnd, "Failed to create Vertex buffer.", "ERROR!", MB_ICONEXCLAMATION | MB_OK);
		SendMessage(g_hWnd, WM_DESTROY, NULL, NULL);
		return hr;
	}

// I set up my projection here at application startup since it doesn't change; g_wX and g_wY are the screen resolution (assume 640x480), so coordinates are in pixels rather than 0.0-1.0

	D3DXMATRIX matOrtho, matProj; 
	D3DXMatrixOrthoLH(&matOrtho, (float)g_wX, (float)g_wY, 0.0f, 1.0f);
	D3DXMatrixOrthoOffCenterLH(&matOrtho, 0.0f, (float)g_wX, (float)g_wY, 0.0f, 0.0f, 1.0f);
	g_pD3DDevice->SetTransform(D3DTS_PROJECTION, &matOrtho);
	
	D3DXMatrixIdentity(&matProj);
	g_pD3DDevice->SetTransform(D3DTS_VIEW, &matProj);
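For reference, the off-center ortho matrix set up above maps window pixel coordinates straight into clip space. A minimal sketch of that mapping in plain C++ (the struct and function names are mine, not from the code above):

```cpp
#include <cassert>

struct Clip { float x, y; };

// D3DXMatrixOrthoOffCenterLH(0, W, H, 0, 0, 1) maps a pixel (x, y) to
// clip space as x' = 2x/W - 1, y' = 1 - 2y/H, so the top-left pixel
// lands at (-1, +1) and the bottom-right at (+1, -1).
Clip pixelToClip(float x, float y, float w, float h) {
	return { 2.0f * x / w - 1.0f, 1.0f - 2.0f * y / h };
}
```

With a 640x480 resolution, (0, 0) maps to (-1, 1) and the screen centre (320, 240) maps to (0, 0), which is why positions fed to this projection must be in pixels.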
	
// as you can see, I reuse the same vertex buffer; it is created once for efficiency, so it is not recreated each frame

	g_pD3DDevice->SetStreamSource(0, g_pVertexBuffer, 0, sizeof(CUSTOM2DVERTEX));
	g_pD3DDevice->SetFVF(D3DFVFCUSTOM2DVERTEX);

// and finally here is the code that renders each sprite

// you can assume the scale.xyz values are all 1.0f
// the position.xyz values are all the position of the sprite frame relative to the screen; however, position.z will always be 0.0f in my code

		D3DXMatrixScaling(&scaleMatrix, scale.x, scale.y, scale.z);
		D3DXMatrixTranslation(&transMatrix, position.x, position.y, position.z);
		D3DXMatrixMultiply(&matWorld, &scaleMatrix, &transMatrix);

		DxDraw.g_pD3DDevice->SetTransform(D3DTS_WORLD, &matWorld);

// as you can see, scaling is left at its default; no scaling is done to "select" the sprite frame from the spritesheet, to keep the code simple (I tried scaling before with no success)

		float minwidthFactor = 0.0f;
		float minheightFactor = 0.0f;
		float maxwidthFactor = 1.0f;
		float maxheightFactor = 1.0f;

// the srcRect variable holds the RECT position data of the sprite frame relative to the spritesheet

		CUSTOM2DVERTEX vertices[] = {
			{ D3DXVECTOR3((float)srcRect.left-0.5f, (float)srcRect.top-0.5f, 0.0f), D3DXVECTOR2(minwidthFactor, minheightFactor) },  // left top
			{ D3DXVECTOR3((float)srcRect.left-0.5f, (float)srcRect.bottom-0.5f, 0.0f), D3DXVECTOR2(minwidthFactor, maxheightFactor) },  // left bottom
			{ D3DXVECTOR3((float)srcRect.right-0.5f, (float)srcRect.top-0.5f, 0.0f), D3DXVECTOR2(maxwidthFactor, minheightFactor) },  // right top
			{ D3DXVECTOR3((float)srcRect.right-0.5f, (float)srcRect.bottom-0.5f, 0.0f), D3DXVECTOR2(maxwidthFactor, maxheightFactor) },  // right bottom
		};
		
		LPVOID lpVertices;
		DxDraw.g_pVertexBuffer->Lock(0, sizeof(vertices), &lpVertices, D3DLOCK_DISCARD);
		memcpy(lpVertices, vertices, sizeof(vertices));
		DxDraw.g_pVertexBuffer->Unlock();

		DxDraw.g_pD3DDevice->SetTexture(0, m_texture);

		DxDraw.g_pD3DDevice->DrawPrimitive(D3DPT_TRIANGLESTRIP, 0, 2);

On a side note, I want to create this sprite rendering system similar to how D3DXSPRITE works (as you can see in the second image example). The reason for this new sprite system is so I can use vertex/pixel shaders with maximum freedom for the next stage of development.

Thanks in advance for any help! Please feel free to ask questions about my code in case I missed something or if anything is confusing.


does anyone have any feedback?

bump

You are passing the texture co-ordinates into the pos member of your custom vertex, not into the UV. You are passing your placeholder (0 and 1) values as the texture co-ordinates. So unless you are doing something special in the shader (in which case we need to see it), your texture co-ordinates are always 0 - 1, i.e. the entire texture is being mapped onto the quad.

Show us your shader.

Also: Your vertex only has one texture coordinate. The correct FVF is therefore D3DFVF_XYZ | D3DFVF_TEX1. D3DFVF_TEX2 is for two tex coords.
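To illustrate the point about the swapped members: the srcRect pixel values belong in the UVs, normalized by the full sheet dimensions, while the 0 and 1 placeholders are what a full-texture quad's UVs would be. A minimal sketch of the pixel-to-UV conversion in plain C++ (the names and the example sheet size are hypothetical):

```cpp
#include <cassert>

struct UV { float u, v; };

// Convert a pixel coordinate on the spritesheet into a normalized
// 0..1 texture coordinate. sheetW/sheetH are the sheet dimensions.
UV toUV(int px, int py, int sheetW, int sheetH) {
	return { (float)px / (float)sheetW, (float)py / (float)sheetH };
}
```

For example, a 32x32 frame whose rect starts at (64, 32) on a 256x256 sheet gets a top-left UV of (0.25, 0.125) and a bottom-right UV of (0.375, 0.25).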

Sorry for taking so long to respond; I've been busy with school. Anyway, here are the changes. (I hadn't been using vertex/pixel shaders before; I transformed everything beforehand. But since you asked to see them, I presume they may be required, so I set them up.)

Here are the changes made to include the vertex/pixel shaders (the issue still stands after implementing them):


/// here are the declarations/initializations for the vertex/pixel shaders


	LPDIRECT3DVERTEXDECLARATION9 vbDecl;

	D3DVERTEXELEMENT9 VertexPosElements[] = {
		{0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0}, // x, y, z
		{0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0}, // u, v
		D3DDECL_END()
	};
	if (FAILED(hr = g_pD3DDevice->CreateVertexDeclaration(VertexPosElements, &vbDecl))) return hr;
	if (FAILED(hr = g_pD3DDevice->SetVertexDeclaration(vbDecl))) return hr; // reverted from using SetFVF
	
	if (FAILED(hr = g_pD3DDevice->SetStreamSource(0, g_pVertexBuffer, 0, sizeof(CUSTOM2DVERTEX)))) return hr;
	
	LPD3DXBUFFER pBuffer, pErrors;

	if (FAILED(hr = D3DXCompileShaderFromResource(
        NULL,
        MAKEINTRESOURCE(IDR_RCDATA3),
        NULL, // CONST D3DXMACRO* pDefines,
        NULL, // LPD3DXINCLUDE pInclude,
        "RenderSceneVS", 
        "vs_1_1",
        D3DXSHADER_USE_LEGACY_D3DX9_31_DLL,
        &pBuffer,
        &pErrors, // error messages
        &g_pConstantTableVS))) {
			MessageBox(g_hWnd, (LPCSTR)pErrors->GetBufferPointer(), "Vertex Shader Compile Error!", MB_ICONEXCLAMATION | MB_OK);
			SendMessage(g_hWnd, WM_DESTROY, NULL, NULL);
			return hr;
	}

	if(FAILED(hr = g_pD3DDevice->CreateVertexShader((DWORD *)pBuffer->GetBufferPointer(), &g_pVertexShader))) return hr;
	pBuffer->Release();

	if(FAILED(hr = g_pD3DDevice->SetVertexShader(g_pVertexShader))) return hr;

	D3DXMATRIX matOrtho; 
	D3DXMatrixOrthoLH(&matOrtho, (float)g_wX, (float)g_wY, 0.0f, 1.0f);
	D3DXMatrixOrthoOffCenterLH(&matOrtho, 0.0f, (float)g_wX, (float)g_wY, 0.0f, 0.0f, 1.0f);
	g_pD3DDevice->SetTransform(D3DTS_PROJECTION, &matOrtho);

	if (FAILED(hr = D3DXCompileShaderFromResource(
        NULL,
        MAKEINTRESOURCE(IDR_RCDATA2),
        NULL, // CONST D3DXMACRO* pDefines,
        NULL, // LPD3DXINCLUDE pInclude,
        "RenderScenePS", 
        "ps_1_1",
        D3DXSHADER_USE_LEGACY_D3DX9_31_DLL,
        &pBuffer,
        &pErrors, // error messages
        &g_pConstantTablePS))) {
			MessageBox(g_hWnd, (LPCSTR)pErrors->GetBufferPointer(), "Pixel Shader Compile Error!", MB_ICONEXCLAMATION | MB_OK);
			SendMessage(g_hWnd, WM_DESTROY, NULL, NULL);
			return hr;
	}

	if(FAILED(hr = g_pD3DDevice->CreatePixelShader((DWORD *)pBuffer->GetBufferPointer(), &g_pPixelShader))) return hr;
	pBuffer->Release();

	if(FAILED(hr = g_pD3DDevice->SetPixelShader(g_pPixelShader))) return hr;

// here are the matrix transformations; these calculations are currently done every time a single sprite frame is rendered

		D3DXMATRIX scaleMatrix, transMatrix, matWorld, matView, matProj;
		D3DXMatrixScaling(&scaleMatrix, scale.x, scale.y, scale.z);
		D3DXMatrixTranslation(&transMatrix, position.x, position.y, position.z);
		D3DXMatrixMultiply(&matWorld, &scaleMatrix, &transMatrix);
		DxDraw.g_pD3DDevice->SetTransform(D3DTS_WORLD, &matWorld);

		DxDraw.g_pD3DDevice->GetTransform(D3DTS_WORLD, &matWorld);
		DxDraw.g_pD3DDevice->GetTransform(D3DTS_VIEW, &matView);
		DxDraw.g_pD3DDevice->GetTransform(D3DTS_PROJECTION, &matProj);

		D3DXMATRIXA16 matWorldViewProj = matWorld * matView * matProj;

		DxDraw.g_pConstantTableVS->SetMatrix(DxDraw.g_pD3DDevice, "WorldViewProj", &matWorldViewProj);

// here is the vertex shader .hlsl file contents


// Vertex shader input structure
struct VS_INPUT
{
	float4 Position : POSITION;
	float2 Texture : TEXCOORD0;
};

// Vertex shader output structure
struct VS_OUTPUT
{
	float4 Position : POSITION;
	float2 Texture : TEXCOORD0;
};

// Global variables
float4x4 WorldViewProj;

VS_OUTPUT RenderSceneVS(in VS_INPUT In)
{	
	VS_OUTPUT Out;

	Out.Position = mul(In.Position, WorldViewProj); //apply vertex transformation
    Out.Texture = In.Texture; //copy original texcoords

	return Out;
}

// here is the pixel shader .hlsl file contents

sampler2D Tex0;

// Pixel shader input structure
struct PS_INPUT
{
    float4 Position : POSITION;
    float2 Texture : TEXCOORD0;
};

// Pixel shader output structure
struct PS_OUTPUT
{
    float4 Color : COLOR0;
};

PS_OUTPUT RenderScenePS(in PS_INPUT In)
{	
    PS_OUTPUT Out; //create an output pixel

    Out.Color = tex2D(Tex0, In.Texture); //do a texture lookup

	return Out;
} 

I also changed D3DFVF_TEX2 to D3DFVF_TEX1 as suggested. I had set D3DFVF_TEX2 previously while testing something else, and simplified a lot of the code to post it on the forum; I seem to have forgotten to set it back to D3DFVF_TEX1 in the process. Nevertheless, it has been changed.

Also, Aardvajk, I am not sure what you mean when you say I am mixing the (u,v) texture coordinates with the position (x,y,z) coordinates. Are you suggesting I use the following code instead?


CUSTOM2DVERTEX vertices[] = {
			{ D3DXVECTOR2(minwidthFactor, minheightFactor), D3DXVECTOR3((float)srcRect.left-0.5f, (float)srcRect.top-0.5f, 0.0f) },  // left top
			{ D3DXVECTOR2(minwidthFactor, maxheightFactor), D3DXVECTOR3((float)srcRect.left-0.5f, (float)srcRect.bottom-0.5f, 0.0f) },  // left bottom
			{ D3DXVECTOR2(maxwidthFactor, minheightFactor), D3DXVECTOR3((float)srcRect.right-0.5f, (float)srcRect.top-0.5f, 0.0f) },  // right top
			{ D3DXVECTOR2(maxwidthFactor, maxheightFactor), D3DXVECTOR3((float)srcRect.right-0.5f, (float)srcRect.bottom-0.5f, 0.0f) },  // right bottom
		};

Or perhaps?


		CUSTOM2DVERTEX vertices[] = {
			{ D3DXVECTOR3(0.0f, 0.0f, 0.0f), D3DXVECTOR2((float)srcRect.left, (float)srcRect.top) },  // left top
			{ D3DXVECTOR3(0.0f, 1.0f, 0.0f), D3DXVECTOR2((float)srcRect.left, (float)srcRect.bottom) },  // left bottom
			{ D3DXVECTOR3(1.0f, 0.0f, 0.0f), D3DXVECTOR2((float)srcRect.right, (float)srcRect.top) },  // right top
			{ D3DXVECTOR3(1.0f, 1.0f, 0.0f), D3DXVECTOR2((float)srcRect.right, (float)srcRect.bottom) },  // right bottom
		};

I tried both, so either I'm misunderstanding you or it simply doesn't work out as hoped. Please clarify, thanks!

PS: I've noticed that the vertex/pixel shaders won't allow me or will cause even more issues if I use float3 (as opposed to float4) for my POSITION declaration. Does this imply that I must use D3DFVF_XYZRHW instead of D3DFVF_XYZ in my custom vertex?

On my phone so hard to read the code but seems you have omitted to initialise matProj.

D3D will take care of converting your vec3s to vec4 when they are passed to the shader so don't worry.

Your second bit of code above looks correct based on your vertex structure but don't fully understand what your position values are supposed to be and can't make sense of them without seeing your projection matrix.

XYZRHW is for pretransformed vertices so not what you want. Not sure what happens with shaders but with fixed function this vertex type means the world view and projection transforms are skipped.

My projection matrix is set during the application initialization process on the following lines of code:


	D3DXMATRIX matOrthoProj;
	D3DXMatrixOrthoLH(&matOrthoProj, (float)g_wX, (float)g_wY, 0.0f, 1.0f);
	D3DXMatrixOrthoOffCenterLH(&matOrthoProj, 0.0f, (float)g_wX, (float)g_wY, 0.0f, 0.0f, 1.0f);
	g_pD3DDevice->SetTransform(D3DTS_PROJECTION, &matOrthoProj);

The g_wX and g_wY variables are just the resolution of the application (in this case assume g_wX = 640, and g_wY = 480).

Or perhaps you meant what is posted in the first post and are referring to my D3DTS_VIEW setting.


D3DXMatrixIdentity(&matView);
g_pD3DDevice->SetTransform(D3DTS_VIEW, &matView);

In that case, no I am not setting it to anything really. Am I supposed to?

The position values used in the vertex parameters are the relative coordinates (pixel coordinates) of the sprite frame RECT on the individual texture image referenced. I am not 100% sure this is the correct way to do this as this may be part of the issue at hand.

Also, thanks for clarifying that I don't need to use XYZRHW after all.

The relative pixel co-ordinates on the sprite frame rect need to be the texture coordinates, running 0 - 1 across and 0 - 1 down the texture. Since you are using a world transform to move each position to the correct place, your input positions need to be the positions of the corners of the sprite as if it was at 0,0 in your co-ordinate space.

So you need to set up each vertex so position is the position of the vertex if the sprite was at 0,0, then the world transform takes care of moving it to the correct place. You set up your texture co-ordinates with the correct offsets into the texture space to sample the correct part of the texture for the sprite.

Screen co-ordinates do not matter here as you have an orthographic projection set up correctly. Your positions need to be in the orthographic space, so if you are running, say, 10 units across and 8 units down (g_wX and g_wY), a point at the centre of the screen would be positioned at 5, 4 but would need to be input to the vertex shader as 0,0, as your world transform will then move it to the correct place.
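The vertex setup described above can be sketched in plain C++ (Vtx, frameW/frameH, and the rect/sheet parameters are hypothetical names, not from the posted code): positions are the quad corners as if the sprite sat at (0,0), so the world transform can move it, and UVs are the frame rect normalized into 0-1 texture space.

```cpp
#include <cassert>

struct Vtx { float x, y, z, u, v; };

// Build a sprite quad at the origin. frameW/frameH is the frame size in
// pixels; (rectL, rectT) is where the frame sits on the sheetW x sheetH
// spritesheet. The world transform later translates the quad on screen.
void buildQuad(Vtx out[4], float frameW, float frameH,
               float rectL, float rectT, float sheetW, float sheetH) {
	float u0 = rectL / sheetW,            v0 = rectT / sheetH;
	float u1 = (rectL + frameW) / sheetW, v1 = (rectT + frameH) / sheetH;
	out[0] = { 0.0f,   0.0f,   0.0f, u0, v0 }; // left top
	out[1] = { 0.0f,   frameH, 0.0f, u0, v1 }; // left bottom
	out[2] = { frameW, 0.0f,   0.0f, u1, v0 }; // right top
	out[3] = { frameW, frameH, 0.0f, u1, v1 }; // right bottom
}
```

So for a 32x32 frame at (64, 0) on a 256x256 sheet, the positions run 0-32 in both axes and the UVs run 0.25-0.375 horizontally and 0-0.125 vertically.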

I resolved the issue. One of the main problems was that DirectX was resizing my textures up to power-of-two dimensions.
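For anyone hitting the same thing: if the texture loader pads images up to power-of-two dimensions (as older hardware and default D3DX loading behaviour can), the UV divisors must use the padded size, not the source image size. A sketch of that correction (function names are mine):

```cpp
#include <cassert>

// Round a dimension up to the next power of two, mirroring what a
// loader does when it pads a non-power-of-two texture.
int nextPow2(int n) {
	int p = 1;
	while (p < n) p <<= 1;
	return p;
}

// A pixel coordinate must then be normalized against the padded width
// rather than the original image width.
float pixelToU(int px, int imageW) {
	return (float)px / (float)nextPow2(imageW);
}
```

So a 100-pixel-wide sheet actually occupies the left portion of a 128-pixel texture, and dividing by 100 instead of 128 stretches every frame.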

This topic is closed to new replies.
