
Edoll

[2D] Ortho mode causing ugly distortions when applying scaling


I was recently advised to display my game using ortho mode instead of my old approach, where I manually calculated all the on-screen positions with my own math and variables. The 2D game uses a scale factor so I can zoom in and out; with my old method I just multiplied a sprite's width and height by the scale and then displayed it as normal, which worked great and looked perfect. I was advised to use ortho mode for compatibility with OpenGL (for porting one day) and because it would be a lot faster on slower CPUs, since the calculations are now done by the video card.

My problem is that when I set the ortho projection to display anything more than the plain 640x480 pixels, it distorts the image seriously. I've tried a lot: since I'm no longer calculating my positions myself, everything is floored or ceiled to get consistent positions for sprites and other objects, but that didn't seem to help (much). Currently I'm using this to draw my world:
	D3DXMATRIX mat;

	// Build a scaled, panned orthographic projection: (left, right, bottom, top, near, far)
	D3DXMatrixOrthoOffCenterRH(&mat,
	                           (0   * scale) + pnt.x, (639 * scale) + pnt.x,
	                           (479 * scale) + pnt.y, (0   * scale) + pnt.y,
	                           0, 1);
	GetDevice()->SetTransform(D3DTS_PROJECTION, &mat);
	//GetDevice()->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);

	GetDevice()->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_CLAMP);
	GetDevice()->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_CLAMP);

	// Render each tile as a two-triangle strip using its loaded texture
	GetDevice()->BeginScene();
	for (int i = 0; i < actualTilesInBuffer; i++)
	{
		if (textureArray[i] == -1) GetDevice()->SetTexture(0, NULL);
		else                       GetDevice()->SetTexture(0, GetTexture(textureArray[i]));

		GetDevice()->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, tileVertices + i * 4, sizeof(CUSTOMVERTEX));
	}
	GetDevice()->EndScene();

The scale is not just a random float value; it comes from a table of floats chosen to give integral sizes for my sprites. I use 32x32 tiles, so each zoom step decreases the tile size to 31x31, then 30x30, then 29x29, and so on (a sketch of the idea follows below). I don't know if this is necessary, but it probably can't hurt; it was necessary to avoid distortions and edges on textures with my old method, however.

So I guess the question is: does anyone have an idea why it gets visually distorted this way, and is there any way to fix it?
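For illustration, here is a minimal sketch of the kind of scale table described above; the names, the number of zoom steps, and the currentZoomStep variable are assumptions, not the poster's actual code:

	// Hypothetical sketch: precompute scales so a 32x32 tile always lands on
	// an integral on-screen size (32, 31, 30, ... pixels per side).
	const int BASE_TILE  = 32;
	const int ZOOM_STEPS = 24;        // assumed number of zoom levels
	float zoomTable[ZOOM_STEPS];

	void BuildZoomTable()
	{
		for (int step = 0; step < ZOOM_STEPS; step++)
			zoomTable[step] = (float)(BASE_TILE - step) / (float)BASE_TILE;
		// zoomTable[0] == 1.0f -> 32x32, zoomTable[1] == 31/32 -> 31x31, etc.
	}

With the old method, a sprite drawn at BASE_TILE * zoomTable[currentZoomStep] pixels always has a whole-pixel size, which is what kept the textures free of edge artifacts.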

Guest Anonymous Poster
Any chance of the device driver being the culprit?
Or maybe the card itself?

I remember that some old cards (like the Riva ZX) showed very serious perspective distortions on large polys. Now, I'm not saying that your card or drivers must be old and/or defective; I just happen to remember an example of a similar problem and the things that caused it.

What card do you have?

No, my card is fine (a GeForce2 GTS), and the problem has also been confirmed on different machines by myself and a friend of mine. It definitely has something to do with the primitives being so small, I think: if I want to zoom out to 1/4 size, the basic map tiles would be 32/4 = 8 pixels in width and height, and somehow I think the displayed size keeps changing every frame. I am quite confident that I have killed any inconsistencies in the math on my side by using (int) casts and floor() in almost every function.
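For illustration, a minimal sketch of that kind of snapping; SnapToPixel is a hypothetical helper, not a function from the thread:

	#include <math.h>

	// Hypothetical helper: floor a scaled coordinate to a whole pixel before
	// it goes into the vertex buffer, so positions can't drift by sub-pixel
	// amounts from one frame to the next.
	float SnapToPixel(float worldCoord, float scale)
	{
		return floorf(worldCoord * scale);
	}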

Well, only one answer is separating me from using world coordinates instead of my current screen-coordinate system: can I alter the vertices after they have been through the ortho matrix but before they get drawn?

Yes. I have two ideas on this.

Maybe you could write a matrix transformation that takes the vertices from projection space and adjusts them however you want. Once you have this matrix, you can concatenate it with your regular projection matrix and set the result as D3D's projection matrix. This is what I mean:

D3DXMATRIX matPostProj; // fill in: adjusts vertices that are in projection space however you want
D3DXMatrixMultiply(&matNewProj, &matProj, &matPostProj); // matProj applies first, then matPostProj
m_pd3dDevice->SetTransform(D3DTS_PROJECTION, &matNewProj);
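
For illustration, a self-contained sketch of that concatenation, assuming a Direct3D 9 device; the particular post-projection adjustment (a uniform half-size scale in clip space) is an arbitrary stand-in for whatever adjustment is actually wanted:

	#include <d3dx9.h>

	// Sketch: concatenate an extra clip-space transform onto the projection.
	// The scaling chosen here is only an example adjustment.
	void SetAdjustedProjection(IDirect3DDevice9* pDevice, const D3DXMATRIX& matProj)
	{
		D3DXMATRIX matPostProj, matNewProj;

		// Example post-projection transform: shrink everything to half size
		// in clip-space x and y, leaving z untouched.
		D3DXMatrixScaling(&matPostProj, 0.5f, 0.5f, 1.0f);

		// Row-vector convention: v * matProj * matPostProj, so the regular
		// projection applies first and the adjustment applies afterwards.
		D3DXMatrixMultiply(&matNewProj, &matProj, &matPostProj);

		pDevice->SetTransform(D3DTS_PROJECTION, &matNewProj);
	}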


I've never tried this, but I think it should work. Whenever I've needed to adjust the position of a vertex after the projection matrix has been applied, I've used vertex shaders, which is the second idea that could work for you.

With shaders, the programmer is responsible for transforming all geometry. Somewhere in your vertex shader, you would do something like this (this is basically HLSL):

outputPos = mul(inputPos, matWorldViewProj);

Here you're just taking the position of the vertex and transforming it by a concatenated world-view-projection matrix. Since you write this code, there's nothing stopping you from doing something like this:

tempPos = mul(inputPos, matWorldViewProj);
outputPos = ...; // some manipulation of "tempPos"

Since you're using a GeForce2, I don't think you'll have vertex shaders implemented in hardware, but D3D can run those shaders in software mode instead. Obviously, software-mode shaders are slower than hardware, but they're not all that bad, especially if your vertex shader is fairly short.
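For illustration, a minimal sketch of asking D3D for software vertex processing at device creation, assuming Direct3D 9; the window handle and present parameters here are placeholders:

	#include <d3d9.h>

	// Sketch: request software vertex processing so vertex shaders run on
	// the CPU when the card (e.g. a GeForce2) has no hardware shader support.
	IDirect3DDevice9* CreateSoftwareVPDevice(HWND hWnd)
	{
		IDirect3D9* pD3D = Direct3DCreate9(D3D_SDK_VERSION);

		D3DPRESENT_PARAMETERS d3dpp = {0};
		d3dpp.Windowed         = TRUE;
		d3dpp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
		d3dpp.BackBufferFormat = D3DFMT_UNKNOWN;   // use the desktop format

		IDirect3DDevice9* pDevice = NULL;
		pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
		                   D3DCREATE_SOFTWARE_VERTEXPROCESSING, // CPU shaders
		                   &d3dpp, &pDevice);
		return pDevice;
	}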

Hope this helps,
neneboricua
