Direct3D design questions

Started by Sync Views
21 comments, last by superpig 15 years, 11 months ago
If you want to batch sprites that use different textures, you'll have to combine those textures into a single texture atlas and handle the addressing. Also, if you want to render multiple quads with a single DrawPrimitive call but without using a dynamic vertex buffer, you should look into the various methods of instancing available in D3D9. There's a sample in the SDK, and also a good one on Humus's page.
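For the atlas addressing part, here's a minimal sketch; the AtlasRect struct and RemapToAtlas function are illustrative names, not from the SDK sample.

struct AtlasRect { float left, top, right, bottom; }; // sprite's rectangle in the atlas, in pixels

// Convert a sprite's local UV (0..1) into UVs within the shared atlas texture.
void RemapToAtlas(float u, float v, const AtlasRect& r,
                  float atlasWidth, float atlasHeight,
                  float& outU, float& outV)
{
    outU = (r.left + u * (r.right  - r.left)) / atlasWidth;
    outV = (r.top  + v * (r.bottom - r.top))  / atlasHeight;
}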

However, I really don't think you're going to need to worry about this too much... unless you're rendering thousands of sprites a frame, I doubt you're going to incur too much CPU overhead. In 99% of cases you should be fine with the batching ID3DXSprite offers, which batches together sprites that share the same texture.
I don't mean batching different textures, just objects in different places.
E.g. is there a way to draw these quads in a single DrawPrimitive call, but without having to build a new vertex/index buffer every frame?
 x      y      rot
 100    50     30
-50     200    300
 200    75     180
 180    70     100


[Edited by - Sync Views on May 18, 2008 4:16:05 AM]
Use a custom vertex shader, and build a vertex buffer that has multiple rectangles in it. Each vertex of each rectangle would include an integer to tell you which "number" rectangle it was a part of, and then you could use that number to choose which set of transform params to use.
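A rough sketch of that idea in C++, assuming an existing IDirect3DDevice9* named device, one float per vertex for the index and one float4 of parameters per quad (names like SpriteVertex, MAX_QUADS_PER_BATCH and the register choice are illustrative):

struct SpriteVertex
{
    float x, y, z;    // POSITION
    float u, v;       // TEXCOORD0
    float quadIndex;  // TEXCOORD1 - which quad this vertex belongs to
};

// One float4 of transform params per quad (e.g. x, y, rotation, unused),
// uploaded as vertex shader constants; the shader picks its set using quadIndex.
const UINT MAX_QUADS_PER_BATCH = 100;
D3DXVECTOR4 quadParams[MAX_QUADS_PER_BATCH];
// ... fill quadParams for the quads about to be drawn ...
device->SetVertexShaderConstantF(4, (float*)quadParams, quadCount); // starting at register c4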



Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse

OK... I have no idea where to start or how what you're saying works, because I thought vertex shaders were just about the colours of something, not about where something is?

So you're saying I can use a custom shader to send, say, 20 different transform matrices and have them transform different quads, rather than one matrix that affects all the quads in the buffer I'm drawing?
Quote:Original post by Sync Views
OK... I have no idea where to start or how what you're saying works, because I thought vertex shaders were just about the colours of something, not about where something is?
Pixel shaders can only affect the colour (and depth), but vertex shaders can mess with any aspect of the vertex. Everything that the fixed-function transform and lighting pipeline does - SetTransform/SetLight stuff - can be done in vertex shaders.
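As a rough illustration of that correspondence (the constant register and the transpose are assumptions about how the shader consumes the matrix, not a fixed rule):

// Fixed-function: the runtime applies the transforms you set.
device->SetTransform(D3DTS_WORLD, &world);
device->SetTransform(D3DTS_VIEW, &view);
device->SetTransform(D3DTS_PROJECTION, &proj);

// With a vertex shader: you upload the combined matrix yourself and do the
// multiply in shader code. A transpose may be needed depending on how the
// shader declares/expects the matrix.
D3DXMATRIX wvp = world * view * proj;
D3DXMatrixTranspose(&wvp, &wvp);
device->SetVertexShaderConstantF(0, (float*)&wvp, 4); // occupies c0..c3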

Quote:So you're saying I can use a custom shader to send, say, 20 different transform matrices and have them transform different quads, rather than one matrix that affects all the quads in the buffer I'm drawing?
Yep, pretty much.

Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse

OK, well I've started looking at how vertex shaders work and how to create and use them.

What I don't get is how I'm meant to add "custom" data to it, as there are no "custom" entries for the D3DDECLUSAGE_* types.
	D3DVERTEXELEMENT9 vShaderElements[] =
	{
		{0, 0,  D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0}, // xyz - what's the deal with -1,1 ranges? Should they always use that? Because I've been using something like -1000 to 1000 for large stuff like backgrounds
		{0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0}, // uv
		D3DDECL_END()
	};


It looks as if I can only draw as many vertices as are in the buffer (e.g. I can't draw a quad for every matrix using the same 4 input vertices). Should I just create something like 100 quads in the buffer, so I'm pretty sure I'm unlikely to run out of quads and have to split drawing for that texture into 2 blocks? Also, how fast is sending matrices and stuff for use in a vertex shader?
Quote:Original post by Sync Views
What I don't get is how I'm meant to add "custom" data to it, as there are no "custom" entries for the D3DDECLUSAGE_* types.
From the docs:
Quote:D3DDECLUSAGE_TEXCOORD can be used for user-defined fields (which don't have an existing usage defined).


-1..+1 ranges are only involved if the data type is one of the normalized formats (e.g. D3DDECLTYPE_UBYTE4N). None of the float formats are normalized.
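Following that doc note, one possible declaration for the quad-index field is simply a second TEXCOORD element (the offsets assume the SpriteVertex layout sketched earlier):

D3DVERTEXELEMENT9 vShaderElements[] =
{
    {0, 0,  D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0}, // xyz
    {0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0}, // uv
    {0, 20, D3DDECLTYPE_FLOAT1, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 1}, // quad index (user-defined)
    D3DDECL_END()
};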

Quote:It looks as if I can only draw as many vertices as are in the buffer (e.g. I can't draw a quad for every matrix using the same 4 input vertices). Should I just create something like 100 quads in the buffer, so I'm pretty sure I'm unlikely to run out of quads and have to split drawing for that texture into 2 blocks?
Yes, that's the idea. If you happen to have > 100 quads, then you do it in two batches; but this won't happen often if you've got enough quads in the buffer.
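A minimal sketch of that batching loop, assuming a pre-built buffer of MAX_QUADS_PER_BATCH indexed quads (4 vertices and 6 indices per quad) and an existing device; the names are illustrative:

for (UINT first = 0; first < totalQuads; first += MAX_QUADS_PER_BATCH)
{
    UINT count = totalQuads - first;
    if (count > MAX_QUADS_PER_BATCH) count = MAX_QUADS_PER_BATCH;
    // upload the constants for quads [first, first + count) here, then draw them:
    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                                 0,          // BaseVertexIndex
                                 0,          // MinVertexIndex
                                 count * 4,  // NumVertices
                                 0,          // StartIndex
                                 count * 2); // PrimitiveCount (two triangles per quad)
}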

Quote:Also, how fast is sending matrices and stuff for use in a vertex shader?
It's pretty fast. However, if the quads are always the same size and always have the same texture coordinates, then you don't need to send a full matrix; if you only need to send the position on screen, that's just an x/y pair - two float values. Packing in rotations is a little more involved, but still, if this is 2D then you should only need a 2x3 matrix, which you can pack into two registers. As a general rule, the smaller the size of the data you're sending to the vertex shader, the faster it will be to send.
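For instance, a 2D rotation plus translation can be packed into two float4 registers per quad like this (the exact layout, register numbers and variable names are assumptions; the shader just has to agree with them):

// Rows of the 2x3 transform:  | cos -sin  tx |
//                             | sin  cos  ty |
float c = cosf(rotation), s = sinf(rotation);
D3DXVECTOR4 rows[2] =
{
    D3DXVECTOR4(c, -s, tx, 0.0f),
    D3DXVECTOR4(s,  c, ty, 0.0f)
};
device->SetVertexShaderConstantF(4 + quadIndex * 2, (float*)rows, 2); // two registers per quad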

You could also look into the possibility of preserving data from frame to frame - you don't necessarily have to reupload /all/ the params again every time, you might be able to get away with only reuploading those that have changed.

Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse

Does writing my own vertex shader bypass the world, projection and view transforms, or is it just the first two? (I'm not completely sure what the difference between the last two is anyway...)
Quote:Original post by Sync Views
Does writing my own vertex shader bypass the world, projection and view transforms or is it just the first two?
It bypasses all three. It also bypasses the Direct3D lighting engine (D3DLIGHT9, SetLight/LightEnable).

Quote:(I'm not completely sure what the difference between the last two is anyway...)
Take a look at the SDK docs - DirectX Graphics -> Programming Guide -> Getting Started -> Transforms.

Loosely speaking, the view transform is used to account for the camera's position and orientation in the world, while the projection transform is used to simulate the camera's "lens."
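As a concrete illustration only (the specific values are made up), the two matrices are typically built something like this with D3DX:

D3DXMATRIX view, proj;

// View: where the camera sits and what it looks at.
D3DXVECTOR3 eye(0.0f, 0.0f, -10.0f), at(0.0f, 0.0f, 0.0f), up(0.0f, 1.0f, 0.0f);
D3DXMatrixLookAtLH(&view, &eye, &at, &up);

// Projection: the "lens" - field of view, aspect ratio, near/far planes.
D3DXMatrixPerspectiveFovLH(&proj, D3DX_PI / 4, 800.0f / 600.0f, 1.0f, 1000.0f);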

Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse

So I need my shader to:
- Use the appropriate world transform according to the vertex's quad index
- Use the current view transform (so camera xyz, angle)
- Use the current projection transform (perspective?)

Also, all the 2D D3D stuff I've seen has drawn stuff in the order it's wanted.
Is there a reason people do it that way rather than:
- Setting a projection with no perspective
- Drawing quads using the z axis so the graphics card can decide whether things are above or below each other, thus allowing me to draw everything with a single texture in one go (see the sketch below).
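For reference, a minimal sketch of that approach, assuming an 800x600 viewport with the origin at the top-left and an existing IDirect3DDevice9* named device (the values are illustrative):

// A projection with no perspective: map screen pixels straight through,
// with z in 0..1 available for ordering.
D3DXMATRIX ortho;
D3DXMatrixOrthoOffCenterLH(&ortho, 0.0f, 800.0f, 600.0f, 0.0f, 0.0f, 1.0f);
device->SetTransform(D3DTS_PROJECTION, &ortho); // or upload it to the vertex shader

// Let the depth buffer decide what's on top, regardless of draw order.
device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
device->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);
// ...then give each quad a z value between 0 and 1 to control layering.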

