Rendering sprites with shaders (3D plane problems)

Currently I'm trying to implement sprite drawing using shaders. Right now I'm just using the "sprite" (an IDirect3DTexture9) as the texture for a 3D plane that I attempt to render normally with a shader. I'm not sure if there is an easier method; if there is, I'd be very interested in knowing it. Right now, though, all I see is a black screen, and I'm not sure what the problem is. Here is how I set up my plane and attempt to render it:

struct PlaneVertex 
{
	float x, y, z, w; //vector4
	float u, v;       //for texturing
};

//Create our object's plane to render
static PlaneVertex axPlaneVertices[] =
{
        { 0,              0,             0, 1, 0,0 },
        { obj->m_Size.x,  0,             0, 1, 1,0 },
        { obj->m_Size.x , obj->m_Size.y, 0, 1, 1,1 },
        { 0,              obj->m_Size.y, 0, 1, 0,1 }
};

UINT passes = 0;
shader->Begin(&passes, 0);
for(UINT pass = 0; pass < passes; pass++)
{
	shader->BeginPass(pass);
	m_pDevice->SetFVF(D3DFVF_XYZRHW | D3DFVF_TEX1);
	m_pDevice->DrawPrimitiveUP( D3DPT_TRIANGLEFAN, 
				    2, 
			            axPlaneVertices, 
				    sizeof(PlaneVertex));
	shader->EndPass();
}
shader->End();


Now, assuming I've set the shader up correctly, shouldn't this work the same as rendering an ID3DXMesh with a shader? Also, shouldn't I be able to rotate and scale this, and rotate around the object using a 3D camera? Something has to be horribly wrong, because I can't do any of those things; all I see is black.

Hi,

You forgot to send your texture to the shader. Before the shader->Begin() call, set it with SetTexture() like this:
shader->SetTexture ("SpriteTexture", yourSpriteTexture);

There's no need to declare a vertex shader here; a pixel shader alone can do the job:

uniform extern texture SpriteTexture;

sampler2D smpSprite =
sampler_state
{
texture = <SpriteTexture>;
MinFilter = LINEAR;
MagFilter = LINEAR;
MipFilter = LINEAR;
AddressU = CLAMP;
AddressV = CLAMP;
};

float4 RenderSprite (float2 t: TEXCOORD0) : COLOR0
{
return tex2D (smpSprite, t);
}
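
You'll also need a technique in the effect file so that Begin()/BeginPass() has something to run. Something like this should do (the technique name is just an example):

technique RenderSpriteTech
{
pass P0
{
// No vertex shader needed: pre-transformed (XYZRHW) vertices pass straight through.
VertexShader = NULL;
PixelShader = compile ps_2_0 RenderSprite();
}
}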


Hope this helps.
Rohat.

Well, the texture wasn't the problem. The problem is that I wasn't passing in the normals, so my diffuse lighting was screwed up. Thanks for making me recheck my shader!

Now my problem is that my transformation matrices and camera don't actually do anything while rendering the plane. Do I need to do anything special so that the vertices of my plane are transformed correctly by the vertex shader?

Here is my current texture shader:


float4x4 WorldViewProjection;

texture DiffuseTex;

sampler DiffuseSampler = sampler_state
{
texture = <DiffuseTex>;
AddressU = WRAP;
AddressV = WRAP;
AddressW = WRAP;
MIPFILTER = LINEAR;
MINFILTER = LINEAR;
MAGFILTER = LINEAR;
};

struct VS_IN
{
float4 pos : POSITION;
float2 tex : TEXCOORD0;
};

struct PS_IN
{
float4 pos : POSITION;
float2 tex : TEXCOORD0;
};

PS_IN TextureVertexShader(VS_IN input)
{
PS_IN output;

output.pos = mul(input.pos, WorldViewProjection);
output.tex = input.tex;

return output;
}

float4 TexturePixelShader(PS_IN input): COLOR
{
return tex2D(DiffuseSampler, input.tex);
}

technique Technique0
{
pass P0
{
VertexShader = compile vs_2_0 TextureVertexShader();
PixelShader = compile ps_2_0 TexturePixelShader();
}
}
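
My assumption is that on the C++ side I just need to feed WorldViewProjection to the effect every frame before Begin(). Something like this (not my exact code, just a sketch; world, view, and proj come from my object and camera classes):

// Build world * view * projection and hand it to the effect.
D3DXMATRIX wvp = world * view * proj;
shader->SetMatrix("WorldViewProjection", &wvp);
// ...then Begin()/BeginPass()/DrawPrimitiveUP()/EndPass()/End() as before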



Another thing I noticed: when I adjust the z values of the plane, it cuts off the texture in a weird fashion instead of tilting the plane and still rendering it as a 3D object.

If you're using the D3DFVF_XYZRHW flag, you don't need to do any transformation, which means you don't need to write a vertex shader.

Lighting is not important when rendering full-screen quads such as sprites, so you don't need vertex normals.
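
In other words, with pre-transformed (XYZRHW) vertices the x/y values are already screen-space pixel coordinates and the fourth float is rhw (usually just 1.0). For example, a 256x256 sprite in the top-left corner of the screen is simply (the size is only an example):

static PlaneVertex quad[] =
{
        // x       y      z     rhw   u     v
        {   0.0f,   0.0f, 0.0f, 1.0f, 0.0f, 0.0f },
        { 256.0f,   0.0f, 0.0f, 1.0f, 1.0f, 0.0f },
        { 256.0f, 256.0f, 0.0f, 1.0f, 1.0f, 1.0f },
        {   0.0f, 256.0f, 0.0f, 1.0f, 0.0f, 1.0f },
};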

Quote:
Original post by programci_84
If you're using the D3DFVF_XYZRHW flag, you don't need to do any transformation, which means you don't need to write a vertex shader.

Lighting is not important when rendering full-screen quads such as sprites, so you don't need vertex normals.


Thanks! I've fixed things up a bit and changed over to D3DFVF_XYZ so that I can use a vertex shader. What I want to do now is add normals, tangents, and binormals to my vertex struct so that I can do Dot3 per-pixel bump mapping. Also, at some point I think I'll need to implement alpha blending so that the parts of the quad without any color are see-through.
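
(For the alpha blending part I'm assuming the standard fixed-function blend states will be enough once the texture has an alpha channel. Something like this, untested:)

// Blend the quad over the background using the texture's alpha.
m_pDevice->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);
m_pDevice->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
m_pDevice->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);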

This might seem like overkill, but I want to attempt to do 3D lighting on 2D objects. I don't know of any way to do that with ID3DXSprite, so I'm shoving all the sprite stuff into textures, putting them on a quad, and applying vertex/pixel shaders to it.

I think I can manage the calculation of the normal, tangent, and binormal on my own, but how should I set up the FVF for the vertex? What I want is something like:


struct Vertex
{
Vector3 WPos;
Vector3 Norm;
Vector3 Tang;
Vector3 Binom;
float u, v;

enum FVF
{
FVF_Flags = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1 //This is wrong!
};
};



But I don't think the FVF flags have any support for the tangent and binormal. All I can find is D3DFVF_XYZ, D3DFVF_NORMAL, and D3DFVF_TEX1. Any idea on how to do this?

If you're using shaders, you almost certainly want to use vertex declarations, which do the same job as FVF codes, but are more flexible.

See D3DVERTEXELEMENT9, IDirect3DDevice9::CreateVertexDeclaration, and the SDK samples for example usage.

EDIT: For your vertex struct you'd want:

D3DVERTEXELEMENT9 decl[] = {
{0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
{0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL, 0},
{0, 24, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TANGENT, 0},
{0, 36, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_BINORMAL, 0},
{0, 48, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
D3DDECL_END()
};
IDirect3DVertexDeclaration9* pDecl = NULL;
HRESULT hResult = pDevice->CreateVertexDeclaration(decl, &pDecl);

(Untested, off the top of my head).
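
Then, before drawing, you set the declaration on the device instead of calling SetFVF (and remember to Release() it on shutdown):

m_pDevice->SetVertexDeclaration(pDecl); // replaces the SetFVF() call in your render loop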

Quote:
Original post by Evil Steve
....


This stuff looks promising. I would be using this in combination with an LPDIRECT3DVERTEXBUFFER9, correct? So I can just use the same struct I posted above (minus the FVF enum), build an array of those elsewhere, copy it into the LPDIRECT3DVERTEXBUFFER9, and then render the same way I am currently, right?

I'll have to test this out.

Quote:
Original post by eDuDe
This stuff looks promising. I would be using this in combination with an LPDIRECT3DVERTEXBUFFER9, correct? So I can just use the same struct I posted above (minus the FVF enum), build an array of those elsewhere, copy it into the LPDIRECT3DVERTEXBUFFER9, and then render the same way I am currently, right?
Yup, exactly. Vertex declarations are the replacement for FVF codes. FVF codes don't exist in D3D10 at all, so learning vertex declarations will help the transition if you choose to look into D3D10.

Just to clarify: when using CreateVertexBuffer(), I just pass NULL instead of an FVF, right? I've only used LPDIRECT3DVERTEXBUFFER9 with FVFs before (for point sprites).

So it would be something like:


m_pDevice->CreateVertexBuffer(
sizeof(decl)*4,
D3DUSAGE_DYNAMIC | D3DUSAGE_WRITEONLY | D3DUSAGE_POINTS ,
NULL, //instead of an FVF
D3DPOOL_DEFAULT,
&m_VertexBuffer,
NULL);



I'd also like to confirm that I can even use sizeof(decl)*4 for the length of the buffer; it seems logical that I should. Or should I use sizeof(pDecl) (from your example)?

Quote:
Original post by eDuDe
Just to clarify: when using CreateVertexBuffer(), I just pass NULL instead of an FVF, right? I've only used LPDIRECT3DVERTEXBUFFER9 with FVFs before (for point sprites).
Correct, you should use 0 or NULL for the FVF.

Quote:
Original post by eDuDe
I'd also like to confirm that I can even use sizeof(decl)*4 for the length of the buffer; it seems logical that I should. Or should I use sizeof(pDecl) (from your example)?
Nope, the size is still the size of the vertex struct, not the vertex declaration. The vertex declaration just tells D3D how to interpret the vertex stream, so you still want sizeof(Vertex)*4.
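
Roughly, the whole thing would look like this (untested again; the usage/pool flags are just one sensible choice for a quad that rarely changes, and 'vertices' is your array of 4 Vertex structs):

// FVF of 0 because a vertex declaration is used instead.
m_pDevice->CreateVertexBuffer(sizeof(Vertex) * 4, D3DUSAGE_WRITEONLY, 0,
                              D3DPOOL_MANAGED, &m_VertexBuffer, NULL);

// Copy the vertex data in.
void* pData = NULL;
m_VertexBuffer->Lock(0, 0, &pData, 0);
memcpy(pData, vertices, sizeof(Vertex) * 4);
m_VertexBuffer->Unlock();

// When rendering (inside BeginPass()/EndPass()):
m_pDevice->SetVertexDeclaration(pDecl);
m_pDevice->SetStreamSource(0, m_VertexBuffer, 0, sizeof(Vertex));
m_pDevice->DrawPrimitive(D3DPT_TRIANGLEFAN, 0, 2);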

Awesome. Time to get this stuff working and see if I run into any other problems. I wish I could high-five you through the internet.
