couple questions about HLSL and effect files *solved*

Started by freeworld. 11 comments, last by cNoob 15 years, 10 months ago
After many hours I finally got my renderer to load an effect, use it, and not crash. My main question is about passing data to the shaders and how they know what's in a vertex buffer. I have my vertex declaration set up like so:

D3DVERTEXELEMENT9 dec[4] =
	{
	 {0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION,0},
	 {0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL, 0},
	 {0, 24, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD,0},
	 D3DDECL_END()
	};
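For reference, I create and set the declaration roughly like this (simplified from my renderer, not my exact code; the device pointer is stored the same way as in the rest of my code):

// Rough sketch: create the declaration object once, then set it before
// drawing so D3D knows how the vertex buffer is laid out.
IDirect3DVertexDeclaration9* vertexDecl = NULL;
(*m_d3ddev)->CreateVertexDeclaration(dec, &vertexDecl);

// ... later, before drawing:
(*m_d3ddev)->SetVertexDeclaration(vertexDecl);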


and this is the base effect file that I'm starting out with:


struct VS_OUTPUT
{
    float4 Pos : POSITION; 
};

struct PS_OUTPUT
{
    float4 Color : COLOR0;
};

VS_OUTPUT VShader(float4 Pos : POSITION)
{
    VS_OUTPUT Out;
    Out.Pos = Pos;
    return Out;
}

PS_OUTPUT PShader(float Color : COLOR0)
{
    PS_OUTPUT Out;
    Out.Color = Color;
    return Out;
}

technique RenderScreen
{
    pass P0
    {
        vertexshader = compile vs_2_0 VShader();
        pixelshader = compile ps_2_0 PShader();
    }
}
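For context, I load the effect and select the technique roughly like this (paraphrased, error handling mostly stripped, and the .fx file name is just a placeholder):

// Paraphrased effect loading - "base.fx" is a placeholder for my actual file.
ID3DXEffect*  effect = NULL;
LPD3DXBUFFER  errors = NULL;

D3DXCreateEffectFromFile(*m_d3ddev, TEXT("base.fx"), NULL, NULL, 0,
                         NULL, &effect, &errors);

if (errors)  // dump compile errors if the .fx fails to build
    OutputDebugStringA((char*)errors->GetBufferPointer());

effect->SetTechnique("RenderScreen");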


Question 1: The function arguments, are they automatically read from the vertex buffer, as long as they match up with my vertex declaration?

Question 2: Even though I have position, normal and UV coords in my declaration, do I have to have the same arguments in the function calls, or can I just have the position and UV coords and everything will be hunky dory?

Question 3: Looking at examples of shaders, I always see the pixel shader with the color argument like I have it... Is the color automatically passed to it by DirectX, or is my use of it worthless as is?

Question 4: What exactly are samplers? Are they pointers to a Texture/Buffer/Surface (in layman's terms, an image)?

I'm sure I'll have more questions later. Thank you guys for all your help so far.

[Edited by - freeworld on June 6, 2008 9:41:38 PM]
OK, a little more work, and I'm getting this weird stretching of the texture. I'm assuming I'm not reading the UV coordinates properly, as it looks like it's just taking the far-left pixels and drawing them all the way across.

This is an image of what I'm talking about: [screenshot of the stretched texture]

and here are the shaders I'm using:
float4x4 WorldViewProjection : WORLDVIEWPROJECTION;

struct VS_OUTPUT
{
    float4 Pos : POSITION;
    float4 UV  : TEXCOORD0;
};

struct PS_OUTPUT
{
    float4 Color : COLOR0;
};

sampler2D Texture;

VS_OUTPUT VShader(float4 Pos : POSITION,
                  float4 UV  : TEXCOORD0)
{
    VS_OUTPUT Out;
    Out.Pos = mul(Pos, WorldViewProjection);
    Out.UV  = UV;
    return Out;
}

PS_OUTPUT PShader(float Color : COLOR0,
                  float4 UV  : TEXCOORD0)
{
    PS_OUTPUT Out;
    Out.Color = tex2D(Texture, UV);
    return Out;
}

technique RenderScreen
{
    pass P0
    {
        vertexshader = compile vs_2_0 VShader();
        pixelshader = compile ps_2_0 PShader();
    }
}
In your PShader function header, should Color be of type float4?
I'm looking forward to hearing the answers to these questions as I'm hoping to dive into shaders over the summer.

EDIT: Then again, I doubt that would make much of a difference, as you're not using the Color variable in the pixel shader anyway.
Quote:Original post by cNoob
In your PShader function header, should Color be of type float4?
I'm looking forward to hearing the answers to these questions as I'm hoping to dive into shaders over the summer.

EDIT: Then again, I doubt that would make much of a difference, as you're not using the Color variable in the pixel shader anyway.


Yeah, it should've been a float4... don't know why I never noticed that. Still, I don't use it anymore, so I'll remove it. I removed it from the arguments and it's still the same effect, as I figured it would be.

[ dev journal ]
[ current projects' videos ]
[ Zolo Project ]
I'm not mean, I just like to get to the point.
Can you post the original texture?
I'm not sure how this works, but do you set the Texture in your application's code? If so, I'll ask the obvious: are you setting the texture? ^^
Also, shouldn't you be passing just the x and y values of the UV as the second parameter of tex2D?

Out.Color = tex2D(Texture, UV.xy);
Yes, I'm using effect->SetTexture("Texture", texture) to set the texture. I also tried the UV.xy and it's still the same effect. Give me a minute and I'll post the texture.
[the posted texture image]
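And in case it helps, the draw code goes roughly like this (paraphrased from my renderer, so not exact; the primitive count is just how I remember calculating it):

// Roughly how I set the effect parameters and draw each batch (paraphrased).
effect->SetMatrix("WorldViewProjection", &m_world_view_proj);
effect->SetTexture("Texture", texture);

UINT passes = 0;
effect->Begin(&passes, 0);
for (UINT p = 0; p < passes; ++p)
{
    effect->BeginPass(p);
    (*m_d3ddev)->DrawPrimitive(D3DPT_TRIANGLELIST, 0, batch_size / 3);
    effect->EndPass();
}
effect->End();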
Not sure what's wrong, but I took a look at the texture you're using and I can tell you it's the top row of pixels that's being stretched. So the U coordinate seems to be fine, but the V doesn't seem to change from 0 (resulting in only the first row of pixels being rendered). Hope this is of some help in tracking down what's wrong.
From what I understand, the PixelShader should receive the VertexShader output, so you might need to rewrite the function headers. Also, not sure if this would make a difference, but I'm sure you're aware a UV is only 2 floats, so you could try using float2 instead of float4. I just wrote this, with those changes:

float4x4 WorldViewProjection : WORLDVIEWPROJECTION;

struct VS_OUTPUT
{
    float4 Pos : POSITION;
    float2 UV  : TEXCOORD0;
};

struct PS_OUTPUT
{
    float4 Color : COLOR0;
};

sampler2D Texture;

// I think this would assume your vertex dec holds position and then a uv
VS_OUTPUT VShader(float4 Pos : POSITION, float2 UV : TEXCOORD0)
{
    VS_OUTPUT Out;
    Out.Pos = mul(Pos, WorldViewProjection);
    Out.UV  = UV;
    return Out;
}

PS_OUTPUT PShader(VS_OUTPUT inputFromVS)
{
    PS_OUTPUT Out;
    Out.Color = tex2D(Texture, inputFromVS.UV);
    return Out;
}

technique RenderScreen
{
    pass P0
    {
        vertexshader = compile vs_2_0 VShader();
        pixelshader = compile ps_2_0 PShader();
    }
}


I've never done anything shader-related, so this may or may not work, but it's worth a try. You might want to look at the shader part of Riemer's tutorial. It seems as though a sampler is not a texture; you need both a texture and a sampler.
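From what I can tell from Riemer's tutorial, the pairing looks something like this (completely untested on my end, and the names are just examples):

// Untested - a texture object plus a sampler that reads from it;
// tex2D then takes the sampler, not the texture.
texture Tex;

sampler TexSampler = sampler_state
{
    Texture   = <Tex>;
    magfilter = LINEAR;
    minfilter = LINEAR;
    mipfilter = LINEAR;
};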
Quote:Original post by cNoob
From what I understand, the PixelShader should receive the VertexShader output, so you might need to rewrite the function headers. Also, not sure if this would make a difference, but I'm sure you're aware a UV is only 2 floats, so you could try using float2 instead of float4. I just wrote this, with those changes:

*** Source Snippet Removed ***

I've never done anything shader-related, so this may or may not work, but it's worth a try. You might want to look at the shader part of Riemer's tutorial. It seems as though a sampler is not a texture; you need both a texture and a sampler.


OK, I changed my effect file to this to add in the sampler:
float4x4 WorldViewProjection : WORLDVIEWPROJECTION;

struct VS_OUTPUT
{
    float4 Pos : POSITION;
    float2 UV  : TEXCOORD0;
};

struct PS_OUTPUT
{
    float4 Color : COLOR0;
};

Texture Tex;

sampler coloredtexture = sampler_state
{
    Texture   = <Tex>;
    magfilter = LINEAR;
    minfilter = LINEAR;
    mipfilter = LINEAR;
    AddressU  = mirror;
    AddressV  = mirror;
};

VS_OUTPUT VShader(float4 Pos : POSITION,
                  float2 UV  : TEXCOORD0)
{
    VS_OUTPUT Out;
    Out.Pos = mul(Pos, WorldViewProjection);
    Out.UV  = UV;
    return Out;
}

PS_OUTPUT PShader(float2 UV : TEXCOORD0)
{
    PS_OUTPUT Out;
    Out.Color = tex2D(coloredtexture, UV);
    return Out;
}

technique RenderScreen
{
    pass P0
    {
        vertexshader = compile vs_2_0 VShader();
        pixelshader = compile ps_2_0 PShader();
    }
}


Starting to think it's something other than the shader code, though. My vertex usually consists of position + normals + texcoords. I decided to remove the normals from everything, and now instead of drawing just the top line, it's drawing just the far-left side. I'm pretty sure I'm setting everything right. Maybe it's not reading from the vertex buffer correctly?

This is my declaration
D3DVERTEXELEMENT9 dec[3] =
{
    {0, 0,  D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
    {0, 12, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
    D3DDECL_END()
};


This is how I create my vertex buffer
(*m_d3ddev)->CreateVertexBuffer(sizeof(sZOLOVERTEX) * 6000, NULL, 0,
                                D3DPOOL_MANAGED, &m_vertex_buffer, NULL);


I use an orthographic projection that's set up like this:
D3DXMATRIXA16 world, view, proj;
D3DXMatrixIdentity(&view);
D3DXMatrixOrthoLH(&proj, CGraphics->Width(), CGraphics->Height(), 0, 1);
D3DXMatrixIdentity(&world);
m_world_view_proj = world * view * proj;


I fill the vertex buffer with this
vertices[batch_size].pos     = D3DXVECTOR3(ZRP.x1, ZRP.y1, 0);
vertices[batch_size].uv      = D3DXVECTOR2(m_iterator->sprite->tx1, m_iterator->sprite->ty1);

vertices[batch_size + 1].pos = D3DXVECTOR3(ZRP.x2, ZRP.y2, 0);
vertices[batch_size + 1].uv  = D3DXVECTOR2(m_iterator->sprite->tx2, m_iterator->sprite->ty1);

vertices[batch_size + 2].pos = D3DXVECTOR3(ZRP.x4, ZRP.y4, 0);
vertices[batch_size + 2].uv  = D3DXVECTOR2(m_iterator->sprite->tx2, m_iterator->sprite->ty2);

vertices[batch_size + 3].pos = D3DXVECTOR3(ZRP.x1, ZRP.y1, 0);
vertices[batch_size + 3].uv  = D3DXVECTOR2(m_iterator->sprite->tx1, m_iterator->sprite->ty1);

vertices[batch_size + 4].pos = D3DXVECTOR3(ZRP.x4, ZRP.y4, 0);
vertices[batch_size + 4].uv  = D3DXVECTOR2(m_iterator->sprite->tx2, m_iterator->sprite->ty2);

vertices[batch_size + 5].pos = D3DXVECTOR3(ZRP.x3, ZRP.y3, 0);
vertices[batch_size + 5].uv  = D3DXVECTOR2(m_iterator->sprite->tx1, m_iterator->sprite->ty2);
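And for completeness, my vertex struct and how I bind the buffer look roughly like this (typed from memory, so the exact members may differ slightly):

// Typed from memory - sZOLOVERTEX has to match the declaration byte-for-byte,
// otherwise the texcoords get read from the wrong offsets.
struct sZOLOVERTEX
{
    D3DXVECTOR3 pos;  // offset 0,  D3DDECLUSAGE_POSITION
    D3DXVECTOR2 uv;   // offset 12, D3DDECLUSAGE_TEXCOORD
};

// The stride passed here also has to match the declaration.
(*m_d3ddev)->SetStreamSource(0, m_vertex_buffer, 0, sizeof(sZOLOVERTEX));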


any ideas?

