Slim_d0g

Problem with shader from RenderMonkey (C#)


Hi everybody! I'm trying to make a sky with clouds in my game. I took a shader from RenderMonkey (BlueSky from Atmospheric.rfx) and tried to use it, but it works incorrectly! To demonstrate the problem I've applied the shader to a cube. Here is a screenshot of this shader in RenderMonkey (how it should look on the cube) and in my engine (how it actually looks). Please help me find where the problem is! Here's the shader:
//--------------------------------------------------------------//
// BlueSky
//--------------------------------------------------------------//
//--------------------------------------------------------------//
// Sky
//--------------------------------------------------------------//

float4x4 view_proj_matrix : ViewProjection;
float skyZBias = float( -0.2 );
float4 view_position : ViewPosition;
float skyScale = float( 1260.20 );
float scale = float( 0.09 );

float4 cloudColor = float4( 1.00, 1.00, 1.00, 1.00 );
float4 skyColor = float4( 0.00, 0.50, 1.00, 1.00 );
float noiseScale = float( 2.30 );
float noiseBias = float( -0.60 );
float4 lightDir = float4( 0.92, 0.28, 0.27, 0.00 );
float4 sunColor = float4( 1.00, 0.88, 0.12, 1.00 );
float sunFallOff = float( 373.75 );
float sunSharpness = float( 1.02 );
float cloudSpeed = float( 0.05 );
float noiseSpeed = float( 0.12 );
float time_0_X : Time0_X;

texture Noise_Tex;
sampler Noise = sampler_state
{
   Texture = (Noise_Tex);
   ADDRESSU = WRAP;
   ADDRESSV = WRAP;
   ADDRESSW = WRAP;
   MAGFILTER = LINEAR;
   MINFILTER = LINEAR;
   MIPFILTER = LINEAR;
};

struct VS_OUTPUT {
   float4 Pos: POSITION;
   float3 texCoord: TEXCOORD0;
};

VS_OUTPUT Atmospheric_Effects_BlueSky_Sky_Vertex_Shader_main(float4 Pos: POSITION){
   VS_OUTPUT Out;

   // Get the sky in place
   Pos.z += skyZBias;
   Out.Pos = mul(float4(Pos.xyz * skyScale + view_position, 1),view_proj_matrix );
   // Pass position to the fragment shader
   Out.texCoord = Pos * scale;

   return Out;
}

float4 Atmospheric_Effects_BlueSky_Sky_Pixel_Shader_main(float3 texCoord: TEXCOORD0) : COLOR {
   // Create a sun
   float3 l = lightDir - normalize(texCoord);
   float sun = saturate(sunFallOff * pow(dot(l, l), sunSharpness));
   float4 sky = lerp(sunColor, skyColor, sun);

   // Clouds are basically noise, we just need to scale and bias it.
   texCoord.xy += cloudSpeed * time_0_X;
   texCoord.z  += noiseSpeed * time_0_X;
   float noisy = tex3D(Noise, texCoord).r;

   float lrp = noiseScale * noisy + noiseBias;

   return lerp(cloudColor, sky, saturate(lrp));
}

//--------------------------------------------------------------//
// Technique Section for Effect Workspace.Atmospheric Effects.BlueSky
//--------------------------------------------------------------//
technique BlueSky
{
   pass P0
   {
      ZWRITEENABLE = FALSE;
      CULLMODE = NONE;


      VertexShader = compile vs_1_1 Atmospheric_Effects_BlueSky_Sky_Vertex_Shader_main();
      PixelShader = compile ps_2_0 Atmospheric_Effects_BlueSky_Sky_Pixel_Shader_main();
   }
}
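
In RenderMonkey the uniforms tagged with semantics (ViewProjection, ViewPosition, Time0_X) are filled in automatically; in my engine I have to set them myself every frame before rendering. Roughly like this (just a sketch - the method and variable names are placeholders, not my exact code):

using Microsoft.DirectX;
using Microsoft.DirectX.Direct3D;

// Sketch only: feed the RenderMonkey-style semantics to the effect each frame.
static void UpdateSkyParameters(Effect effect, Device device, Vector3 cameraPosition, float timeInSeconds)
{
    // view_proj_matrix : ViewProjection
    effect.SetValue("view_proj_matrix", device.Transform.View * device.Transform.Projection);
    // view_position : ViewPosition (the shader adds it to the scaled sky vertices)
    effect.SetValue("view_position", new Vector4(cameraPosition.X, cameraPosition.Y, cameraPosition.Z, 1.0f));
    // time_0_X : Time0_X (drives the cloud movement)
    effect.SetValue("time_0_X", timeInSeconds);
}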

Looks somewhat like wrong texture coordinates... I haven't worked with shaders for a long time, but it might help to use a texture with 4 different corners for debugging purposes.

Or try a simple pixel shader:


return tex3D(Noise, texCoord).rgba; // whatever the syntax is :D

I've tried changing the pixel shader like you said, and I got the same thing!
So the problem is in the vertex shader or in my engine, but I can't imagine what's wrong :( The vertex shader only needs the vertex position from the mesh...


I think there's something wrong with this line:
Out.texCoord = Pos * scale;
but it works fine in RenderMonkey... =\

If it works in RenderMonkey, then there's something wrong in your code - or in RenderMonkey's code ^^

You could also try a less complex vertex shader...


VS_OUTPUT Out;

Out.Pos = mul(float4(Pos.xyz * 1000.0f + view_position, 1), view_proj_matrix);

// Or...
// Out.Pos = mul(float4(Pos.xyz * 1000.0f, 1), view_proj_matrix);
// sorry I don't get how the shader works :D just try it


// There might be something wrong with your matrix, maybe it needs the other multiplication order?!
// Out.Pos = mul(view_proj_matrix, float4(Pos.xyz * 1000.0f + view_position, 1));


Out.texCoord.xyz = Pos.xyz;

return Out;

Are these parameters the same in RenderMonkey?

texture Noise_Tex;
sampler Noise = sampler_state
{
   Texture = (Noise_Tex);
   ADDRESSU = WRAP;
   ADDRESSV = WRAP;
   ADDRESSW = WRAP;
   MAGFILTER = LINEAR;
   MINFILTER = LINEAR;
   MIPFILTER = LINEAR;
};

In your engine's picture it looks like the texture wraps, but in RenderMonkey it doesn't.
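
The sampler_state block in the .fx should already apply those states when the pass begins, but to rule out the engine forcing something else you could also set the equivalent states on the device yourself. Just a sketch, and it assumes the Noise sampler ends up in stage 0:

using Microsoft.DirectX.Direct3D;

// Sketch: mirror the sampler_state block from the effect on the device
// (only useful if the Noise sampler really is bound to stage 0).
static void ApplyNoiseSamplerStates(Device device)
{
    device.SamplerState[0].AddressU = TextureAddress.Wrap;
    device.SamplerState[0].AddressV = TextureAddress.Wrap;
    device.SamplerState[0].AddressW = TextureAddress.Wrap;
    device.SamplerState[0].MagFilter = TextureFilter.Linear;
    device.SamplerState[0].MinFilter = TextureFilter.Linear;
    device.SamplerState[0].MipFilter = TextureFilter.Linear;
}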

Also, how are you rendering the cube in your engine? Using an index buffer with only 8 vertices? Then you could get that "stretched" result because the "walls" of the cube share UV coordinates with the "ceiling" and don't get unique UV coordinates.

AndiDog
I tried your vertex shader, but got the same thing :(
The problem seems to be in this line:
Out.texCoord = Pos * scale;
because the other lines of the vertex shader just put the model in the right place with the right scaling, and that part works fine.

Keba
I took the sampler from RenderMonkey... I also tried changing it, but it made no difference.
I render the cube without indices.


Here's my vertex declaration and mesh rendering:



private struct Vertex
{
    public float x, y, z;
    public float tu, tv, tw;
};


VertexElement[] velements = new VertexElement[]
{
    new VertexElement(0, 0, DeclarationType.Float3, DeclarationMethod.Default,
        DeclarationUsage.Position, 0),
    new VertexElement(0, 12, DeclarationType.Float3, DeclarationMethod.Default,
        DeclarationUsage.TextureCoordinate, 0),
    VertexElement.VertexDeclarationEnd
};
vd = new VertexDeclaration(GEngine.device, velements);
mesh = mesh.Clone(MeshFlags.Managed, velements, GEngine.device);


public virtual void DrawMesh()
{
    if (visible == true)
    {
        GEngine.device.Transform.World = GEngine.Camera.matWorld =
            Matrix.RotationYawPitchRoll(this.yaw, this.pitch, this.roll) *
            Matrix.Scaling(this.xscale, this.yscale, this.zscale) *
            Matrix.Translation(this.x, this.y, this.z);

        for (int i = 0; i < this.meshMaterials.Length; i++)
        {
            if (this.meshMaterials != null)
                GEngine.device.Material = this.meshMaterials[i];
            if (this.meshTextures != null)
                GEngine.device.SetTexture(0, this.meshTextures[i]);

            if ((effect != null) && (doEffect))
            {
                SetupEffect(i);
                int numPasses = effect.Begin(0);
                for (int iPass = 0; iPass < numPasses; iPass++)
                {
                    effect.BeginPass(iPass);
                    this.mesh.DrawSubset(i);
                    effect.EndPass();
                }
                effect.End();
            }
            else
            {
                this.mesh.DrawSubset(i);
            }
        }
    }
}



p.s. I took the model from RenderMonkey...

Sorry for the late answer.

I tried it out myself and got the same result as you. As the texture I used a DDS file with mipmap levels down to 1x1, with the top level manipulated so I could see a difference, because I hoped it would end up as a correct volume texture.

Anyway, the changes I made to the top level are displayed on the cube's sides, i.e. only the top level of my texture is used - even if I manually set the third texture coordinate to another value. So this might be the problem: a wrongly created 3D texture... I must admit I don't know how to use volume textures.
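
If that is the cause, the fix would be to load the noise texture as a real volume texture and hand that to the effect. Something along these lines might work in Managed DirectX (the file name is only a placeholder, and I haven't verified this myself):

using Microsoft.DirectX.Direct3D;

// Sketch: tex3D needs an actual volume (3D) texture bound to Noise_Tex.
// If a plain 2D texture ends up there, the third texture coordinate does nothing,
// which would match the behaviour described above.
static void BindNoiseVolume(Device device, Effect effect)
{
    // "NoiseVolume.dds" is a placeholder - use whatever volume DDS you have.
    VolumeTexture noise = TextureLoader.FromVolumeFile(device, "NoiseVolume.dds");
    effect.SetValue("Noise_Tex", noise);
}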
