zhugel_007

Members
  • Content count: 15
  • Community Reputation: 126 Neutral

About zhugel_007
  • Rank: Member
  1. Is it possible to render the stencil value into a texture and read the value back later from that texture?
  2. Normal map question

    If I understand correctly, world/object-space normal maps use three channels while a tangent-space normal map only needs two? If there is not much performance penalty in using two channels, as Matt said, what is the benefit of using a world/object-space normal map?
  3. Normal map question

    That makes sense. Thanks for all your answers! :)
  4. Hi all, I have a quick question about normal maps. As I know, a normal map (not a bump map) uses three channels; the red and green channels are for the binormal and tangent. What is the blue channel for? The normal could be calculated from the binormal and tangent, right?
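As the replies in this thread point out, the blue channel holds the normal's z component; for a tangent-space map it can be dropped and rebuilt from the unit-length constraint x² + y² + z² = 1 (tangent-space z always points away from the surface, so z ≥ 0). A minimal Python sketch of that math, not shader code from the thread — the [0,1] → [-1,1] unpacking matches the usual normal-map encoding:

```python
import math

def decode_normal_rgb(r, g, b):
    """Unpack an RGB-encoded normal from [0,1] storage to [-1,1]."""
    return (2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0)

def reconstruct_z(r, g):
    """Rebuild the full normal from the red/green channels alone,
    using unit length: z = sqrt(1 - x^2 - y^2), clamped for safety."""
    x = 2.0 * r - 1.0
    y = 2.0 * g - 1.0
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)
```

For example, a normal (0.6, 0.0, 0.8) stored as r = 0.8, g = 0.5 reconstructs to the same vector without ever reading the blue channel.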
  5. volumetric light

    Thanks a lot Krokhin! :)
  6. volumetric light

    Hi Krokhin, your demo looks great! Could you please explain more about the use of the cubemap? I am still confused about how you use it. Thanks a lot!
  7. volumetric light

    Hi, I would like to implement volumetric light. I have found the following methods:
    1. Shadow map + sampling planes (http://ati.amd.com/developer/gdc/mitchell_lightshafts.pdf, http://nis-ei.eng.hokudai.ac.jp/~doba/papers/pg00.pdf, ShaderX3)
    2. Ray-tracing based (Nvidia SDK 10.5)
    3. Image-process based (GPU Gems 3, ShaderX6)
    But they all have limits:
    1. When you go close to the light volume, you can see the sampling planes and the framerate drops quickly.
    2. Slow at large scales.
    3. The light beam disappears if the light goes off screen.
    It seems that only 3 has been used in Far Cry 2 and Crysis (only in open areas); the other methods are not used in any game as far as I know. I might be wrong. :p Is there any other way to do volumetric light? Any suggestion would be appreciated. :)
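Methods 1 and 2 above both reduce to the same core loop: step along the view ray, test each sample against the light's shadow information, and accumulate in-scattered light where the sample is lit. A hedged Python sketch of that accumulation — `in_shadow` is a hypothetical placeholder for a shadow-map lookup, and the constant-density scattering term is a deliberate simplification:

```python
def march_light_shaft(ray_origin, ray_dir, ray_length, in_shadow,
                      num_samples=64, scatter=0.02):
    """Accumulate in-scattered light along a view ray.

    in_shadow(p) stands in for a shadow-map lookup: True if point p is
    occluded from the light. Returns the accumulated scattering term.
    """
    step = ray_length / num_samples
    accum = 0.0
    for i in range(num_samples):
        t = (i + 0.5) * step  # sample at segment midpoints
        p = tuple(o + d * t for o, d in zip(ray_origin, ray_dir))
        if not in_shadow(p):
            accum += scatter * step  # constant-density in-scattering
    return accum
```

A fully lit ray accumulates scatter × ray_length; a fully shadowed one contributes nothing, which is what produces the visible shaft boundaries. The sampling-plane artifacts the post describes come from using too few samples (planes) near the camera.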
  8. Thanks Jason for your help. :) I was actually using RenderMonkey to test a shader. I will try to find out where the problem is. Thanks again!
  9. Thanks Jason for your answer. But it doesn't work. :( I guess maybe it is because the result is too small to be saved in the 8-bit texture?
    OutColor.x = frac( DepthValue / scalar );
    OutColor.y = frac( DepthValue / (scalar * scalar) );
    OutColor.z = frac( DepthValue / (scalar * scalar * scalar) );
  10. Hi Jason, I have tried your method, but it doesn't seem to work. Here is the shader to encode the depth value:
    float scalar = 32.0f;
    float3 OutColor;
    OutColor.x = frac( DepthValue / scalar ) * scalar;
    OutColor.y = frac( DepthValue / (scalar * scalar) ) * scalar;
    OutColor.z = frac( DepthValue / (scalar * scalar * scalar) ) * scalar;
    Here is the shader to get the value back:
    float4 f = tex2D(Tex, In.texCoord);
    float scalar = 32.0f;
    float depth = f.x * scalar + f.y * scalar * scalar + f.z * scalar * scalar * scalar;
    Did I do something wrong? What confuses me is the floating-point value in the shader. Will the bit shifting work for a floating-point value? Thanks!
  11. Hi Jason Z, thanks for your reply. Could you please explain more about how to get the floating-point value back, and the theory behind this method? Thank you very much!
  12. Hi, I would like to save a depth value into a texture. Since floating-point textures don't support alpha blending on some graphics cards, I am planning to write the depth value into a regular 32-bit RGBA texture. Since one channel only has 8 bits, which is too few, I want to use the lower N bits of each channel (RGB, not A; I need A for blending) to encode the depth value (N < 8). For example, I could use 5 bits from each channel to store a 15-bit depth value. I've tried some methods, but none of them seems to be correct. I was wondering what the correct way to do this would be? Thanks a lot!
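The packing being attempted in this thread can be checked on the CPU: treat the depth in [0,1) as a base-B number and store one digit per channel, with B = 2^N levels for N bits. A Python sketch using 5 bits per channel as in the post — illustrative arithmetic, not the HLSL from the thread:

```python
import math

BITS = 5
B = 1 << BITS  # 32 levels per channel, 15 bits total

def pack_depth(d):
    """Split d in [0,1) into three base-B digits (one per color channel)."""
    r = math.floor(d * B) % B
    g = math.floor(d * B * B) % B
    b = math.floor(d * B * B * B) % B
    return r, g, b

def unpack_depth(r, g, b):
    """Recombine the digits: d is approximately r/B + g/B^2 + b/B^3."""
    return r / B + g / (B * B) + b / (B * B * B)
```

Round-tripping any depth recovers it to within 1/B³ (about 3×10⁻⁵ here), i.e. full 15-bit precision. The frac-based shader version in the posts above is the same idea, except that each channel must keep only its own digit (divide by B, not multiply) before being written out.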
  13. calculate depth value in vertex shader

    Thanks for all your replies. Jason is right, I need the linear distance. I've written the shader in a different way and it seems to work correctly.
    float4x4 matViewProjection;
    float near;
    float far;

    struct VS_INPUT
    {
       float4 Position : POSITION0;
    };

    struct VS_OUTPUT
    {
       float4 Position : POSITION0;
       float Depth : TEXCOORD0;
    };

    VS_OUTPUT vs_main( VS_INPUT Input )
    {
       VS_OUTPUT Output;
       Output.Position = mul( Input.Position, matViewProjection );
       // Render the depth value into a texture.
       // The w value is the real z after the projection;
       // here we adjust it into the range [0,1].
       Output.Depth = (Output.Position.w - near) / (far - near);
       return Output;
    }

    float4 ps_main( float Depth : TEXCOORD0 ) : COLOR0
    {
       return Depth;
    }
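The remap in that shader can be sanity-checked numerically: after a standard perspective projection, w holds the view-space distance, so (w − near)/(far − near) is 0 at the near plane, 1 at the far plane, and linear in between. A quick Python check with arbitrary example plane distances:

```python
def linear_depth(view_z, near, far):
    """Remap view-space depth (the post-projection w) linearly into [0, 1]."""
    return (view_z - near) / (far - near)
```

The midpoint of the view range maps to exactly 0.5, which is what makes this encoding suitable for fog-thickness calculations, unlike the hyperbolic z/w.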
  14. calculate depth value in vertex shader

    Thanks MJP and akm_masuduzzaman for your quick replies! :) I've tried your shader, but it still seems to have the same problem. :( I was wondering why I should do the perspective divide after interpolation? Could you please explain more? And I am a bit confused: if the object is close to the camera, the z value should be almost 0, so the color in the depth texture should be almost black, right? Thanks!
  15. Hi, I am currently working on volumetric fog. I need to render the scene depth into a texture so that I can use it to calculate the thickness of the fog, but I've run into a problem rendering the depth value into the texture. Here are the vertex and pixel shaders:
    float4x4 matViewProjection;

    struct VS_INPUT
    {
       float4 Position : POSITION0;
    };

    struct VS_OUTPUT
    {
       float4 Position : POSITION0;
       float Depth : TEXCOORD0;
    };

    VS_OUTPUT vs_main( VS_INPUT Input )
    {
       VS_OUTPUT Output;
       Output.Position = mul( Input.Position, matViewProjection );
       Output.Depth = Output.Position.z / Output.Position.w;
       return Output;
    }

    float4 ps_main( float Depth : TEXCOORD0 ) : COLOR0
    {
       return Depth;
    }
    But when I check the texture, it is almost purely white. :( I was thinking the depth value should be in the range [0,1], so at least the texture shouldn't be that white... Did I do anything wrong? Can someone give me a hint? Thanks in advance. :)
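The white texture in this post can be reproduced on the CPU. For a standard D3D-style perspective projection, z_clip = z·far/(far−near) − near·far/(far−near) and w_clip = z, so z/w = far/(far−near) · (1 − near/z), which climbs toward 1 very quickly with distance; most of the scene therefore stores values near 1 and renders near-white. A Python illustration with example near/far planes (the plane values are arbitrary):

```python
def ndc_depth(view_z, near, far):
    """Post-projection z/w for a D3D-style perspective projection:
    z/w = far/(far - near) * (1 - near/view_z)."""
    return (far / (far - near)) * (1.0 - near / view_z)
```

With near = 0.1 and far = 100, an object only 1 unit away (1% of the view range) already writes a depth above 0.9 — which is exactly why the texture looks almost purely white, and why the later posts in this thread switch to a linear (w − near)/(far − near) encoding.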