GrayScale

Members

  • Content count: 19
  • Joined
  • Last visited

Community Reputation

948 Good

About GrayScale

  • Rank: Member
  1. A* Pathfinding for Beginners

  2. I switched to the 32-bit depth buffer; that did not work. After further investigation I discovered that there's z-fighting (depth fighting) going on. Some of the color from the other side of the cube is winning the depth test somehow. I've tried some of the more basic methods for resolving the problem, like adjusting the far and near planes. That did not work. So far only changing the perspective's field of view from D3DX_PI/4.f to D3DX_PI/3.f works, but that alters the shape of the model a bit too much. Any solutions? I'm going to try enabling backface culling next and adjusting the model's winding order.

Edit: Okay, so applying backface culling was the solution. Thanks for the help, Phil T.

Can anyone answer the second problem, about the model not rendering when the camera sits directly above it, along the y-axis?
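For reference, enabling backface culling in D3D9 is a single render-state change; which cull mode is correct depends on the model's winding order. A minimal sketch, assuming the usual left-handed setup where front faces are wound clockwise (the thread's graphics->device() accessor is reused here):

    // Cull triangles that appear counter-clockwise on screen
    // (this is also the D3D9 default cull mode).
    graphics->device()->SetRenderState( D3DRS_CULLMODE, D3DCULL_CCW );

    // If the model is wound the other way, cull clockwise faces instead:
    // graphics->device()->SetRenderState( D3DRS_CULLMODE, D3DCULL_CW );

    // D3DCULL_NONE disables culling entirely, which is handy while debugging
    // to confirm which winding the model actually uses.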
  3. The surface formats are all four 8-bit channels:

    if( FAILED( graphics->device()->CreateTexture( w, h, 1, D3DUSAGE_RENDERTARGET,
        D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_colorRT, 0 ) ) )
        return false;

    if( FAILED( graphics->device()->CreateTexture( w, h, 1, D3DUSAGE_RENDERTARGET,
        D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_normalRT, 0 ) ) )
        return false;

    if( FAILED( graphics->device()->CreateTexture( w, h, 1, D3DUSAGE_RENDERTARGET,
        D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_depthRT, 0 ) ) )
        return false;

    if( FAILED( graphics->device()->CreateTexture( w, h, 1, D3DUSAGE_RENDERTARGET,
        D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &g_lightRT, 0 ) ) )
        return false;

I've also tried four 16-bit channels, D3DFMT_A16B16G16R16, but that same line is still there. Here is the artifact again, carrying over to the light shader. The light is on one side of the model and the camera is on the opposite side, pointing back at the model, so the screen should be completely black.

I've tried using PIX but only managed to get it working once. I'll have another look at it.
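One thing worth checking here (an assumption, not something confirmed in the thread): writing depth into an 8-bit channel of an A8R8G8B8 target loses a great deal of precision by the time world position is reconstructed in the lighting pass, so the depth target is commonly given its own high-precision single-channel format. A minimal sketch, assuming the hardware supports D3DFMT_R32F render targets (the CheckDeviceFormat call below uses assumed accessor/variable names such as graphics->d3d() and displayMode):

    // Give depth a dedicated 32-bit float channel; color/normal can stay 8-bit.
    if( FAILED( graphics->device()->CreateTexture( w, h, 1, D3DUSAGE_RENDERTARGET,
        D3DFMT_R32F, D3DPOOL_DEFAULT, &g_depthRT, 0 ) ) )
        return false;

    // Optionally verify support first (assumed accessors):
    // graphics->d3d()->CheckDeviceFormat( D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
    //     displayMode.Format, D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_R32F );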
  4. Phil T, thanks for the reply.

The image above is a snapshot of the model rotating about the y-axis, sitting at the origin. Here's the setup code:

    D3DXMATRIX origin, world, view, projection, worldViewProjection, inverseView, rotationX, rotationY;

    D3DXMatrixIdentity( &world );
    D3DXMatrixIdentity( &origin );
    D3DXMatrixTranslation( &world, 0, 0, 0 );
    D3DXMatrixRotationX( &rotationX, SMYUTI_DegreeToRadian( 0.f ) );
    D3DXMatrixRotationY( &rotationY, SMYUTI_DegreeToRadian( ang += ( 50.f/fps ) ) );
    D3DXMatrixPerspectiveFovLH( &projection, D3DX_PI/4.f, float(SCREEN_WIDTH)/float(SCREEN_HEIGHT), 1.f, 500.f );
    D3DXMatrixLookAtLH( &view, &CameraPosition, &D3DXVECTOR3(0,0,0), &D3DXVECTOR3(0,1,0) );
    D3DXMatrixInverse( &inverseView, 0, &(view*projection) );

    world               = origin * rotationX * rotationY * world;
    worldViewProjection = world * view * projection;

Also, when you say point sampling, you mean this, right?

    dev->SetSamplerState( 0, D3DSAMP_MINFILTER, D3DTEXF_POINT );
    dev->SetSamplerState( 0, D3DSAMP_MAGFILTER, D3DTEXF_POINT );
    dev->SetSamplerState( 0, D3DSAMP_MIPFILTER, D3DTEXF_POINT );
    dev->SetSamplerState( 0, D3DSAMP_ADDRESSU, D3DTADDRESS_CLAMP );
    dev->SetSamplerState( 0, D3DSAMP_ADDRESSV, D3DTADDRESS_CLAMP );
    dev->SetRenderState( D3DRS_MULTISAMPLEANTIALIAS, FALSE );

Everything in the g-buffer looks fine except the normal buffer; I know for a fact this is where the problem is. So I too am assuming the issue is with the way I store the normal data. I read somewhere that you could store the normals in view space to further reduce the artifacts. I tried this and failed. Would you know how to, and if so, care to elaborate?
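On the camera-directly-above problem: a likely cause, though not confirmed anywhere in the thread, is the D3DXMatrixLookAtLH call above. When the camera sits on the y-axis looking at the origin, the view direction becomes parallel to the up vector (0,1,0), the cross products inside the look-at construction collapse to zero, and the view matrix degenerates, so nothing is drawn. A minimal sketch of a workaround, reusing the CameraPosition and view variables from the code above:

    // Pick an up vector that is never parallel to the view direction.
    D3DXVECTOR3 target( 0.f, 0.f, 0.f );
    D3DXVECTOR3 viewDir = target - CameraPosition;
    D3DXVec3Normalize( &viewDir, &viewDir );

    // When looking almost straight up or down, switch "up" from +Y to +Z.
    D3DXVECTOR3 up = ( fabsf( viewDir.y ) > 0.99f ) ? D3DXVECTOR3( 0.f, 0.f, 1.f )
                                                    : D3DXVECTOR3( 0.f, 1.f, 0.f );

    D3DXMatrixLookAtLH( &view, &CameraPosition, &target, &up );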
  5. Hi,

For about a week now I've been learning deferred shading. Things were running smoothly until I ran into a couple of snags. While implementing my shaders (shader model 3.0) I have been getting strange lighting artifacts, and I managed to narrow them down to the normal map/buffer of the g-buffer: if you look along the left edge of the model you'll see the artifact (the orangish color). I've noticed that this is a common problem with deferred shaders and haven't managed to find any way to resolve the issue. I've tried disabling multisampling/anti-aliasing, adjusting the filters to none, point, linear, and anisotropic, and adjusting the pixel coordinates to match the texel offset ( -= .5/screenWidth, -= .5/screenHeight ). Different techniques only minimize the artifacts.

So my first question is: how do you combat lighting artifacts like this?

Another problem I have is that when the camera sits above or below the model, along the y-axis, the model is not rendered. Why is this? If I adjust the camera's x or y value by the slightest amount, say .0001f, then the model is rendered. Is this another downside of deferred shading?

Shader code:

[g-buffer]

    ///////////////////////////////////////////
    //  G L O B A L   V A R I A B L E S
    ///////////////////////////////////////////
    float4x4    gWorld;
    float4x4    gWorldViewProjection;
    float       gSpecularIntensity;
    float       gSpecularPower;
    sampler2D   gColorMap;

    ///////////////////////////////////////////
    vsOutput vsDeferredShaderGeometryBuffer( vsInput IN )
    {
        vsOutput OUT    = (vsOutput)0;
        OUT.position    = mul( float4(IN.position, 1.f), gWorldViewProjection );
        OUT.texcoord    = IN.texcoord;
        OUT.normal      = normalize( mul( IN.normal, (float3x3)gWorld ) );
        OUT.depth.x     = OUT.position.z;
        OUT.depth.y     = OUT.position.w;
        return OUT;
    }

    ///////////////////////////////////////////
    psOutput psDeferredShaderGeometryBuffer( vsOutput IN )
    {
        psOutput OUT    = (psOutput)0;
        OUT.color.rgb   = tex2D( gColorMap, IN.texcoord );
        OUT.color.a     = 1;
        OUT.normal.xyz  = IN.normal * .5f + .5f;
        OUT.normal.z    = 0;
        OUT.depth       = IN.depth.x / IN.depth.y;
        return OUT;
    }

[g-buffer]

[lighting shader]

    ///////////////////////////////////////////
    //  G L O B A L   V A R I A B L E S
    ///////////////////////////////////////////
    //float4    gAmbient;
    //float4    gLightAmbient;
    //float4    gMaterialAmbient;
    float4x4    gInverseViewProjection;
    float4      gLightDiffuse;
    float4      gMaterialDiffuse;
    float3      gLightDirection;
    float3      gCameraPosition;
    float       gSpecularIntensity;
    float       gSpecularPower;
    sampler2D   gColorMap;
    sampler2D   gNormalMap;
    sampler2D   gDepthMap;

    ///////////////////////////////////////////
    vsOutput vsDeferredShaderDirectionalLighting( vsInput IN )
    {
        vsOutput OUT    = (vsOutput)0;
        OUT.position    = float4( IN.position, 1.f );
        OUT.texcoord    = IN.texcoord;// - float2( .5/800, .5/600 );
        return OUT;
    }

    ///////////////////////////////////////////
    float4 psDeferredShaderDirectionalLighting( vsOutput IN ) : COLOR
    {
        float4 pixel         = tex2D( gColorMap, IN.texcoord );
        if( (pixel.x + pixel.y + pixel.z) <= 0 )
            return pixel;

        float3 surfaceNormal = ( tex2D( gNormalMap, IN.texcoord ) - .5f ) * 2.f;

        float4 worldPos      = 0;
        worldPos.x           = IN.texcoord.x * 2.f - 1.f;
        worldPos.y           = -( IN.texcoord.y * 2.f - 1.f );
        worldPos.z           = tex2D( gDepthMap, IN.texcoord ).r;
        worldPos.w           = 1.f;
        worldPos             = mul( worldPos, gInverseViewProjection );
        worldPos             /= worldPos.w;

        //if( surfaceNormal.r + surfaceNormal.g + surfaceNormal.b <= 0 )
        //    return 0;

        float lightIntensity    = saturate( dot( surfaceNormal, -normalize(gLightDirection) ) );
        float specularIntensity = saturate( dot( surfaceNormal, normalize(gLightDirection) + (gCameraPosition - worldPos) ) );
        float specularFinal     = pow( specularIntensity, gSpecularPower ) * gSpecularIntensity;
        //float4 ambient          = ( (gAmbient + gLightAmbient) * gMaterialAmbient );

        return float4( ( gMaterialDiffuse * gLightDiffuse * lightIntensity ).rgb, specularFinal );
    };

[lighting shader]

Thanks in advance
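On the texel-offset point mentioned above: in D3D9, pixel centers and texel centers are half a pixel apart, so a full-screen pass needs that half-pixel correction somewhere or the g-buffer lookups land between texels. One common D3D9 approach (a sketch for illustration, not code from the thread) is to draw the quad with pretransformed vertices, no vertex shader bound, and bake the -0.5 shift into the positions; ScreenQuadVertex is an assumed name, while w, h, and graphics->device() follow the thread's usage:

    struct ScreenQuadVertex
    {
        float x, y, z, rhw;   // pretransformed position
        float u, v;           // texture coordinates
    };
    #define SCREEN_QUAD_FVF ( D3DFVF_XYZRHW | D3DFVF_TEX1 )

    // The -0.5f shift aligns pixel centers with texel centers for the g-buffer lookups.
    ScreenQuadVertex quad[4] =
    {
        { 0.f      - .5f, 0.f      - .5f, 0.f, 1.f, 0.f, 0.f },
        { float(w) - .5f, 0.f      - .5f, 0.f, 1.f, 1.f, 0.f },
        { 0.f      - .5f, float(h) - .5f, 0.f, 1.f, 0.f, 1.f },
        { float(w) - .5f, float(h) - .5f, 0.f, 1.f, 1.f, 1.f },
    };

    graphics->device()->SetFVF( SCREEN_QUAD_FVF );
    graphics->device()->DrawPrimitiveUP( D3DPT_TRIANGLESTRIP, 2, quad, sizeof(ScreenQuadVertex) );

If the quad is instead run through the lighting vertex shader, the equivalent correction is the commented-out texcoord offset already shown in vsDeferredShaderDirectionalLighting.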
  6. Hi,

I recently learned how to render anti-aliased lines using Wu's algorithm. All was going well until I decided to change the background color from black to another color, and the results changed to this:

[attachment=17464:error01.PNG]

from this:

[attachment=17465:error02.PNG]

To solve the problem I tried to alpha blend the resulting pixel, calculated with the Wu algorithm, with the background pixel underneath it. The results did not look any better. So, is there any way to modify the Wu algorithm to take the background pixel into account when calculating the intensity of each new pixel of the anti-aliased line? This article ( http://www.codeproject.com/Articles/13360/Antialiasing-Wu-Algorithm ) does it successfully, but I can't duplicate the results because I have trouble deciphering the source code. Also, are there any smoothing techniques that I could use to smooth out the lines? If you look at the lines in the black image you will notice a candy-cane-like effect on some of them.

Here is some source code to help understand what's going on:

    if( fabs(delta.x) >= fabs(delta.y) )  // line is more horizontal
    {
        if( p0.x > p1.x )  // process left to right
        {
            SmyVectorF temp = p0;
            p0 = p1;
            p1 = temp;
            delta = SmyVectorF( p1.x - p0.x, p1.y - p0.y );
        }

        int x0 = (int)floor(p0.x);  // start x
        int x1 = (int)floor(p1.x);  // end x
        float y = p0.y;
        float gradient = delta.y / delta.x;  // slope of the line

        for( int x = x0; x < x1; x++ )
        {
            if( x >= 0 && x < w && y >= 0 && y <= h )
            {
                float intensity1 = y - float(floor(y));
                float intensity0 = 1.f - intensity1;
                UNINT pixel = (UNINT)(floor(y)*w) + x;

                backbuffer->bits[pixel]       = SMYUTI_rgb( UNCHR(red*intensity0), UNCHR(green*intensity0), UNCHR(blue*intensity0) );
                backbuffer->bits[(pixel+=w)]  = SMYUTI_rgb( UNCHR(red*intensity1), UNCHR(green*intensity1), UNCHR(blue*intensity1) );
            }
            y += gradient;
        }

        if( p0.y >= 0 && p0.x >= 0 && p0.y <= h && p0.x <= w )
            backbuffer->bits[int(p0.y)*w + int(p0.x)] = m_color;  // render start pixel
        if( p1.y >= 0 && p1.x >= 0 && p1.y <= h && p1.x <= w )
            backbuffer->bits[int(p1.y)*w + int(p1.x)] = m_color;  // render end pixel
    }

Thanks in advance.
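For reference, the usual way to make Wu's coverage values work over an arbitrary background is to treat the coverage as an alpha and blend the line color against the pixel already in the framebuffer, instead of scaling the line color alone (which only works when the background is black). A minimal sketch of that blend; blendPixel and the 0x00RRGGBB packing are assumptions for illustration, not part of the thread's SMYUTI utilities:

    // Blend 'coverage' (0..1) of the line color over whatever is already in the buffer.
    // Assumes 0x00RRGGBB packing, matching what SMYUTI_rgb appears to produce.
    static unsigned int blendPixel( unsigned int dst, int red, int green, int blue, float coverage )
    {
        int dr = (dst >> 16) & 0xFF;
        int dg = (dst >>  8) & 0xFF;
        int db =  dst        & 0xFF;

        int r = int( red   * coverage + dr * (1.f - coverage) );
        int g = int( green * coverage + dg * (1.f - coverage) );
        int b = int( blue  * coverage + db * (1.f - coverage) );

        return (r << 16) | (g << 8) | b;
    }

    // Inside the loop, instead of writing color*intensity directly:
    // backbuffer->bits[pixel]     = blendPixel( backbuffer->bits[pixel],     red, green, blue, intensity0 );
    // backbuffer->bits[pixel + w] = blendPixel( backbuffer->bits[pixel + w], red, green, blue, intensity1 );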
  7. Need help solving GJK woes

    Yup, you're right about the minus sign, though. Anyhow, the code works fine now and all the bugs are gone. I will work on the optimization at a later date. Thanks for the help.
  8. Need help solving GJK woes

      Wow, awesome, thank you! Weeks of frustration summed up by something so trivial... a single minus sign. Does this have to do with the whole left-handedness and right-handedness thing? This is how I calculated the normal before:

    SMYUTI_Vector normal()
    {
        return SMYUTI_Vector( -this->y, this->x, this->z );
    }

Then I swapped the minus sign just as you suggested and voila, no bugs. Care to explain why this is? I read in the book Mathematics and Physics for Game Programming that it does not matter in which component you place the minus sign when calculating the normal; clearly it does. Also, what optimizations can be made? I thought the optimization was to check outside of edge AB and edge AC, and/or vertex A, and if the origin is not there, then it is enclosed. Is it that both sides of each of the edges are being checked? If so, I figured that was for orientation, so that you know which direction you're facing. Nevertheless, thanks again for the help.
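On the minus-sign question: a 2D vector (x, y) has two perpendiculars, (-y, x) and (y, -x), rotated 90 degrees in opposite directions, so which one points away from the triangle's interior depends on the winding of the simplex; the book's claim only holds if nothing else in the code assumes a particular winding. A common way to sidestep the choice in GJK is the triple product, which builds the edge perpendicular that points toward the origin regardless of winding. A small self-contained sketch for illustration, using an assumed Vec2 type rather than SMYUTI_Vector:

    #include <cstdio>

    struct Vec2 { float x, y; };

    static Vec2  sub( Vec2 a, Vec2 b ) { return { a.x - b.x, a.y - b.y }; }
    static float dot( Vec2 a, Vec2 b ) { return a.x * b.x + a.y * b.y; }

    // Perpendicular of edge AB that points toward point O,
    // computed as the 2D triple product (AB x AO) x AB.
    static Vec2 perpToward( Vec2 a, Vec2 b, Vec2 o )
    {
        Vec2  ab    = sub( b, a );
        Vec2  ao    = sub( o, a );
        float cross = ab.x * ao.y - ab.y * ao.x;    // z component of AB x AO
        return { -ab.y * cross, ab.x * cross };     // (AB x AO) x AB
    }

    int main()
    {
        Vec2 a{ 1.f, 1.f }, b{ 3.f, 2.f }, origin{ 0.f, 0.f };
        Vec2 n = perpToward( a, b, origin );
        // n points from edge AB toward the origin whichever winding the simplex has,
        // so dot(n, AO) is always non-negative.
        std::printf( "n = (%f, %f), dot(n, AO) = %f\n", n.x, n.y, dot( n, sub( origin, a ) ) );
        return 0;
    }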