
Recommended Posts

Hi! I am trying to implement a simple SSAO postprocess. The main source of my knowledge on this topic is that awesome tutorial.

But unfortunately something doesn't work, and after a few long hours I need some help. Here is my HLSL shader:

float3 randVec = _noise * 2.0f - 1.0f; // noise: vec: {[0;1], [0;1], 0}
float3 tangent = normalize(randVec - normalVS * dot(randVec, normalVS));
float3 bitangent = cross(tangent, normalVS);
float3x3 TBN = float3x3(tangent, bitangent, normalVS);
float occlusion = 0.0;
for (int i = 0; i < kernelSize; ++i)
{
	float3 samplePos = samples[i].xyz; // samples: {[-1;1], [-1;1], [0;1]}
	samplePos = mul(samplePos, TBN);
	samplePos = positionVS.xyz + samplePos * ssaoRadius;
	float4 offset = float4(samplePos, 1.0f);
	offset = mul(offset, projectionMatrix);
	offset.xy /= offset.w;
	offset.y = -offset.y;
	offset.xy = offset.xy * 0.5f + 0.5f;
	float sampleDepth = tex_4.Sample(textureSampler, offset.xy).a;
	sampleDepth = vsPosFromDepth(sampleDepth, offset.xy).z;
	const float threshold = 0.025f;
	float rangeCheck = abs(positionVS.z - sampleDepth) < ssaoRadius ? 1.0 : 0.0;
	occlusion += (sampleDepth <= samplePos.z + threshold ? 1.0 : 0.0) * rangeCheck;
}
occlusion = saturate(1 - (occlusion / kernelSize));
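In case the CPU-side data is the culprit: a minimal sketch (plain Python, hypothetical names) of generating a hemisphere kernel matching the ranges in the comment above, in the spirit of the tutorial:

```python
import math
import random

def make_ssao_kernel(kernel_size=64, seed=0):
    """Build hemisphere sample points: x,y in [-1,1], z in [0,1],
    normalized, then scaled so samples cluster near the shaded point."""
    rng = random.Random(seed)
    kernel = []
    for i in range(kernel_size):
        # Rejection-sample a direction in the +z hemisphere.
        while True:
            x = rng.uniform(-1.0, 1.0)
            y = rng.uniform(-1.0, 1.0)
            z = rng.uniform(0.0, 1.0)
            length = math.sqrt(x * x + y * y + z * z)
            if 0.0 < length <= 1.0:
                break
        x, y, z = x / length, y / length, z / length
        # Accelerating interpolation: lerp(0.1, 1.0, (i/N)^2).
        scale = i / float(kernel_size)
        scale = 0.1 + 0.9 * scale * scale
        kernel.append((x * scale, y * scale, z * scale))
    return kernel
```

If the kernel uploaded to `samples[]` doesn't satisfy these ranges, the occlusion test above samples the wrong half-space.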

And current result: http://imgur.com/UX2X1fc

I would really appreciate any advice!

Edited by gsc


I don't see this very important bit of math that transforms/scales your normals:

vec3 normal = normalVS.xyz * 2.0 - 1.0;
normal = normalize(normal);

I pulled this from the tutorial you referenced (although, if you aren't using normal maps, you might not need this).
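Just to spell it out, that's the inverse of the usual `* 0.5 + 0.5` packing used when writing normals into a color target; a trivial sanity check (plain Python, hypothetical helper):

```python
def unpack_normal(stored):
    """Map a color-stored normal from [0,1] back to [-1,1] per channel."""
    return tuple(c * 2.0 - 1.0 for c in stored)
```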

Edited by missionctrl
changed code to match OP's code


@missionctrl Yes, I have that part of the math; I am using a buffer with normals already in view space (see the attachment).

@KarimIO Hmmm... if the shader doesn't have any mistakes, then my data from the buffers must be wrong...

Generally, I have a normal buffer (and a depth buffer to reconstruct the view-space position of the geometry), but for test purposes I created special position buffers to eliminate the possibility of mistakes in the calculations from the depth buffer.

All buffers and results here: http://imgur.com/a/nm0bA

I have no idea why the version with positions read directly from the buffer is so noisy, while the second one has virtually no noise.
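For anyone comparing the two paths: my vsPosFromDepth isn't shown above, so here is a minimal sketch of how such a helper typically works with the row-vector `mul` convention from the shader (plain Python, hypothetical names; assumes depth is stored as linear view-space z), with a projection round-trip as a check:

```python
import math

def perspective_fov_lh(fov_y, aspect, zn, zf):
    """Row-major D3D-style left-handed projection (XMMatrixPerspectiveFovLH layout)."""
    ys = 1.0 / math.tan(fov_y * 0.5)
    xs = ys / aspect
    return [[xs, 0.0, 0.0, 0.0],
            [0.0, ys, 0.0, 0.0],
            [0.0, 0.0, zf / (zf - zn), 1.0],
            [0.0, 0.0, -zn * zf / (zf - zn), 0.0]]

def project_to_uv(p_vs, proj):
    """Row-vector mul(p, P), perspective divide, then NDC -> UV with the D3D y flip."""
    x, y, z = p_vs
    cx = x * proj[0][0]
    cy = y * proj[1][1]
    cw = z  # for this matrix layout, clip-space w is view-space z
    ndc_x, ndc_y = cx / cw, cy / cw
    return (ndc_x * 0.5 + 0.5, -ndc_y * 0.5 + 0.5)

def vs_pos_from_depth(z_vs, uv, proj):
    """Reconstruct the view-space position from linear view-space depth and UV."""
    ndc_x = uv[0] * 2.0 - 1.0
    ndc_y = -(uv[1] * 2.0 - 1.0)  # undo the y flip
    x_vs = ndc_x * z_vs / proj[0][0]
    y_vs = ndc_y * z_vs / proj[1][1]
    return (x_vs, y_vs, z_vs)
```

If the round trip doesn't hold with your exact projection matrix and y-flip convention, the depth path and the direct position buffer will disagree in just the noisy way shown in the screenshots.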

Edited by gsc


