
jonathantompson

Member Since 11 Aug 2009
Offline Last Active Apr 08 2011 02:01 PM

Posts I've Made

In Topic: SSAO problems! Please help!

24 March 2011 - 02:33 PM

OK, now I'm getting somewhere...

My camera far plane was 2.5 and my near plane was 0.01. All my objects were very close to the camera and very small (<0.01 units wide!). The view space coordinates were actually correct; they were just tiny in scale. The R, G and B values were therefore all well below 1, which is why everything looked black.

I was right, I was doing something stupid.

So, I've made all my world objects 100x larger, and changed the camera near and far planes to 1.0 and 250.0. Now the view space coordinates look like this:

View Space Buffer:


Occlusion buffer:


So there's something wrong with the occlusion buffer, but at least I know the inputs are now OK. I'll continue to debug and get back to you guys.

In Topic: SSAO problems! Please help!

24 March 2011 - 02:06 PM

Hmm... I guess I'm having trouble working out why it should look like that:

Firstly, if (viewpos.x, viewpos.y, viewpos.z) are stored as (R, G, B), then shouldn't objects further from the camera have a stronger blue component?

In your view space rendering it looks like blue is constant across the low portion of the screen. When you're rendering the position data, are you rendering like this?:

	// Sample the texture
	float3 xyz = tex2D(PPSampSource, tex0).xyz; 
	return float4(xyz, 1.0);


Or are you rendering some other representation of the data? As a side note: what format are you using to store your view-space position buffer? I'm guessing floating point...

I'm really stuck on this. I feel like I must be doing something very stupid.

In Topic: SSAO problems! Please help!

24 March 2011 - 12:44 PM

Thanks again for your help.

Does the view space position need to be normalized from 0 to 1? That's the only way I can imagine view space positions producing four colored squares. Otherwise, some view-space values will be negative (in DirectX, post-projection x and y run from -1 to 1, I think), which will show up as black on screen.

Does depth need to be normalized as well?

Sorry for bombarding you with questions... and I appreciate the help.

In Topic: SSAO problems! Please help!

24 March 2011 - 12:30 PM

I'm pretty sure it's correct. Stepping through the code, the matrices look OK. I even calculated the transforms in Matlab and compared them against the pixel shader results in PIX.

This is the code that sets the view & project matrices per frame:

void camera::Update(void)
{
	m_vUp = g_UI->GetSetting<D3DXVECTOR3>(&var_startingUp);
	util::GetCorrectUp(&m_vLookAtPt, &m_vEyePt, &m_vUp); // To correct for arbitrary eye direction

	D3DXMatrixLookAtLH(&m_matView,&m_vEyePt,&m_vLookAtPt,&m_vUp);

	// Fit near and far to the scene objects' world bounds.
	// Important for cascaded shadow maps to reduce aliasing.
	g_objectManager->FitViewMatrixNearFarToRBObjects(&m_matView, &zNearFar.x, &zNearFar.y, zNearMin, zFarMax);

	zNearToFar = zNearFar.y - zNearFar.x;

	g_UI->GetWidthHeight(&width, &height);
	D3DXMatrixPerspectiveFovLH(&m_matProjection,
	                           g_UI->GetSetting<float>(&var_fieldOfView),
	                           (float)width / (float)height,
	                           zNearFar.x, zNearFar.y);

	m_ViewProj = m_matView * m_matProjection;
}//Update

This is the code that sets the matrices and draws a meshed object:

void renderer::DrawTexturedMeshPosNorm(rbobjectMeshData * meshData, D3DXMATRIXA16 * matWorld)
{
	UINT numPasses = 0;
	HR(m_FX->Begin(&numPasses, 0), L"Render::DrawTexturedMeshPosNorm() - m_FX->Begin Failed: ");
	HR(m_FX->BeginPass(0), L"Render::DrawTexturedMeshPosNorm() - m_FX->BeginPass Failed: ");


	D3DXMATRIX WV;
	D3DXMatrixMultiply(&WV, matWorld, &g_objectManager->GetCamera()->m_matView); // matWorld is the current RBO model->world transform
	HR(m_FX->SetMatrix(m_FXHandles.m_hWV, &WV), L"Render::SetPosNormMatricies() - Failed to set m_hWV matrix: ");

	D3DXMATRIX WVP;
	D3DXMatrixMultiply(&WVP, &WV, &g_objectManager->GetCamera()->m_matProjection); // Append the camera's projection matrix
	HR(m_FX->SetMatrix(m_FXHandles.m_hWVP, &WVP), L"Render::SetPosNormMatricies() - Failed to set m_hWVP matrix: ");


	HR(m_FX->CommitChanges(),L"Render::DrawTexturedMeshPosNorm() - CommitChanges failed: ");

	for(UINT j = 0; j < meshData->materials->Size(); ++j)
	{
		HR(meshData->pMesh->DrawSubset(j),L"Render::DrawTexturedMeshPosNorm() - DrawSubset failed: ");
	}

	HR(m_FX->EndPass(),L"Render::DrawTexturedMeshPosNorm() - m_FX->EndPass Failed: ");
	HR(m_FX->End(),L"Render::DrawTexturedMeshPosNorm() - m_FX->End Failed: ");

}

In Topic: DX9 Variance SM - Texture Hardware Filter Issues!

24 March 2011 - 09:50 AM

Just as an update to anyone who comes across this:

I was never able to get floating-point texture filtering working on my 7800GTX machine. The caps say it is supported, and when enabled no DX9 warning or error is thrown, yet it just doesn't work.

I've tried the same code on other cards (an ATI 5850) and the filtering looks great. For now, I'm doing bilinear filtering in the shader on the 7800GTX machine as a workaround:

// Bilinear interpolation texture lookup
float4 tex2DBilinear( sampler textureSampler, float2 uv )
{
    float4 tl = tex2D(textureSampler, uv);
    float4 tr = tex2D(textureSampler, uv + float2(gTexelSize, 0));
    float4 bl = tex2D(textureSampler, uv + float2(0, gTexelSize));
    float4 br = tex2D(textureSampler, uv + float2(gTexelSize , gTexelSize));
    float2 f = frac( uv.xy * gTextureSize ); // get the decimal part
    float4 tA = lerp( tl, tr, f.x ); // interpolate along the top edge
    float4 tB = lerp( bl, br, f.x ); // interpolate along the bottom edge
    return lerp( tA, tB, f.y ); // interpolate vertically between the two
}


It's slower, but at least I get some shadow map filtering!
