

lipsryme

Member Since 02 Mar 2010
Offline Last Active Today, 03:55 AM

Topics I've Started

Generating a sine-based tiled procedural bump map

09 May 2015 - 12:30 PM

I'm trying to recreate what was proposed on this slide here:

[image: WmmxysH.png]

 

My resulting height map:

[image: GHx4FHA.png]

My code:

// (uses <cmath>, <cstdlib>, <ctime>)
const unsigned int texWidth = 512;
const unsigned int texHeight = 512;

unsigned char *data = new unsigned char[texWidth * texHeight];
srand(static_cast<unsigned int>(time(NULL)));
for (unsigned int y = 0; y < texHeight; ++y)
{
	for (unsigned int x = 0; x < texWidth; ++x)
	{
		float sum = 0;
		for (unsigned int i = 0; i < 100; ++i)
		{
			float px_rand = (float)x - (float)rand();
			float px_rand_2 = px_rand * px_rand;
			float py_rand = (float)y - (float)rand();
			float py_rand_2 = py_rand * py_rand;
			sum += sinf(sqrtf(px_rand_2 + py_rand_2) * 1.0f / (2.08f + 5.0f * (float)rand()));
		}
		unsigned char value = static_cast<unsigned char>((sum / 100) * 255) & static_cast<unsigned char>(0x00ff);
		data[y * texWidth + x] = value;
	}
}

As far as I can tell the algorithm and my code should be identical.

Any ideas what I'm doing wrong?

The idea is to generate this height map and then use the Photoshop NVIDIA texture tools plugin to convert it to a normal map.
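One detail I can at least pin down on the CPU side: sinf returns values in [-1, 1], so (sum / 100) can be negative, and casting a negative float straight to unsigned char wraps around. A minimal sketch of an explicit remap into [0, 255] (the helper name heightToByte is mine, not from the slide):

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Map an averaged sine value in [-1, 1] to a byte in [0, 255].
// The remap (x * 0.5 + 0.5) shifts the signed range into [0, 1]
// before scaling, instead of letting a negative float wrap around
// when cast to unsigned char.
unsigned char heightToByte(float avg)
{
    float n = avg * 0.5f + 0.5f;                        // [-1, 1] -> [0, 1]
    n = std::min(1.0f, std::max(0.0f, n));              // clamp for safety
    return static_cast<unsigned char>(n * 255.0f + 0.5f); // round to nearest
}
```

In the loop above this would replace the `(sum / 100) * 255` cast (the `& 0x00ff` mask is a no-op on an unsigned char anyway).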


Mapping fullscreen texture to object UV in forward pass

03 May 2015 - 10:33 AM

As the title says, I'm trying to get texture coordinates in my forward pass's object pixel shader that match the ones from a texture rendered as a fullscreen triangle/quad.

I tried computing the screen space position like so:

input.PosCS.xy /= input.PosCS.z; // in Pixel shader
return ssao.Sample(Sampler, input.PosCS.xy).r;

similar to what I'm doing to compute the screen space velocity vector, but this doesn't seem to work.

Any ideas? Is that not possible? What would be the common way to apply e.g. SSAO to your material's lighting shader?
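For reference, the mapping I believe is needed, written out as plain C++ (the struct and function names are mine): in D3D-style clip space the perspective divide uses w, not z, and the resulting NDC xy in [-1, 1] still has to be remapped to [0, 1] texture space with a flipped y.

```cpp
#include <cassert>
#include <cmath>

struct Float2 { float x, y; };
struct Float4 { float x, y, z, w; };

// Convert a clip-space position to [0,1] texture coordinates,
// D3D convention: divide by w (not z), then remap NDC [-1,1] to
// UV [0,1] with y flipped (NDC +y is up, texture +v is down).
Float2 clipToUV(const Float4& posCS)
{
    float ndcX = posCS.x / posCS.w;
    float ndcY = posCS.y / posCS.w;
    return { ndcX * 0.5f + 0.5f, ndcY * -0.5f + 0.5f };
}
```

The same two lines translate directly to HLSL, assuming PosCS is the unmodified SV_Position-style clip-space value passed from the vertex shader.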


Custom mipmap generation (Read mip0, write to mip1,...)

03 May 2015 - 04:49 AM

Hey guys, I've been at this for hours now and can't seem to figure out how to generate my mipmaps with a custom downsample shader using the same texture (but different mip levels).

I've created render targets for each mip slice, which works fine however there seems to be no way to do the same for a shader resource view.

I thought maybe I could do it using an unordered access view instead, but this doesn't work either: the pixel shader just won't read anything from it, and the debug layer reports no warnings.

 

Is there any way to do this? Or is the only option to create N different render targets and then copy the data into the one texture as mips?
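For completeness, the mip-chain arithmetic the downsample pass has to walk, as a small standalone C++ sketch (helper names are mine): the level count keeps halving the larger base dimension until it reaches 1, and each level's size is the base size shifted right, clamped to 1.

```cpp
#include <algorithm>
#include <cassert>
#include <utility>

// Full mip count for a base size (the usual D3D rule): keep halving
// the larger dimension until it reaches 1, counting levels.
unsigned int fullMipCount(unsigned int width, unsigned int height)
{
    unsigned int levels = 1;
    unsigned int size = std::max(width, height);
    while (size > 1) { size >>= 1; ++levels; }
    return levels;
}

// Dimensions of a given mip level; each level halves, clamped to 1.
std::pair<unsigned int, unsigned int> mipDimensions(unsigned int width,
                                                    unsigned int height,
                                                    unsigned int mip)
{
    return { std::max(1u, width >> mip), std::max(1u, height >> mip) };
}
```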

 

This is how I generate the texture2D:

D3D11_TEXTURE2D_DESC texDesc;
ZeroMemory(&texDesc, sizeof(texDesc));
if (createTex)
{
	texDesc.Width = this->width;
	texDesc.Height = this->height;
	texDesc.MipLevels = this->mipLevels + 1;
	texDesc.ArraySize = 1;
	texDesc.Format = this->format;
	texDesc.SampleDesc.Count = this->msaa_count;
	texDesc.SampleDesc.Quality = this->msaa_quality;
	texDesc.Usage = D3D11_USAGE_DEFAULT;
	texDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_UNORDERED_ACCESS;
	texDesc.CPUAccessFlags = 0;
	texDesc.MiscFlags = 0;

	// Create the Texture2D
	HRESULT hr = this->device->CreateTexture2D(&texDesc, NULL, &this->tex);
	if (FAILED(hr))
	{
		this->log->OutputToConsole("Failed to create texture2D!", LogType::Error);
		return false;
	}
}

Here's how I generate the UAVs:

D3D11_UNORDERED_ACCESS_VIEW_DESC viewDesc = {};
viewDesc.Format = texDesc.Format;

if (texDesc.MipLevels > 1)
{
	viewDesc.ViewDimension = D3D11_UAV_DIMENSION_TEXTURE2D;

	for (unsigned int mip = 0; mip <= mipLevels; ++mip)
	{
		viewDesc.Texture2D.MipSlice = mip;
		ID3D11UnorderedAccessView *UAV = nullptr;
		HRESULT hr = this->device->CreateUnorderedAccessView(this->tex, &viewDesc, &UAV);
		if (FAILED(hr))
		{
			return nullptr;
		}
		this->SetUnorderedAccessView(UAV, 0, mip);
	}
}
else
{
	HRESULT hr = this->device->CreateUnorderedAccessView(this->tex, NULL, &this->uav[0]);
	if (FAILED(hr))
	{
		this->log->OutputToConsole("Failed to create UAV!", LogType::Error);
		return false;
	}
}

This is my pixel shader:

Texture2D DepthTex : register(t0);
Texture2D<float> LinearDepthMipTex : register(u1);
SamplerState PointSampler : register(s0);


// Function to minify linear view space depth for miplevels
float PS_Minify(VSO input) : SV_Target0
{		
	return LinearDepthMipTex.Sample(PointSampler, input.TexCoord); // test


	//int2 ssP = input.TexCoord * float2(1280, 720);

	// Rotated grid subsampling to avoid XY directional bias or Z precision bias while downsampling
	//return LinearDepthMipTex.Load(int3(ssP * 2 + int2((ssP.y & 1) ^ 1, (ssP.x & 1) ^ 1), 0)).r;
}

Parallax-corrected cubemap blending help

29 April 2015 - 12:49 PM

I'm trying to implement Sébastien Lagarde's parallax-corrected cubemap blending approach and have been going over this one part over and over again, but I just can't seem to figure out how it works exactly.

 

This is the part I mean: (taken from https://seblagarde.wordpress.com/2012/09/29/image-based-lighting-approaches-and-parallax-corrected-cubemap/)

 

[image: oBTGGF8.png]

 

Does anyone understand how that works? The picture on the right seems to suggest that he is mirroring each cubemap face's look direction over a plane (0, 1, 0)? However, this is just me guessing. The blue vectors on the right that represent that mirrored view direction don't really make sense, though, if you just take the cube face directions (1, 0, 0), (-1, 0, 0), (0, 1, 0)...
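To make the guess concrete: mirroring a direction V over a plane with unit normal N is the standard reflection V - 2(V·N)N, and with N = (0, 1, 0) that only negates the y component. A tiny C++ sketch (types and names are mine):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Reflect direction v across a plane with unit normal n: v - 2(v.n)n.
Vec3 mirrorOverPlane(const Vec3& v, const Vec3& n)
{
    float d = v.x * n.x + v.y * n.y + v.z * n.z; // v . n
    return { v.x - 2.0f * d * n.x,
             v.y - 2.0f * d * n.y,
             v.z - 2.0f * d * n.z };
}
```

Note that under this formula (1, 0, 0) maps to itself and (0, 1, 0) maps to (0, -1, 0), which is consistent with my point above that the plain cube face directions alone don't produce those blue vectors.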

Also, if we want to replace the world space position of the shaded location (which we don't have in the cubemap blending pass) with this vector, doesn't it have to have some location attached to it (my current WS camera position)?

Any help or ideas would be appreciated.


Localizing image based reflections (issues / questions)

18 April 2015 - 04:28 AM

I'm currently trying to localize my IBL probes but have come across some issues that I couldn't find any answers to in any paper or otherwise.

 

1. How do I localize my diffuse IBL (irradiance map)? The same method as with specular doesn't really seem to work: the locality is lost for the light bleed.

[image: Image%202015-04-18%20at%2012.17.10%20.pn]

As you can see here the light bleed is spread over a large portion of the entire plane, even though the rectangular object is very thin.

Also, the reddish tone doesn't really get stronger the closer the sphere is moved to the red object.

If I move the sphere further to the side of the thin red object, the red reflection is still visible. So there's no real locality to it.

 

 

2. How do I solve the case for objects that aren't rectangular, or where there are objects not entirely at the edge of the AABB that I intersect? (Or am I missing a line or two to do that?)

 

EXAMPLE_PICTURE:

[image: Image%202015-04-18%20at%2012.22.24%20.pn]

As you can see here the rectangular red object's reflection works perfectly (but then again only if it's exactly at the edge of the AABB).

If an object is, say, a sphere, or is moved closer to the probe (so not at the edge), the reflection will still be completely flat and projected onto the side of the AABB.

Here's the code snippet showing how I localize my probe reflection vector...

 

CODE_SAMPLE:

float3 LocalizeReflectionVector(in float3 R, in float3 PosWS,
                                in float3 AABB_Max, in float3 AABB_Min,
                                in float3 ProbePosition)
{
	// Find the ray intersection with box plane
	float3 FirstPlaneIntersect = (AABB_Max - PosWS) / R;
	float3 SecondPlaneIntersect = (AABB_Min - PosWS) / R;

	// Get the furthest of these intersections along the ray
	float3 FurthestPlane = max(FirstPlaneIntersect, SecondPlaneIntersect);

	// Find the closest far intersection
	float distance = min(min(FurthestPlane.x, FurthestPlane.y), FurthestPlane.z);

	// Get the intersection position
	float3 IntersectPositionWS = PosWS + distance * R;

	return IntersectPositionWS - ProbePosition;
}
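As a sanity check on the intersection math, here's the same box projection translated to C++ (types and names are mine) so the distances can be verified numerically off the GPU:

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

struct V3 { float x, y, z; };

// C++ translation of LocalizeReflectionVector above: intersect the ray
// PosWS + t*R with the AABB's three max-planes and three min-planes,
// take the farther hit per axis, then the nearest of those three, and
// return the hit point relative to the probe position.
// Caveat (shared with the HLSL): a zero component in R divides by zero;
// under IEEE floats that yields +/-inf and the min/max still pick a
// finite axis, but the test below sticks to nonzero components.
V3 localizeReflection(V3 R, V3 PosWS, V3 AABB_Max, V3 AABB_Min, V3 ProbePos)
{
    auto farHit = [](float mx, float mn, float p, float r) {
        float a = (mx - p) / r;
        float b = (mn - p) / r;
        return std::max(a, b);            // farther intersection on this axis
    };
    float fx = farHit(AABB_Max.x, AABB_Min.x, PosWS.x, R.x);
    float fy = farHit(AABB_Max.y, AABB_Min.y, PosWS.y, R.y);
    float fz = farHit(AABB_Max.z, AABB_Min.z, PosWS.z, R.z);
    float dist = std::min({fx, fy, fz});  // nearest of the far hits
    return { PosWS.x + dist * R.x - ProbePos.x,
             PosWS.y + dist * R.y - ProbePos.y,
             PosWS.z + dist * R.z - ProbePos.z };
}
```

For a point at the origin of a unit box [-1, 1]^3 with probe at the origin and R = (1, 0.25, 0.25), the ray exits through the x = 1 face, so the lookup vector is (1, 0.25, 0.25), as expected.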
