bombshell93

Member Since 17 Jul 2011
Offline Last Active Jun 12 2013 12:43 AM

Topics I've Started

Interactivity within 4 spatial dimensions

01 May 2013 - 03:35 PM

I wasn't entirely sure where to put this, but I need a concept check.
I've been bending my mind around the concept of 4 spatial dimensions and what it would mean in a gaming environment. After some thought I managed to conceive of 4 spatial dimensions, but my friend got lost when I was trying to explain it, so I'll try to keep it clean so you get the concept before I pose my question.

A 1D line is a series of possible states of a 0D point, from the line's lower limit / start to its upper limit / end.
A 2D shape is a series of possible states of a 1D line.
A 3D shape is a series of possible states of a 2D shape.
So by extension, a 4D area is a series of possible states of a 3D shape.

So if I were to take a 4D area with the 4th dimension, W, ranging from 0 to 1: am I right in saying that I can accurately represent the concept of 4 spatial dimensions by lerping 3D points between a position regarded as W = 0 and a position regarded as W = 1?
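To make that concrete, here's a rough C# sketch of what I mean (XNA's Vector3.Lerp does the work; the Vertex4D struct is just my own illustration):

using Microsoft.Xna.Framework;

// Each vertex of a "4D" object stores its 3D position at W = 0 and at W = 1.
// Sampling the object at any W in [0, 1] is then just a lerp per vertex.
struct Vertex4D
{
    public Vector3 PositionW0; // where this vertex sits at W = 0
    public Vector3 PositionW1; // where this vertex sits at W = 1

    public Vector3 Sample(float w)
    {
        // PositionW0 + (PositionW1 - PositionW0) * w
        return Vector3.Lerp(PositionW0, PositionW1, w);
    }
}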

If so, am I also right in saying that for a 4th dimension with a range exceeding 1, the 3D points would follow a curve defined by the points' positions per W unit?

To my understanding these points would actually represent the shape of 3D space, as opposed to the objects within 3D space, meaning an object moving at, say, 10mph at W = 0 may at W = 1 be moving at 100mph, and in another location 10mph at W = 1 could be 100mph at W = 0, via relative stretching and squeezing of the space.

I hope I didn't just confuse myself just to sound like an idiot XD
I figured the idea of 4 dimensions in which to interact would be an interesting mechanic, specifically for puzzlers, but the concept in other genres would allow for interesting 4-dimensional level design, literally adding a new depth to the game.

Regarding the title about interactivity: the idea is that just like WASD has you moving in X and Y, I'd imagine Q and E moving you through W. Only being able to see the world in 3 dimensions, it would just look like things being warped, when (if my concept is right) you would actually be moving through a 4th dimension.
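In code terms the movement would be something like this (a rough XNA sketch; playerW and gameTime belong to the surrounding Update method, and the speed is a placeholder value):

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input;

// Q and E slide the player along W the same way WASD slides them along X and Y.
KeyboardState keys = Keyboard.GetState();
float wSpeed = 0.5f; // W units per second
float dt = (float)gameTime.ElapsedGameTime.TotalSeconds;

if (keys.IsKeyDown(Keys.Q)) playerW -= wSpeed * dt;
if (keys.IsKeyDown(Keys.E)) playerW += wSpeed * dt;

playerW = MathHelper.Clamp(playerW, 0f, 1f); // keep W inside the defined [0, 1] range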

Thanks for reading,
Any and all information, examples, or documentation on this concept or similar concepts would be greatly appreciated.
Bombshell


My first SSAO implementation

05 April 2013 - 11:24 PM

I think I'm still suffering from haloing, and some artifacts on spheres are causing dark / light rings, but I'm still over the moon that I've got it working to some degree. SSAO baffled me until a moment of doing nothing and it fit itself together! The best feeling in the world is jumping over a programming hurdle.
[Old version screenshot in spoiler]

EDIT:
Okay, so I kept working on it and I've got it looking better, without the bugs too. I made some silly mistakes; correcting them I've had to go down to 6 samples, but I'm more than happy with the quality of the occlusion. I'm using a Min BlendState so only the darkest value comes through, and I'm running multiple passes, giving a smoother-looking final occlusion.
[Screenshot: myfirstssao_by_pushbombshell-d60lhx8.png]

As you can see the haloing issue isn't as glaring, and the sample normals are now actual normals and not the samples' direction (a mistake on my part), which has fixed the artifacts on rounded surfaces.
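For reference, the Min blend state is set up like this on the C# side (XNA 4.0; minBlend is just my name for it):

using Microsoft.Xna.Framework.Graphics;

// Keeps only the darkest occlusion value across the multiple SSAO passes:
// result = min(source, destination) per channel.
BlendState minBlend = new BlendState
{
    ColorBlendFunction = BlendFunction.Min,
    AlphaBlendFunction = BlendFunction.Min,
    ColorSourceBlend = Blend.One,
    ColorDestinationBlend = Blend.One,
    AlphaSourceBlend = Blend.One,
    AlphaDestinationBlend = Blend.One
};

It gets assigned to GraphicsDevice.BlendState before each SSAO pass is drawn. The shader itself: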
 

#define SAMPLECOUNT 6

float4x4 WVP; //View Projection Matrix
float4x4 WVPI; //Inverse View Projection Matrix

float3 sampleVectors[SAMPLECOUNT];
float sampleRange;
float depthBias = 0.0f;
float PI = 3.14159265f;

float2 GBufferSize;

sampler NormalSampler : register(s2);
sampler NoiseSampler : register(s3);

struct VertexShaderInput
{
    float4 Position : POSITION0;
	float2 UV : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
	float2 UV : TEXCOORD0;
	float4 ProjectPos : TEXCOORD1;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
	
	output.Position = input.Position; //using a Full Screen Quad which does not require projection
	output.UV = (output.Position.xy / output.Position.w) * float2(0.5f, -0.5f) + 0.5f; //get UV for GBuffers via screen position
	output.UV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f; //half pixel offset
	output.ProjectPos = output.Position;
	
    return output;
}

float unpack(float2 packed)
{
	const float2 conversion = float2(1.0f, 1.0f / 256.0f);
	return dot(packed, conversion);
}

float3 decode (float2 enc)
{
    float4 nn = float4(enc, 0, 0) * float4(2,2,0,0) + float4(-1,-1,1,-1);
    float l = dot(nn.xyz,-nn.xyw);
    nn.z = l;
    nn.xy *= sqrt(l);
    return nn.xyz * 2 + float3(0,0,-1);
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
	float4 normalSample = tex2D(NormalSampler, input.UV);
	float noiseSample = tex2D(NoiseSampler, input.UV * (float2(GBufferSize.x, GBufferSize.y) / 8)).r;
	float3 normal = decode(normalSample.xy); //Spheremap Transform normal compression, not my own work, based off a Cry3 implementation
	float depth = unpack(normalSample.zw); //nature of the game does not require a wide depth range so it has been compressed into 16-bit
	
	float4 position = float4(input.ProjectPos.xy / input.ProjectPos.w, depth, 1); //reconstruct world position from screen position and depth
	position = mul(position, WVPI);
	position /= position.w;
	
	float4 output = float4(1,1,1,1);
	
	float angle = noiseSample * PI * 2; //convert [0-1] range noise into radians
	float cosAngle = 1 - cos(angle); //the (1 - cos) term of the Rodrigues axis-angle rotation
	float sinAngle = sin(angle);
	
	float3 unit = normalize(normal); //given its own variable in case it needs to change later
	
	float3x3 rotationMat = float3x3( //Rotation matrix to rotate the sample vector by angle
		1 + cosAngle * (unit.x * unit.x - 1), 
			-unit.z * sinAngle + cosAngle * unit.x * unit.y, 
				unit.y * sinAngle + cosAngle * unit.x * unit.z,
		unit.z * sinAngle + cosAngle * unit.x * unit.y, 
			1 + cosAngle * (unit.y * unit.y - 1), 
				-unit.x * sinAngle + cosAngle * unit.y * unit.z,
		-unit.y * sinAngle + cosAngle * unit.x * unit.z, 
			unit.x * sinAngle + cosAngle * unit.y * unit.z, 
				1 + cosAngle * (unit.z * unit.z - 1)
		);
		
	for (int i = 0; i < SAMPLECOUNT; i++)
	{
		float3 sampleVector = sampleVectors[i];
			//transform sample vector by angle around normal
		sampleVector = mul(sampleVector, rotationMat);
		sampleVector =  dot(sampleVector, normal) < 0 ? -sampleVector : sampleVector;
		sampleVector *= sampleRange;
			
			//get sample vector's world position > projected position > UV
		float4 samplePosition = mul(float4(position.xyz + sampleVector, 1), WVP);
		float2 sampleUV = (samplePosition.xy / samplePosition.w) * float2(0.5f, -0.5f) + 0.5f;
		sampleUV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f;
			//sample depth
		float sample = unpack(tex2D(NormalSampler, sampleUV).zw);
			//modify final value by dot product
		float mod = 1 - dot(decode(tex2D(NormalSampler, sampleUV).xy), normal);
			//if the sample is closer to the viewer than the origin, accumulate occlusion
		if (sample < depth - depthBias)
			output -= (saturate(1 - ((depth - depthBias) - sample)) * mod) / SAMPLECOUNT;
	}
	
	return output;
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_3_0 VertexShaderFunction();
        PixelShader = compile ps_3_0 PixelShaderFunction();
    }
}
 

 

I'm mostly concerned about the haloing artifacts, but the odd occlusion on spheres is a bit worrying too.

Any and all comments are greatly appreciated,
Thanks for reading,
Bombshell


Deferred Ambient Light via CubeMap, Edge Artefacts

03 April 2013 - 06:12 AM

Here is a screenshot of the problem, followed by the cubemap I'm using. (I made a content processor to convert it to a TextureCube (I'm using XNA), though it does it strangely and the faces come out flipped, so the cubemap may look odd.)
[Screenshot: rendering_artefacts_by_pushbombshell-d60]
[Cubemap: cubemap_by_pushbombshell-d60a34i.jpg]
Ignore the dark pixel, it's the model's UVs. What's really ticking me off is the edge; it only happens with my ambient light shader. The directional light and the point lights have no issue, which confused me because it seemed so much like the half-pixel offset issue.
All the buffers are set to PointClamp sampling, and my shader follows below.
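On the C# side the sampler setup is just this (assuming graphicsDevice is the current GraphicsDevice; the register numbers match the shader):

using Microsoft.Xna.Framework.Graphics;

// G-buffer samplers s0-s2 use point sampling with clamped addressing.
graphicsDevice.SamplerStates[0] = SamplerState.PointClamp; // diffuse
graphicsDevice.SamplerStates[1] = SamplerState.PointClamp; // specular
graphicsDevice.SamplerStates[2] = SamplerState.PointClamp; // normal + depth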
 

sampler DiffuseSampler : register(s0);
sampler SpecularSampler : register(s1);
sampler NormalSampler : register(s2);

texture Environment;
float ambientStrength;
float2 GBufferSize;

sampler EnvironmentSampler = sampler_state
{
	texture = <Environment>;
	mipfilter = LINEAR;
	minfilter = LINEAR;
	magfilter = LINEAR;
};

struct VertexShaderInput
{
    float4 Position : POSITION0;
	float2 UV : TEXCOORD0;
};

struct VertexShaderOutput
{
    float4 Position : POSITION0;
	float2 UV : TEXCOORD0;
	float4 ProjectPos : TEXCOORD1;
};

VertexShaderOutput VertexShaderFunction(VertexShaderInput input)
{
    VertexShaderOutput output;
	
	output.Position = input.Position;
	output.UV = output.Position.xy * float2(0.5f, -0.5f) + 0.5f;
	output.UV += float2(1/GBufferSize.x, 1/GBufferSize.y) * 0.5f;
	output.ProjectPos = output.Position;
	
    return output;
}

float3 decode (float2 enc)
{
    float4 nn = float4(enc, 0, 0) * float4(2,2,0,0) + float4(-1,-1,1,-1);
    float l = dot(nn.xyz,-nn.xyw);
    nn.z = l;
    nn.xy *= sqrt(l);
    return nn.xyz * 2 + float3(0,0,-1);
}

float4 PixelShaderFunction(VertexShaderOutput input) : COLOR0
{
	float4 diffuseSample = tex2D(DiffuseSampler, input.UV);
	float4 specularSample = tex2D(SpecularSampler, input.UV);
	float4 normalSample = tex2D(NormalSampler, input.UV);
	float3 normal = decode(normalSample.xy);
	
	float AmbientIntensity = ambientStrength;
    return float4(texCUBE(EnvironmentSampler, normal).xyz * diffuseSample * AmbientIntensity, 1);
}

technique Technique1
{
    pass Pass1
    {
        VertexShader = compile vs_2_0 VertexShaderFunction();
        PixelShader = compile ps_2_0 PixelShaderFunction();
    }
}

The problem gets worse as the ambient light increases. I've done some messing around and noticed it seems worse going from a red surface to another surface, so I'm convinced it's getting wrong normals, and what should be the flat backdrop is getting the normals of the spheres and the ship above it, again somewhat like the half-pixel offset issue. But I've used the exact same UV calculation in my directional light shader and point light shader, with none of this issue.

If anyone notices anything, knows the problem, or has some debugging advice, anything to help me around this would be great; about a day has been lost fiddling to find the issue.
Thanks in advance,
Bombshell

 


Use of DynamicVertexBuffers for hardware instancing

22 March 2013 - 07:15 PM

One of my friends is on a Game Dev course in college and his final exam is to make a game. He's allowed help on art assets and specific parts of an engine (in this case the graphics) so long as he designs and programs the player interaction and gameplay, so I'm writing him a renderer (in XNA, as it's what he'd prefer to use).

The game will essentially be a vertically scrolling bullet-hell shooter; that said, efficient drawing of many similar units is a must, so I've turned to hardware instancing, but I'm a bit hesitant to continue approaching it the way that I have.

I've set up a basic entity / component system to build around. The DrawComponents register themselves with the DrawManager on creation, which then sorts them by the model and texture they reference. When drawing time comes around, it checks whether there are the same number of instances as the last frame, and whether there is an instance buffer at all.

If there is not the same number of instances, or there isn't a buffer, then a new DynamicVertexBuffer is made and filled (if there was a buffer, it is disposed of first). If there was a suitable buffer, its data is simply regathered and set (for instances which may have moved or rotated or whatever).

It feels a bit dirty creating a new buffer whenever I need it resized, but there doesn't seem to be a way to resize one. I'm not getting any performance issues; I'm just wondering if there might be a preferred way to approach hardware instancing for objects which may or may not exist consistently?
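The gist of what the DrawManager does each frame is this (a simplified sketch; InstanceData is my own instance vertex struct, instanceDeclaration its matching VertexDeclaration, and instanceBuffer a DynamicVertexBuffer field):

using Microsoft.Xna.Framework.Graphics;

// Recreate the instance buffer only when the instance count changes;
// otherwise just refresh its contents with the latest transforms.
void UpdateInstanceBuffer(GraphicsDevice device, InstanceData[] instances)
{
    if (instanceBuffer == null || instanceBuffer.VertexCount != instances.Length)
    {
        if (instanceBuffer != null)
            instanceBuffer.Dispose(); // can't resize in place, so replace it

        instanceBuffer = new DynamicVertexBuffer(
            device, instanceDeclaration, instances.Length, BufferUsage.WriteOnly);
    }

    // Refill every frame, since instances may have moved or rotated.
    instanceBuffer.SetData(instances, 0, instances.Length, SetDataOptions.Discard);
}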

Thanks in advance,
Bombshell


Path Finding Potential Field problem

01 December 2012 - 04:02 AM

I'm writing my own pathfinding, aimed at guiding crowds of enemies, and I've gone for a potential field approach.
While recalculating, it keeps looping through already-completed elements, which is obviously not what I want.
I've tried various things but I can't figure out what's going on. It's likely something stupid and small, but I've tried many things to prevent this; now I'm even using a dictionary to log all the completed elements and checking whether a key is in the dictionary before adding it, but despite my efforts it's still adding completed elements!

U, D, L, R are the elements up, down, left and right respectively of the element in question. It's worth noting that the field wraps around the map, so with a map of 32 x 32 elements, element[0,0].L will be element[31,0].
EDIT: position is the element's position within the array of elements making up the field.
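The wrap-around linking is plain modulo arithmetic when the field is built, something like this (a sketch, assuming U means y - 1 and the field is width x height elements):

// Link each element to its neighbours, wrapping at the map edges
// so the field forms a torus.
element[x, y].L = element[(x + width - 1) % width, y];
element[x, y].R = element[(x + 1) % width, y];
element[x, y].U = element[x, (y + height - 1) % height];
element[x, y].D = element[x, (y + 1) % height];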

public void start()
{
    value = 0;
    List<Potential> list;
    List<Potential> next;
    List<Potential> A = new List<Potential>();
    List<Potential> B = new List<Potential>();
    // completed elements, keyed on position (the element's coordinates in the field array)
    Dictionary<Point, Potential> done = new Dictionary<Point, Potential>();
    done.Add(this.position, this);
    if (U.value >= 0) A.Add(U);
    if (D.value >= 0) A.Add(D);
    if (L.value >= 0) A.Add(L);
    if (R.value >= 0) A.Add(R);
    list = A;
    next = B;
    int x = 1; // distance of the current wavefront from the goal
    while (list.Count > 0)
    {
        foreach (Potential i in list)
        {
            done.Add(i.position, i);
            i.recalc(x);
            // queue any unfinished neighbours for the next wavefront
            if (!done.ContainsKey(i.U.position) && i.U.value >= 0) next.Add(i.U);
            if (!done.ContainsKey(i.D.position) && i.D.value >= 0) next.Add(i.D);
            if (!done.ContainsKey(i.L.position) && i.L.value >= 0) next.Add(i.L);
            if (!done.ContainsKey(i.R.position) && i.R.value >= 0) next.Add(i.R);
        }
        // swap the two buffers so "next" becomes the list to process
        list.Clear();
        list = (list == A ? B : A);
        next = (next == A ? B : A);
        x++;
    }
}

Any and all help would be greatly appreciated,
thanks in advance,
Bombshell

BIG EDIT: I'm a complete dumbo, I completely missed that elements may be added to "next" multiple times. I was stuck on this for at least 2 hours and it was one of those silly little mistakes! So sorry to waste your time with this topic.
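In case it helps anyone else who lands here: the fix is to mark an element as done at the moment it's queued, rather than when it's processed, so the same neighbour can never enter "next" twice:

// Mark neighbours as done when queueing them, not when processing them.
// (The four seed elements added to A need the same treatment.)
foreach (Potential i in list)
{
    i.recalc(x);
    if (!done.ContainsKey(i.U.position) && i.U.value >= 0) { done.Add(i.U.position, i.U); next.Add(i.U); }
    if (!done.ContainsKey(i.D.position) && i.D.value >= 0) { done.Add(i.D.position, i.D); next.Add(i.D); }
    if (!done.ContainsKey(i.L.position) && i.L.value >= 0) { done.Add(i.L.position, i.L); next.Add(i.L); }
    if (!done.ContainsKey(i.R.position) && i.R.value >= 0) { done.Add(i.R.position, i.R); next.Add(i.R); }
}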
