jeremie009

Member Since 21 Sep 2012
Offline Last Active Dec 23 2014 08:22 AM

Topics I've Started

Single non-pinhole camera

14 December 2012 - 03:50 AM

I'm trying to figure out something about this paper: http://www.cs.purdue.edu/cgvlab/papers/popescu/popescuNPI_CGA11.pdf.
I'm not sure how I would implement it in my own game engine.
From what I understood, to create a single non-pinhole occlusion camera I need to project the image along different rays based on the depth value? Or do I need to distort the vertex projection so that the occluded parts become visible?
Also, I'm not sure, but could I use something similar to a fisheye camera?
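
To make the second option concrete, this is a minimal vertex-shader sketch of a depth-dependent distortion, just to illustrate the idea; the names (WorldViewProjection, DistortionCenter, DistortionStrength) and the exact formula are my own placeholders, not the construction from the paper.

// Hypothetical sketch: push the projected vertex away from a chosen centre
// by an amount that grows with depth, so geometry a plain pinhole camera
// would occlude gets nudged into view. Not the paper's occlusion camera.
float4x4 WorldViewProjection;   // assumed combined transform
float2   DistortionCenter;      // NDC point the distortion radiates from
float    DistortionStrength;    // how strongly depth bends the projection

float4 DistortVS(float4 position : POSITION) : POSITION
{
	float4 clipPos = mul(position, WorldViewProjection);

	// Work in normalized device coordinates.
	float2 ndc = clipPos.xy / clipPos.w;
	float depth01 = saturate(clipPos.z / clipPos.w);

	// Farther vertices are displaced more, uncovering what sits behind closer ones.
	ndc += (ndc - DistortionCenter) * depth01 * DistortionStrength;

	clipPos.xy = ndc * clipPos.w;
	return clipPos;
}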

Solved: Reconstruct depth in orthographic view

27 September 2012 - 12:30 AM

I've been trying to get the world-space position of each pixel, but I'm missing something.
I'm using an orthographic view for a 2.5D game. My depth is linear, and this is my code.

// Pixel shader
float3 lightPos = lightPosition;   // light world position, used to draw the sphere volume
float2 texCoord = PostProjToScreen(PSIn.lightPosition) + halfPixel; // used to sample the normal and depth textures

float depth = tex2D(depthMap, texCoord).r;

float4 position;
position.x = texCoord.x * 2 - 1;
position.y = (1 - texCoord.y) * 2 - 1;
position.z = depth;
position.w = 1;
position = mul(position, inViewProjection);
//position.xyz /= position.w; // commented out, but it doesn't work with or without it

float4 normal = (tex2D(normalMap, texCoord) - .5f) * 2;
normal = normalize(normal);

float3 lightDirection = normalize(lightPos - position);
float att = saturate(1.0f - length(lightDirection) / attenuation);
float lighting = saturate(dot(normal, lightDirection));

return float4(lightColor * lighting * att, 1);


I'm rendering a sphere for the light volume, but it's not working the way I want. The textures are reprojected onto the sphere properly, but the light coordinates in the pixel shader seem to be stuck at zero, even though the light volume updates correctly when I move the camera. I'm using an orthographic view (not off-center) and XNA.
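
For comparison, this is a minimal sketch of what I expect the reconstruction to do, assuming the depth texture holds 0..1 clip-space depth and that inViewProjection really is the inverse of the view-projection matrix (I've renamed it InverseViewProjection below to make that assumption explicit); with a pure orthographic projection w should stay 1, so the final divide is harmless.

// Sketch of the reconstruction, assuming the depth texture holds 0..1
// clip-space depth and InverseViewProjection = inverse(view * projection).
// The names are mine, not XNA's.
float4x4 InverseViewProjection;

float3 ReconstructWorldPos(float2 texCoord, float depth01)
{
	float4 clipPos;
	clipPos.x = texCoord.x * 2 - 1;
	clipPos.y = (1 - texCoord.y) * 2 - 1;
	clipPos.z = depth01;
	clipPos.w = 1;

	float4 worldPos = mul(clipPos, InverseViewProjection);
	// With an orthographic projection w stays 1, so this divide is a no-op.
	return worldPos.xyz / worldPos.w;
}

One thing I noticed while writing this out: in my shader above, length(lightDirection) is taken after normalize(), so it is always 1; the distance to the light would have to be measured before normalizing for the attenuation term to vary.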

HDR Bloom with RGBM encoding format

21 September 2012 - 11:31 PM

I've been trying to achieve bloom with an 8-bit render target using RGBM, but after failing several times I've been wondering if it is even possible.

If I'm not mistaken, the alpha channel contains the intensity data, so when I apply the offsets for my Gaussian blur, am I messing up the encoded information?

Here's the code I'm using to encode and decode my image:

float MaxRange = 6;

float3 DecodeRGBM(float4 rgbm)
{
	return rgbm.rgb * (rgbm.a * MaxRange);
}

float4 EncodeRGBM(float3 rgb)
{
	float maxRGB = max(rgb.r, max(rgb.g, rgb.b));
	float M = maxRGB / MaxRange;
	M = ceil(M * 255.0) / 255.0;
	return float4(rgb / (M * MaxRange), M);
}


I'm able to encode and decode any image, but when I try to blur it, the result doesn't look right: highly saturated, with lots of crazy colors.


And here's my blur function (adapted from Matt Pettineo):

// Performs a Gaussian blur in one direction
float4 Blur(float2 texcoord, float2 texScale, float sigma)
{
	float4 color = 0;
	for (int i = -6; i < 6; i++)
	{
		float weight = CalcGaussianWeight(i, sigma);
		float2 texCoord = texcoord;
		texCoord += (i / size) * texScale;
		float4 sample = tex2D(BloomMap, texCoord);
		sample.rgb = DecodeRGBM(sample);
		sample.a = 1;
		sample *= sample;
		color += sample * weight;
	}
	return color;
}

I'm also using XNA, and I'm able to blur the image using HdrBlendable, but I'd prefer to use an 8-bit format so I can use texture filtering on it (XNA no longer supports filtering for many formats).
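
If it helps, this is a sketch of what I think the surrounding pass would need to look like, under the assumption that the blurred result has to be re-encoded before being written back to the 8-bit RGBM target (the direction vector and sigma here are just placeholder values):

// Sketch only: blur in decoded (linear) space, then re-encode before the
// result is written back to the 8-bit RGBM render target.
float4 BlurHorizontalPS(float2 texCoord : TEXCOORD0) : COLOR0
{
	// Blur() already decodes each tap, so the accumulated color is linear.
	float4 blurred = Blur(texCoord, float2(1, 0), 2.5f);

	// Undo the sample *= sample from inside the loop; if that squaring is
	// meant as a cheap gamma trick, it has to be inverted somewhere.
	blurred.rgb = sqrt(max(blurred.rgb, 0));

	// Re-encode so the target still holds valid RGBM data.
	return EncodeRGBM(blurred.rgb);
}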
