

george7378

Member Since 24 Jan 2013
Offline Last Active Apr 03 2015 04:19 AM

#5196920 Annoying shadow map artifacts

Posted by george7378 on 08 December 2014 - 04:13 AM

Ah, OK, never mind - it turns out I was clearing my shadow map to D3DCOLOR_XRGB(1, 1, 1) rather than D3DCOLOR_XRGB(255, 255, 255), meaning that the whole thing was basically black.
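For anyone hitting the same thing: D3DCOLOR_XRGB packs each channel as an 8-bit integer in the 0–255 range, so (1, 1, 1) is almost black. A minimal CPU-side sketch of the packing (my own helper, mirroring what the macro does):

```cpp
#include <cstdint>

// Same packing as the D3DCOLOR_XRGB macro: X8R8G8B8, alpha forced to 0xff.
constexpr uint32_t xrgb(uint32_t r, uint32_t g, uint32_t b) {
    return (0xffu << 24) | ((r & 0xffu) << 16) | ((g & 0xffu) << 8) | (b & 0xffu);
}
// xrgb(1, 1, 1)       -> 0xFF010101 (almost black)
// xrgb(255, 255, 255) -> 0xFFFFFFFF (white, i.e. maximum depth)
```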




#5196850 Annoying shadow map artifacts

Posted by george7378 on 07 December 2014 - 03:06 PM

OK, so changing my sampler to:

 

AddressU = Border; 
AddressV = Border;
BorderColor = float4(1, 1, 1, 1);

 

...and removing the conditional has solved the problem. Thanks! So just to make sure I know what's going on here: tex2D(ShadowMapSampler, projTexCoord).r returns the maximum depth (1) for any pixel outside the shadow map, and hence the pixel's light-space depth from PSIn.Pos2DLight minus f_ShadowBias will always be less than the depth stored in the shadow map?

 

Thanks again :)
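A CPU-side sketch of that border behaviour (hypothetical names; the real work happens in the sampler and pixel shader). Outside the [0, 1] texture-coordinate range the border colour of 1.0 (maximum depth) is returned, so the comparison can never flag shadow there:

```cpp
#include <algorithm>

// Return the stored depth, or the border colour (1.0 = max depth) outside
// the [0,1] range -- mirroring AddressU/V = Border with a white border.
float sampleShadowMap(const float* depthMap, int size, float u, float v) {
    if (u < 0.0f || u > 1.0f || v < 0.0f || v > 1.0f)
        return 1.0f;                        // border colour: max depth
    int x = std::min(int(u * size), size - 1);
    int y = std::min(int(v * size), size - 1);
    return depthMap[y * size + x];
}

// A pixel is shadowed when its (biased) depth is behind the stored depth.
bool inShadow(const float* depthMap, int size,
              float u, float v, float pixelDepth, float bias) {
    return pixelDepth - bias > sampleShadowMap(depthMap, size, u, v);
}
```

Since pixel depths never exceed 1, any sample landing in the border can never be reported as shadowed, which is why the explicit conditional became unnecessary.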




#5189241 Spherified cube

Posted by george7378 on 26 October 2014 - 10:56 AM

Thank you for the replies. I tried it and found that it is indeed incredibly simple to do, assuming the cube's model-space origin is right at its centre, as kauna said.

 

swiftcoder: at the moment I'm just experimenting and this was just a passing thought. I am eventually hoping to apply something like this to planets but I will most likely scale the planet itself down and just render it closer so that I can still use a small mesh.
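For reference, a minimal sketch of the spherification step (assuming, as above, a cube centred at the model-space origin): each vertex is simply normalised and scaled out to the target radius.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Project a cube vertex onto a sphere of the given radius by normalising
// its model-space position (cube assumed centred at the origin).
Vec3 spherify(Vec3 v, float radius) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len * radius, v.y / len * radius, v.z / len * radius };
}
```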




#5189047 Scenes with large and small elements

Posted by george7378 on 25 October 2014 - 02:48 AM

OK, thanks for all the replies! I have solved it for the time being by calculating the actual positions in metres; when I render, I put the camera at (0, 0, 0) and render everything relative to that. For the Earth I'm using a sphere of radius 1 which I render 6378100 times closer than the actual calculated distance. Disabling the z-buffer while rendering the planet makes sure it stays in the background.
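A sketch of that scaling trick for the planet (my own function names; 6378100 m is the Earth's equatorial radius used above): positions stay in metres, the camera sits at the origin, and the unit-sphere Earth is placed along the true direction at 1/6378100 of the true distance.

```cpp
struct Vec3d { double x, y, z; };

const double EARTH_RADIUS_M = 6378100.0;

// Where to place the radius-1 Earth mesh: camera-relative position in
// metres, divided by the real radius, so direction is preserved but the
// planet is rendered 6378100 times closer (and 6378100 times smaller).
Vec3d planetRenderPosition(Vec3d planetPosM, Vec3d cameraPosM) {
    Vec3d rel = { planetPosM.x - cameraPosM.x,
                  planetPosM.y - cameraPosM.y,
                  planetPosM.z - cameraPosM.z };
    return { rel.x / EARTH_RADIUS_M,
             rel.y / EARTH_RADIUS_M,
             rel.z / EARTH_RADIUS_M };
}
```

Because the scale factor equals the planet's radius, the sphere subtends exactly the same angle on screen as the real planet would.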


#5158245 Recalculating terrain normals

Posted by george7378 on 04 June 2014 - 05:12 PM

Sorry for not replying, and thanks for the ideas, everyone. Since my terrain is static and isn't changing per-frame, I think the best approach may be to create another texture-based container to store the normals, which can then be looked up in the shader. I'll see how it goes :)
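A hypothetical sketch of the precompute step (my own names and layout, not the poster's code): derive each normal from the heightmap with central differences, then bake the results into the lookup texture.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Normal at heightmap texel (x, y) via central differences, with edge
// clamping; cellSize is the world-space spacing between texels.
Vec3 terrainNormal(const float* height, int w, int h,
                   int x, int y, float cellSize) {
    auto H = [&](int i, int j) {
        i = i < 0 ? 0 : (i >= w ? w - 1 : i);
        j = j < 0 ? 0 : (j >= h ? h - 1 : j);
        return height[j * w + i];
    };
    float dx = (H(x + 1, y) - H(x - 1, y)) / (2.0f * cellSize);
    float dz = (H(x, y + 1) - H(x, y - 1)) / (2.0f * cellSize);
    // Unnormalised normal of a y-up heightfield is (-dh/dx, 1, -dh/dz).
    Vec3 n = { -dx, 1.0f, -dz };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}
```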




#5151833 Updated ray tracer

Posted by george7378 on 06 May 2014 - 10:34 AM

I've been working on my ray tracer recently, and I've changed it in a few ways:

 

- It now supports multicore rendering

- It can parse external text files and import scenes

- It can handle soft shadows

- It can handle textures

 

Here are a few pics I thought I'd share:

 

http://imgur.com/a/sDGaz

 

To do: it would be nice to add the ability to render via path tracing so that I can add caustics, etc., but I haven't really found a good explanation of exactly how to do this. I understand that rather than spawning recursive reflection/refraction rays at each hitpoint, you spawn a single probabilistic ray, but I'm still a bit hazy on how to actually determine the final pixel colour.

 

Thanks for looking :)
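On the path tracing question: this isn't from the thread, but the usual scheme is to terminate each path probabilistically (Russian roulette) and divide surviving contributions by the survival probability, which keeps the average unbiased. A toy sketch with made-up constants, just to show where the division goes:

```cpp
#include <random>

struct Rgb { double r, g, b; };

// Toy single-sample estimator: at each bounce the path either survives
// (probability pContinue) or is absorbed. Surviving contributions are
// divided by pContinue to compensate for the terminated paths.
Rgb tracePath(int depth, std::mt19937& rng) {
    std::uniform_real_distribution<double> uni(0.0, 1.0);
    const double pContinue = 0.8;          // assumed survival probability
    if (depth > 16 || uni(rng) >= pContinue)
        return { 0.0, 0.0, 0.0 };          // path absorbed
    Rgb emitted  = { 0.1, 0.1, 0.1 };      // placeholder surface emission
    Rgb incoming = tracePath(depth + 1, rng);
    const double refl = 0.5;               // placeholder surface albedo
    return { emitted.r + incoming.r * refl / pContinue,
             emitted.g + incoming.g * refl / pContinue,
             emitted.b + incoming.b * refl / pContinue };
}
```

Averaging many such samples per pixel converges to the final colour; any one sample is noisy by design.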




#5137540 Avoiding dark areas in ray tracing

Posted by george7378 on 09 March 2014 - 09:20 AM

Also, another question: when I multiply colours together, for example when I multiply the colour of a reflection by the colour of the surface it reflects off, is it standard to normalise the colour vectors?
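For context, the standard approach is a plain component-wise (modulate) multiply with channels in [0, 1], with no normalisation; normalising would artificially brighten dark surfaces. A minimal sketch:

```cpp
struct Rgb { double r, g, b; };

// Component-wise "modulate" multiply, channels in [0, 1]. No
// normalisation: a dark surface correctly darkens the reflected colour.
Rgb modulate(Rgb a, Rgb b) {
    return { a.r * b.r, a.g * b.g, a.b * b.b };
}
```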




#5137445 Avoiding dark areas in ray tracing

Posted by george7378 on 08 March 2014 - 06:20 PM

Here's how it looks after the new algorithm:

 

fin_1109_large_adaptive.png

 

Still not sure if it's right, but there's some more light in there! The spheres don't reflect any ambient light by the way, just specular.




#5137433 Avoiding dark areas in ray tracing

Posted by george7378 on 08 March 2014 - 04:46 PM

OK, so you mean something like this:

 

trace(ray, float attenuation)
{
    reflectionColour = refractionColour = (0, 0, 0);
    if (intersectedShape.reflectivity*attenuation > cutoff) { reflectionColour = trace(reflectedRay, intersectedShape.reflectivity*attenuation); }
    if (intersectedShape.transparency*attenuation > cutoff) { refractionColour = trace(refractedRay, intersectedShape.transparency*attenuation); }
}

 

i.e. I'd start my initial rays with an attenuation value of 1 and then every time the recursive rays hit a new object, if the attenuation value of the spawned ray would be less than the cutoff, I wouldn't spawn a new ray? I think that makes sense! It could also lead to better efficiency.
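For what it's worth, here's a runnable toy version of that cutoff idea (made-up reflectivity/transparency values), counting how many rays a single primary ray ends up spawning. The attenuation shrinks at every bounce, so the recursion is bounded without a fixed depth limit:

```cpp
// Recursive rays are only spawned while the accumulated attenuation
// stays above CUTOFF; hypothetical scene where every surface has the
// same reflectivity and transparency.
const double CUTOFF = 0.01;

int countRays(double attenuation, double reflectivity, double transparency) {
    int rays = 1;  // this ray itself
    if (reflectivity * attenuation > CUTOFF)
        rays += countRays(reflectivity * attenuation, reflectivity, transparency);
    if (transparency * attenuation > CUTOFF)
        rays += countRays(transparency * attenuation, reflectivity, transparency);
    return rays;
}
```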




#5137321 Avoiding dark areas in ray tracing

Posted by george7378 on 08 March 2014 - 08:48 AM

Hi everyone,

 

I'm getting some nice pictures from my ray tracer but I've noticed a problem with some geometry setups. For example, this pile of spheres:

 

pyramid_1.3h.png

 

Between the lower and the middle level, the inner areas are completely dark. I'm not sure if this is realistic or not (I don't have any spheres to test it with) but it doesn't look right. I think I know why it's happening - my maximum ray depth is 5, and the rays probably get stuck in the cavities and never collect any colour. Is this realistic, and can I fix it without using infinite ray depth?

 

Thanks!




#5136257 Ray tracer minor issue

Posted by george7378 on 04 March 2014 - 02:02 AM

Thanks very much for the replies, Bacterius was correct - my bias was too small. I don't know why I didn't think of that, it's sitting there right underneath my ray depth constant, in big capital letters!

 

Krypt0n - I also enabled floating point exceptions (at least I think I did!) and ran my program in debug in VS2012 and nothing showed up. Thanks for the suggestion :)




#5132003 Making a texture render to itself

Posted by george7378 on 17 February 2014 - 07:24 AM

I've changed it so that I do the ping-pong rendering between two different textures. It's very little extra effort, but it means I'm guaranteed safe results (and I don't get the warnings any more). When I render properly I get exactly the same result with the Gaussian blur as I do with the single texture, so the GPU must be making a separate copy. Thanks for the help!
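For illustration, a CPU analogue of the ping-pong scheme (the real version swaps render targets, of course): each pass reads one buffer and writes the other, then the roles swap, so no pass ever samples the buffer it is rendering to.

```cpp
#include <array>

// Tiny 3-"pixel" stand-in for the two textures; each pass halves the
// values in place of a real blur. buf[src] is read, buf[1 - src] is
// written, then src flips for the next pass.
std::array<float, 3> pingPongHalve(std::array<float, 3> initial, int passes) {
    std::array<float, 3> buf[2] = { initial, {} };
    int src = 0;
    for (int p = 0; p < passes; ++p) {
        int dst = 1 - src;
        for (int i = 0; i < 3; ++i)
            buf[dst][i] = buf[src][i] * 0.5f;  // stand-in for a blur pass
        src = dst;                             // swap roles
    }
    return buf[src];
}
```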




#5130929 Testing some shaders

Posted by george7378 on 12 February 2014 - 06:23 PM

I decided to try my hand at creating a program which focusses on rendering individual images rather than realtime interactive scenes, and here's my first effort:

 

rendering.png

 

I am sort of cheating with the shadow maps: rather than rendering them along with the rest of the scene, I create them separately, manually blur them, and then import them into a scene rendered from the same perspective as the one they were generated for. It saves a lot of messing around with post-processing!

 

I'd say the most complicated part of getting it to work was projecting the shadow maps into the cube map used for the reflections.

 

Anyway, thought I'd share :)

 




#5127043 normalize() producing different results

Posted by george7378 on 28 January 2014 - 03:27 PM

I've come across something weird in HLSL. Well, maybe it's not that weird, but it was unexpected for me. There's probably a simple explanation. Anyway...

 

when I do this in my normal mapping shader:

 

float3 normal = normalize(2.0f*tex2D(TextureSamplerN, PSIn.TexCoords) - 1.0f);

 

...I get a different result to if I do this:

 

float3 normal = 2.0f*tex2D(TextureSamplerN, PSIn.TexCoords) - 1.0f;
normal = normalize(normal);

 

It seems like the latter one produces the correct result when I run my program, but I have no idea why they're different. Any ideas?

 

Thanks!
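A likely explanation (my reading, not confirmed in the thread): tex2D returns a float4, so in the first snippet normalize() divides by the 4-component length, alpha channel included, and only then is the result truncated to float3. Assigning to float3 first truncates, so normalize() uses the 3-component length. A CPU demonstration of the difference:

```cpp
#include <array>
#include <cmath>

// normalize() applied to the full float4, then truncated to three
// components -- the alpha channel contaminates the length.
std::array<float, 3> normalizeThenTruncate(std::array<float, 4> v) {
    float len = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2] + v[3]*v[3]);
    return { v[0] / len, v[1] / len, v[2] / len };
}

// Truncate to three components first, then normalise -- the correct
// unit-length 3D normal.
std::array<float, 3> truncateThenNormalize(std::array<float, 4> v) {
    float len = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    return { v[0] / len, v[1] / len, v[2] / len };
}
```

Taking only the .rgb of the texture sample before the normalize should make the two snippets agree.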
 




#5126356 Experimenting with point lighting

Posted by george7378 on 25 January 2014 - 01:31 PM

I've been advancing my engine to handle point lighting, and I combined this with normal mapping and my water engine to create a few nice shots which I'm gonna share:

Attached Thumbnails

  • wat1.png
  • wat2.png
  • wat3.png
  • wat4.png




