

Member Since 20 Aug 2011
Offline Last Active Feb 07 2015 06:13 AM

Topics I've Started

texture artifacts on voxel terrain

04 February 2015 - 03:40 AM

I am working on a voxel engine and I'm trying to texture my terrain, but I get very strange artifacts.

I look up the texture coordinates from the world position; the relevant line of my shader code is

vec3 vtex = mod(my_vWorldPosition.xzy, 64.0) / 64.0;

It seems that whenever worldpos.xyz / 64.0 is 0 the artifacts appear; if I offset the position by 0.0001, they are gone.


I think it's because of precision loss or something. How can I overcome this?
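For what it's worth, the wrap that shader line produces can be sketched in plain JavaScript (the `tex` helper below is only an illustration of `mod(p, 64.0) / 64.0`, not part of any engine). One likely culprit is not precision loss but the jump from ~1 back to 0 at each multiple of 64: the screen-space derivative of the coordinate becomes huge along that seam, which can push the sampler to the smallest mip level there.

```javascript
// GLSL-style mod (result takes the sign of the divisor), normalised to [0,1).
const tex = p => (((p % 64) + 64) % 64) / 64;

// Just below a chunk boundary the coordinate is almost 1...
console.log(tex(63.999)); // ~0.99998
// ...and exactly at the boundary it snaps back to 0.
console.log(tex(64.0));   // 0
// Negative positions wrap too:
console.log(tex(-0.5));   // ~0.99219
```
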


Here is a picture of how it looks:



calculate position of plane in front of camera?

16 September 2014 - 03:49 PM

My camera's FOV is 45 and its near plane is at 0.1.


I created a transparent plane with


aspect = screenWidth / screenHeight

hNear = 2 * Math.tan( (45 * Math.PI / 180) / 2 ) * 0.101
wNear = hNear * aspect


and set its z position to -0.101, so the plane always stays in front of the camera.


How can I calculate at which Y position I have to set the underwater plane so that only the part below the water appears blue?


I believe I have to find the point at which the water plane (I know its y position) intersects the near clipping plane of the camera?
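Under the simplifying assumption that the camera only pitches (rotates about its x-axis) and the water surface is a horizontal plane, that intersection works out to a local Y of (waterY - camY - n * sin(pitch)) / cos(pitch) on the near plane. A minimal sketch, with all names hypothetical:

```javascript
// Local Y (camera space) on the near plane where the horizontal water plane
// at world height waterY crosses it. Assumes: camera at world height camY,
// pitched `pitch` radians about its x-axis (0 = looking along -Z), near
// plane at distance n. Sketch only -- not taken from any particular engine.
function waterLineOnNearPlane(camY, pitch, n, waterY) {
  // The near-plane centre sits at world height camY + n * sin(pitch);
  // moving one unit "up" on the plane raises the world height by cos(pitch).
  return (waterY - camY - n * Math.sin(pitch)) / Math.cos(pitch);
}

// Camera 2 units above the water, looking straight ahead:
// the water line sits 2 units below the near-plane centre.
console.log(waterLineOnNearPlane(5, 0, 0.101, 3)); // -2
```

If that local Y falls outside ±hNear / 2, the water line is off-screen and the view is entirely above (or below) the surface.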






how are these effects created?

13 April 2014 - 03:40 AM

I would like to know how these effects are created.




Are these all quad chains with different textures applied to them, or how did they do it?

It's a screenshot from a tower defense game made in the StarCraft 2 engine.





reconstruct depth from z/w?

19 February 2014 - 06:48 AM

Hello, I am fooling around with WebGL and I would like to add fog as a post-process effect.

So I'm trying to get the original depth value to apply a linear fog.


I've already read through a lot of forum posts but I can't get it to work.


In my depth pass I output the depth as

float depth = gl_FragCoord.z / gl_FragCoord.w;
gl_FragColor = vec4( vec3(depth), 1.0 );

And in the postprocessing pass I try to reconstruct it using this method http://www.geeks3d.com/20091216/geexlab-how-to-visualize-the-depth-buffer-in-glsl/.


(my near plane is 0.1 and the far plane of the camera is 20000.0)

float my_z = (-0.1 * 20000.0) / (depth1 - 20000.0);

Using my_z as the depth, my output isn't the same as when I just visualize the depth from my depth pass with:

float depth = gl_FragCoord.z / gl_FragCoord.w;
float color = 1.0 - smoothstep( 1.0, 200.0, depth );
gl_FragColor = vec4( vec3(color), 1.0 );

So I expected to get the same result when using the reconstructed Z in my post-processing pass:

float my_z = (-0.1 * 20000.0) / (texture2D( tDepth, texCoord ).x - 20000.0);
float color = 1.0 - smoothstep( 1.0, 200.0, my_z );
gl_FragColor = vec4( vec3(color), 1.0 );

Outputting my_z gives me this result: http://i.imgur.com/8C3reNd.png


So what am I doing wrong here? 
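For reference, the linearization from the linked geeks3d page can be sketched in plain JavaScript. Note that it expects the hardware depth buffer value (gl_FragCoord.z, in [0,1]), not z / w, which may be part of the mismatch described above; the 0.1 and 20000.0 below are the near/far values from the post.

```javascript
// Convert a window-space depth zb in [0,1] back to a view-space distance,
// following the linearization on the linked page (n = near, f = far).
function linearizeDepth(zb, n, f) {
  const zn = 2.0 * zb - 1.0;                      // window [0,1] -> NDC [-1,1]
  return (2.0 * n * f) / (f + n - zn * (f - n));  // view-space distance in [n,f]
}

console.log(linearizeDepth(0.0, 0.1, 20000.0)); // ~0.1 (near plane)
console.log(linearizeDepth(1.0, 0.1, 20000.0)); // ~20000 (far plane)
```
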

combine 2 scenes?

01 February 2014 - 10:42 AM

I'm trying to implement deferred shading in WebGL and render transparent objects in another scene / pass with forward rendering.


What's bugging me is how I can combine the transparent scene with the output of the deferred renderer.

I thought of using the depth from the G-buffer in the forward rendering shader and discarding every fragment whose z is greater than the z from the depth pass.


Would that work?
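The rule described above can be sketched per pixel in plain JavaScript (assuming both passes write comparable linear depths; all names here are illustrative, not from any engine):

```javascript
// Composite one forward-rendered (transparent) fragment over the deferred
// result: drop it when it lies behind the opaque depth from the G-buffer
// (the "discard" case), otherwise alpha-blend it over the deferred colour.
function compositePixel(opaqueColor, opaqueDepth, transpColor, transpAlpha, transpDepth) {
  if (transpDepth > opaqueDepth) return opaqueColor; // occluded -> discard
  return opaqueColor.map(
    (c, i) => transpColor[i] * transpAlpha + c * (1 - transpAlpha)
  );
}

// Transparent fragment behind the opaque one: discarded.
console.log(compositePixel([1, 0, 0], 5.0, [0, 0, 1], 0.5, 10.0)); // [1, 0, 0]
// In front of it: blended 50/50.
console.log(compositePixel([1, 0, 0], 5.0, [0, 0, 1], 0.5, 2.0));  // [0.5, 0, 0.5]
```
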