coffeecup

Members
  • Content count: 19

Community Reputation

143 Neutral

About coffeecup

  • Rank
    Member
  1. I'm trying to generate seamless 3d textures and found this and this blog entry from JTippetts: in the first he uses 4d noise to generate a tileable 2d texture, and in the second he posted a mapping function which requires 6d noise.   Does it really only work with 6d noise? I can hardly find any implementation of 6d noise.
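    For reference, a minimal sketch of the 4d case (the tileable 2d texture from the first entry), assuming some noise4(x, y, z, w) implementation such as a port of 4d simplex noise; the function name is illustrative:

      function tileableNoise2D(u, v, noise4) {
        // map the unit square onto two circles, one per texture axis,
        // so that u = 0 and u = 1 (and v = 0 and v = 1) sample the same point
        var TWO_PI = 2.0 * Math.PI;
        var nx = Math.cos(u * TWO_PI);
        var ny = Math.sin(u * TWO_PI);
        var nz = Math.cos(v * TWO_PI);
        var nw = Math.sin(v * TWO_PI);
        return noise4(nx, ny, nz, nw);
      }

    Each tiling axis costs one circle, i.e. two noise dimensions, which is presumably why the second entry's mapping function for a tileable 3d texture asks for 6d noise.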
  2. texture artifacts on voxel terrain

    vertexshader

      attribute vec3 position;
      uniform sampler2D texture_0;
      varying vec3 my_vWorldPosition;

      void main() {
        vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );
        gl_Position = projectionMatrix * mvPosition;
        my_vWorldPosition = (modelMatrix * vec4( position, 1.0 )).xyz;
      }

    fragmentshader

      uniform sampler2D texture_0;
      varying vec3 my_vWorldPosition;

      vec2 computeSliceOffset(float slice, float slicesPerRow, vec2 sliceSize) {
        return sliceSize * vec2(mod(slice, slicesPerRow), floor(slice / slicesPerRow));
      }

      vec4 sampleAs3DTexture(sampler2D tex, vec3 texCoord, float size, float numRows, float slicesPerRow) {
        float slice   = texCoord.z * size;
        float sliceZ  = floor(slice);
        float zOffset = fract(slice);

        vec2 sliceSize = vec2(1.0 / slicesPerRow,   // u space of 1 slice
                              1.0 / numRows);       // v space of 1 slice

        vec2 slice0Offset = computeSliceOffset(sliceZ, slicesPerRow, sliceSize);
        vec2 slice1Offset = computeSliceOffset(sliceZ + 1.0, slicesPerRow, sliceSize);

        vec2 slicePixelSize = sliceSize / size;               // space of 1 pixel
        vec2 sliceInnerSize = slicePixelSize * (size - 1.0);  // space of size pixels

        vec2 uv = slicePixelSize * 0.5 + texCoord.xy * sliceInnerSize;
        vec4 slice0Color = texture2D(tex, slice0Offset + uv);
        vec4 slice1Color = texture2D(tex, slice1Offset + uv);
        //return mix(slice0Color, slice1Color, zOffset);
        return slice0Color;
      }

      void main() {
        vec3 vtex = mod(my_vWorldPosition.xzy, 64.0) / 64.0;
        gl_FragColor = sampleAs3DTexture(texture_0, vtex, 64.0, 8.0, 8.0);
      }
  3. I am working on a voxel engine and I'm trying to texture my terrain, but I get very strange artifacts.   I look up the texture coords depending on the world position; the relevant line of my shader code is

      vec3 vtex = mod(my_vWorldPosition.xzy, 64.0) / 64.0;

    It seems the artifacts appear whenever worldpos.xyz / 64.0 is 0; if I offset the position by 0.0001, they are gone.   I think it's because of a precision loss or something. How can I overcome this?   Here is a pic of what it looks like:
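    A tiny numeric illustration of what happens right at a chunk border (reproducing the mod() in JavaScript, where % matches GLSL's mod for positive coordinates):

      function vtex(worldCoord) {
        return (worldCoord % 64.0) / 64.0;
      }
      console.log(vtex(63.9999)); // ~0.999998
      console.log(vtex(64.0));    // 0.0

    So at multiples of 64 the coordinate wraps from ~1 back to 0, and any interpolation or filtering across that jump sweeps through the whole texture, which would match seams appearing exactly where worldpos / 64.0 hits an integer; the 0.0001 offset presumably works because it moves the wrap off the vertex positions.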
  4. calculate position of plane in front of camera?

      So just to be sure: I should get this value, which I can plug in as the Y position for the transparent plane? I think I'm missing something.
  5. calculate position of plane in front of camera?

    For one moment I thought it worked, but I'm getting strange values. Still, I think I get what you are suggesting, and it should work: with the view-projection matrix I can project my water plane (where I know Y) from world space to clip space, and then set the y value of the transparent plane so that it aligns with the water plane.   Do I have to consider the z position of my transparent plane here?   Btw, is clip space the same as camera space?
  6. My camera's fov is 45 and near is 0.1.   I created a transparent plane with

      aspect = screenWidth / screenHeight
      hNear = 2 * Math.tan( 45 * 2 * Math.PI / 360 / 2 ) * 0.101
      wNear = hNear * aspect

    and set its z position to -0.101, so the plane always stays in front of the camera.   How can I calculate at which Y position I have to place the underwater plane so that only the part below water appears to be blue?   I believe I have to get the point at which the water plane (I know its y position) intersects the near clipping plane of the camera?
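    A minimal sketch of one way to get that, assuming a three.js-style camera (Vector3.project does the world-to-NDC transform) and a camera that isn't rolled and is looking roughly horizontally; waterY is an illustrative name:

      // NDC y (in [-1, 1]) of the water line on the near plane
      function waterLineNdcY(camera, waterY) {
        var forward = new THREE.Vector3();
        camera.getWorldDirection(forward);
        // a point on the near plane, moved to water height
        var p = camera.position.clone().add(forward.multiplyScalar(camera.near));
        p.y = waterY;
        p.project(camera); // world space -> normalized device coordinates
        return p.y;
      }

      // the overlay plane is hNear tall, so in its local coordinates:
      var localY = waterLineNdcY(camera, waterY) * hNear / 2.0;

    Since the plane spans the full near-plane height, NDC y maps linearly onto it: -1 is the bottom edge, +1 the top, hence the * hNear / 2.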
  7. how are these effects created?

    I took the screenshot from the video here: http://www.youtube.com/watch?v=MDi9hjTfRPA#t=1047   Do you think that the blue shot coming in from the right side is also a particle system?   I read something about creating laser beams here: http://codepoke.net/2011/12/27/opengl-libgdx-laser-fx/ and I was wondering if I could use the same technique for other effects like the ones in the youtube video.
  8. I would like to know how these effects are created.   Are these all quad chains with different textures applied to them, or how did they do it? It's a screenshot from a tower defense game made in the Starcraft 2 engine.
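    Beam-style effects like these are often just textured quads stretched between two points with additive blending (the same trick as in the libgdx laser article linked above). A minimal sketch in three.js terms, under that assumption:

      function makeBeam(start, end, texture, width) {
        var length = start.distanceTo(end);
        var geometry = new THREE.PlaneGeometry(length, width);
        var material = new THREE.MeshBasicMaterial({
          map: texture,                       // the beam texture does most of the work
          blending: THREE.AdditiveBlending,   // overlapping beams brighten each other
          transparent: true,
          depthWrite: false                   // avoid punching holes into other transparents
        });
        var mesh = new THREE.Mesh(geometry, material);
        mesh.position.copy(start).add(end).multiplyScalar(0.5); // midpoint
        mesh.lookAt(end);                 // local +z now points along the beam
        mesh.rotateY(-Math.PI / 2);       // turn the quad's long (x) axis onto the beam
        return mesh;
      }

    In practice the quad is usually also rotated around the beam axis to face the camera (or two quads are crossed at 90 degrees) so it never appears edge-on, and the muzzle and impact flashes would be separate billboard sprites or particles.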
  9. reconstruct depth from z/w ?

    Ofc, just after posting this I found another error; now I just output z/w and use it in the other shader.
  10. Hello, I am fooling around with webgl and I would like to add fog as a postprocessing effect, so I'm trying to get the original depth value to apply a linear fog.   I've already read through a lot of forum posts but I can't get it to work.   In my depth pass I output the depth as

      float depth = gl_FragCoord.z / gl_FragCoord.w;
      gl_FragColor = vec4( vec3(depth), 1.0 );

    and in the postprocessing pass I try to reconstruct it using this method: http://www.geeks3d.com/20091216/geexlab-how-to-visualize-the-depth-buffer-in-glsl/ (my near plane is 0.1 and the far plane of the camera is 20000.0):

      float my_z = (-0.1 * 20000.0) / (depth1 - 20000.0);

    Using my_z as depth, my output isn't the same as when I just visualize the depth from my depth pass with:

      float depth = gl_FragCoord.z / gl_FragCoord.w;
      float color = 1.0 - smoothstep( 1.0, 200.0, depth );
      gl_FragColor = vec4( vec3(color), 1.0 );

    I expected to get the same result when using the reconstructed Z in my postprocessing pass:

      float my_z = (-0.1 * 20000.0) / (texture2D( tDepth, texCoord ).x - 20000.0);
      float color = 1.0 - smoothstep( 1.0, 200.0, my_z );
      gl_FragColor = vec4( vec3(color), 1.0 );

    Outputting my_z gives me this result: http://i.imgur.com/8C3reNd.png   So what am I doing wrong here?
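    For comparison, here is the usual mapping between the [0, 1] depth-buffer value and eye-space depth for a standard perspective projection (the relationship the geeks3d formula inverts), written out in JavaScript just to keep the math explicit; n and f are near/far. Note that gl_FragCoord.z / gl_FragCoord.w is not that [0, 1] buffer value, so if the depth pass writes z/w, running the stored result through a buffer-linearization formula converts something that was never in buffer form:

      // window-space depth in [0, 1], i.e. what gl_FragCoord.z stores,
      // for a fragment zEye units in front of the camera
      function zBufferFromEye(zEye, n, f) {
        var zNdc = (f + n) / (f - n) - (2.0 * f * n) / ((f - n) * zEye);
        return 0.5 * zNdc + 0.5;
      }

      // the inverse: recover eye-space depth from the stored buffer value
      function eyeFromZBuffer(zBuffer, n, f) {
        var zNdc = 2.0 * zBuffer - 1.0;
        return (2.0 * f * n) / (f + n - zNdc * (f - n));
      }

    So the two passes have to agree on one encoding: either store gl_FragCoord.z and linearize it in the fog pass, or store a linear eye-space depth directly and use it as-is.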
  11. combine 2 scenes?

    I'm trying to implement deferred shading in webgl and to render transparent objects in another scene / pass with forward rendering.   What's bugging me is how I can combine the transparent scene with the output of the deferred renderer. I thought of using the depth from the gbuffer in the forward rendering shader and discarding every fragment whose z is greater than the z from the depth pass.   Would that work?
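    That manual depth test is a common approach when the two passes can't share a real depth buffer. A minimal sketch, assuming three.js-style materials and a gbuffer depth target that stores linear eye-space depth; gbufferDepth, screenWidth and screenHeight are illustrative names:

      var transparentMaterial = new THREE.ShaderMaterial({
        transparent: true,
        uniforms: {
          tDepth:     { value: gbufferDepth },  // depth texture written by the gbuffer pass
          resolution: { value: new THREE.Vector2(screenWidth, screenHeight) },
          color:      { value: new THREE.Vector4(0.2, 0.5, 1.0, 0.5) }
        },
        vertexShader: [
          'varying float vEyeDepth;',
          'void main() {',
          '  vec4 mvPosition = modelViewMatrix * vec4( position, 1.0 );',
          '  vEyeDepth = -mvPosition.z; // positive distance in front of the camera',
          '  gl_Position = projectionMatrix * mvPosition;',
          '}'
        ].join('\n'),
        fragmentShader: [
          'uniform sampler2D tDepth;',
          'uniform vec2 resolution;',
          'uniform vec4 color;',
          'varying float vEyeDepth;',
          'void main() {',
          '  vec2 screenUv = gl_FragCoord.xy / resolution;',
          '  float opaqueDepth = texture2D( tDepth, screenUv ).x; // same encoding as the gbuffer pass',
          '  if (vEyeDepth > opaqueDepth) discard; // hidden behind an opaque surface',
          '  gl_FragColor = color;',
          '}'
        ].join('\n')
      });

    Then render the transparent scene on top of the deferred output without clearing the color buffer. The alternative, where supported, is to attach the same depth buffer to both passes (e.g. via a depth texture extension) and let the regular depth test do this for free.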
  12. get colliding face of 2 AABB's?

    Thanks, that indeed helped; I got a little confused reading so much stuff about SAT.   Now I calculate the distance between the 2 boxes using the min and max values for each axis and take the largest distance, which gives me the correct normal.   And the plural of axis is axes.
  13. get colliding face of 2 AABB's?

  13. So far I have 2 AABB's, one stationary and one moving; by checking the min(x,y,z) and max(x,y,z) of the 2 AABB's I can tell if they intersect.   Now I would like to implement a "slide against the wall" effect, so I need to get the normal of the colliding face, but I have trouble understanding how I can compute which face is colliding.   I googled a lot and found something about SAT ( http://www.codezealot.org/archives/55 ), but I am still confused: is this the only technique to get the colliding face, or are there other options available?   I would appreciate some hints or pseudo code to get me started.
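    A common approach that avoids full SAT for the axis-aligned case: for two intersecting AABB's the overlap interval is positive on every axis, and the axis with the smallest overlap is the face that was hit. A minimal sketch (boxes as { min: {x,y,z}, max: {x,y,z} }, names illustrative):

      function collisionNormal(a, b) {
        var axes = ['x', 'y', 'z'];
        var bestAxis = 'x';
        var bestDepth = Infinity;
        for (var i = 0; i < axes.length; i++) {
          var ax = axes[i];
          // size of the overlap interval on this axis (positive when intersecting)
          var depth = Math.min(a.max[ax], b.max[ax]) - Math.max(a.min[ax], b.min[ax]);
          if (depth < bestDepth) {
            bestDepth = depth;
            bestAxis = ax;
          }
        }
        var normal = { x: 0, y: 0, z: 0 };
        var aCenter = (a.min[bestAxis] + a.max[bestAxis]) / 2;
        var bCenter = (b.min[bestAxis] + b.max[bestAxis]) / 2;
        normal[bestAxis] = aCenter < bCenter ? -1 : 1; // points from b toward a
        return { normal: normal, depth: bestDepth };
      }

    For the sliding itself, remove the velocity component along that normal: v = v - dot(v, n) * n.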
  14. question about clientside prediction

    Thanks for your inputs, and thanks for moving this to the right forum. I added interpolation as mentioned in the FAQs to lerp the client's position to the corrected position, which makes it much smoother, but it's not completely deterministic yet since I just apply the inputs as they arrive. It seems what I am missing is timestamping the inputs so they are processed for the same duration. I discarded this at first because I wasn't sure how to handle an input that arrives in the past, or how to set the client time correctly. At the moment I set my current client time to the server time from the latest snapshot and subtract RTT/2 to match the server time, but I guess I instead have to set the client time to the latest snapshot time plus some average of the last 10 sampled RTTs, to make sure the inputs get to the server just before they are needed.
  15. I wrote an authoritative server / client model and read through a few tutorials about client-side prediction. Most of them assume that when the client presses a key, it sends a message like "move right one point". When a server update arrives, the client sets the local player position back to the snapshot and replays all inputs which aren't yet acknowledged by the server. I have got client-side prediction working with this approach, but I would like to set my velocity from input and calculate the position over time, so when I press "W" my client starts moving forward until I release "W". This way my client always gets out of sync, because it takes some time for the message to reach the server; for example, if I press W for about 150ms, the server simulates 153ms, which causes stuttering on the next snapshot. What's troubling me here is that many tutorials say "if the client/server use shared code for their entities, prediction errors will never really occur", but in my case they always occur slightly. I tried timestamping the inputs "in the future" and applying them for the duration which the client tells the server, but this doesn't feel right since it should be an authoritative server/client model. How can I overcome this?
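    One way to make the durations deterministic is to quantize input into fixed-size ticks, so client and server both simulate exactly the same amount of time per input, and to replay unacknowledged ticks after each snapshot. A minimal sketch; send, keys, and snap.lastProcessedSeq are illustrative names:

      var TICK_MS = 50;            // one input covers exactly this much simulated time
      var sequence = 0;
      var pendingInputs = [];

      function clientTick(keys, player) {
        var input = { seq: sequence++, w: !!keys.W, s: !!keys.S, ms: TICK_MS };
        applyInput(player, input); // predict locally with the shared simulation code
        pendingInputs.push(input); // keep it for replay after the next snapshot
        send(input);               // the server applies it for input.ms, not "until released"
      }

      function onSnapshot(snap, player) {
        player.x = snap.x;         // rewind to the authoritative state
        player.z = snap.z;
        // drop everything the server has already processed, replay the rest on top
        while (pendingInputs.length > 0 && pendingInputs[0].seq <= snap.lastProcessedSeq) {
          pendingInputs.shift();
        }
        for (var i = 0; i < pendingInputs.length; i++) {
          applyInput(player, pendingInputs[i]);
        }
      }

      function applyInput(player, input) { // identical on client and server
        var speed = 5.0;                   // units per second, shared constant
        var dt = input.ms / 1000.0;
        if (input.w) player.z -= speed * dt;
        if (input.s) player.z += speed * dt;
      }

    Because each input carries its own fixed duration, the 150ms-versus-153ms drift can't happen: the server never has to guess how long a key was held, yet it stays authoritative since it still validates and applies every input itself.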