Community Reputation

298 Neutral

About ginkgo

  1. Shadow Mapping Wobble

    That looks as if the time at which you render the frame doesn't coincide with the time at which you render the shadow map. Did you render the shadow map in a previous frame, or update the camera settings between shadow-map and frame rendering?
  2. "roll" in a raycaster

    Might it be possible to do the raycasting along vertical lines, but then write the resulting pixels along diagonal lines using Bresenham's algorithm? The Bresenham pattern would be the same for all lines, so there should be no holes; it could also be stored and reused. The angle of the lines should be the inverse of the actual roll. You also need a height offset, otherwise you just get a skewed image. I guess there might be some artifacts due to the aliasing in the Bresenham lines, but that should be acceptable.
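    The idea above can be sketched roughly as follows. This is a minimal illustration, not from the post itself; all names (`rolled_column_offsets`, `draw_rolled_column`) are made up, and the offset table uses simple rounding in place of a full Bresenham stepper:

```cpp
#include <cmath>
#include <vector>

// One shared offset table per roll angle: for each vertical step y, the
// horizontal shift of the slanted line. Because every screen column reuses
// the same table, adjacent columns stay adjacent and no holes appear.
std::vector<int> rolled_column_offsets(int height, float roll_radians) {
    float slope = std::tan(roll_radians);        // horizontal shift per row
    std::vector<int> dx(height);
    for (int y = 0; y < height; ++y)
        dx[y] = (int)std::lround(y * slope);     // Bresenham-like integer steps
    return dx;
}

// Writes one raycast column into 'screen' along the slanted line.
void draw_rolled_column(std::vector<int>& screen, int width, int height,
                        int x, const std::vector<int>& dx,
                        const std::vector<int>& column_pixels) {
    for (int y = 0; y < height; ++y) {
        int sx = x + dx[y];
        if (sx >= 0 && sx < width)
            screen[y * width + sx] = column_pixels[y];
    }
}
```

    Since the table only depends on the roll angle, it can be cached and rebuilt only when the roll changes.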
  3. Emulating Render to Texture

    z is the z-coordinate of the vertex.
  4. Emulating Render to Texture

    If z = 0, then you have a problem with your perspective: a vertex is located on the eye plane. This division-by-zero problem can't happen in regular graphics pipelines because the depth division happens after near-clipping. I guess my first attempt at avoiding this problem would simply be adding a very small bias value to z if it's 0.
  5. Emulating Render to Texture

    You can do perspective correction yourself by interpolating iz=1/z and iuv=uv/z as varyings and reconstructing uv in the fragment shader as iuv/iz.
  6. We use a very simplified software rasterizer for voxelizing triangle meshes. We set up an orthogonal projection of the model and rasterize it into a simple A-buffer data structure. (A-buffers are framebuffers with a per-pixel list of fragment depths.) After that we can sort those lists, walk through them from front to back, and set the ranges between fragments to either inside or outside in the voxel volume.
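    The fill step for one pixel's list can be sketched like this (a minimal illustration assuming an even-odd entry/exit rule; `fill_column` is a made-up name and depths are already in voxel units):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// One pixel's A-buffer list holds the depths of all fragments along its ray.
// After sorting, consecutive entry/exit pairs bound the "inside" ranges.
std::vector<bool> fill_column(std::vector<float> depths, int num_voxels) {
    std::sort(depths.begin(), depths.end());
    std::vector<bool> inside(num_voxels, false);
    for (std::size_t i = 0; i + 1 < depths.size(); i += 2) {
        int z0 = (int)std::ceil(depths[i]);       // first voxel after entry
        int z1 = (int)std::floor(depths[i + 1]);  // last voxel before exit
        for (int z = std::max(z0, 0); z <= std::min(z1, num_voxels - 1); ++z)
            inside[z] = true;
    }
    return inside;
}
```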
  7. Short answer: You can't really. Representing a point light requires an infinite number of coefficients. Long answer: You can approximate it using a circular shape. The paper "Algorithms for Spherical Harmonic Lighting" by Ian G. Lisle and S.-L. Tracy Huang gives a method for calculating the coefficients for this directly.
  8. Modify pixel position

    This is something that is usually not done using shaders. Instead you just set the viewport to the split-screen area and update the scissor rectangle accordingly. (At least with OpenGL; I guess it's pretty much the same with D3D.) Afterwards you just render using the standard shaders.
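    A minimal sketch of the setup, assuming a horizontal split (the helper `split_screen_viewport` is made up for illustration; `glViewport`/`glScissor` are the real OpenGL calls you'd feed the result into):

```cpp
struct Rect { int x, y, w, h; };

// Viewport rectangle for one player in a horizontal split.
Rect split_screen_viewport(int screen_w, int screen_h, int player, int players) {
    int h = screen_h / players;
    return {0, player * h, screen_w, h};
}
// Per player, before rendering with the unchanged shaders:
//   glViewport(r.x, r.y, r.w, r.h);
//   glScissor(r.x, r.y, r.w, r.h);  with GL_SCISSOR_TEST enabled
```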
  9. Take a look at this: http://www.iquilezle.../volumesort.htm It uses different index buffers for different view directions, which should work pretty well for your problem. You can reduce the amount of necessary index data by using instancing or something like that.
  10. Triangle rasterization

    Maybe this article can help you:
  11. Particles with DOF

    If you have a way to sort the particles from back to front, you can use the particle depth to scale and blur them according to their circle of confusion and mix them all together with alpha blending. You could do the scaling of the billboards in the geometry shader. The DoF of the particles wouldn't really interact with the DoF of the rest of the scene, but this shouldn't be too apparent, depending on how the particles are used.
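    The per-particle blur radius can be computed with the standard thin-lens circle-of-confusion formula (a common choice, not something stated in the post; all parameter names are illustrative — aperture diameter A, focal length f, focus distance zf, particle depth z, consistent units):

```cpp
#include <cmath>

// Diameter of the circle of confusion for a point at depth z.
// Particles at the focus distance get 0; the billboard scale/blur in the
// geometry shader would grow with this value.
float circle_of_confusion(float A, float f, float zf, float z) {
    return std::fabs(A * f * (z - zf) / (z * (zf - f)));
}
```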
  12. OpenGL Bloom Fail

    Those black artifacts are often indicative of NaN pixels, which get drawn as black on most GPUs. NaN mostly results from divisions by 0, one typical example being 0/0. Since any operation with a NaN value results in a NaN, you then get those black block artifacts when applying a filter. You can test this in GLSL at the end of your shader with this code snippet: [code]if (any(isnan(color))) { color = vec4(1,0,0,1); }[/code] This will set NaN pixels to red.
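    The same failure mode can be illustrated on the CPU (a made-up helper resembling a typical weight normalization in a bloom/blur pass): one 0/0 yields NaN, and every later filter tap that touches it stays NaN, which is why a single bad pixel grows into a black block.

```cpp
#include <cmath>

// NaN when value == weight_sum == 0, e.g. an empty filter footprint.
float unsafe_normalize_weight(float value, float weight_sum) {
    return value / weight_sum;
}
```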
  13. My Shader Looks Lame (with pic)

    You can create a glossy highlight by clamping the Phong lobe. Image-based lighting is even simpler and creates great results; this works as long as you don't change the viewing angle. You can also do reflection mapping with sphere maps.
  14. You should use lighting to calculate alpha. Just normal Lambert lighting should do the trick: [code]light_dir = normalize(light_dir); normal = normalize(normal); float alpha = max(0.0, dot(light_dir, normal));[/code] You will need to interpolate the normals for that.
  15. Texture a Sphere