eppo

  1. Hamster and the drone factory

Thanks. It's a platforming/parkour game with light strategy elements.
  2. In the game, transportation of goods is handled by little drones you can build/purchase at factories located across the map. You don't control these directly, so it'll require some pathfinding to get them around.

    Because the map is too large to cover with a (dense enough) regular grid, I generate a Poisson field with points packed at a minimal distance based on how close they lie to the underlying scene geometry, so points are distributed more densely near the terrain surface.

    Next, the 1D ordering of the sample points gets optimized by sorting them along a Hilbert-like curve. This is done for two reasons. First, it improves tree construction times, as nearby points are likely to result in similar graphs when passed through an A* search (allowing the ordering of points from previous searches to be reused). Secondly, if points end up with comparable ordering in whatever lookup table they're referred to in, those tables can easily be compressed RLE-style. This allows me to store and traverse all n² flow fields describing the fastest route from every point to every other point in the graph.

    Here I've set the drones to follow the second camera through the flow field. Bye!
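    For illustration, here's a minimal C++ sketch of such a curve sort, using a 3D Z-order (Morton) key as a simple stand-in for the Hilbert-like ordering; positions are assumed pre-quantized to 10 bits per axis, and none of these names come from the actual project:

        #include <algorithm>
        #include <cstdint>
        #include <vector>

        // Insert two zero bits after each of the 10 low bits of x.
        static uint32_t Part1By2(uint32_t x)
        {
            x &= 0x000003ff;
            x = (x ^ (x << 16)) & 0xff0000ff;
            x = (x ^ (x <<  8)) & 0x0300f00f;
            x = (x ^ (x <<  4)) & 0x030c30c3;
            x = (x ^ (x <<  2)) & 0x09249249;
            return x;
        }

        // Interleave the bits of (x, y, z) into a single curve key.
        static uint32_t MortonCode(uint32_t x, uint32_t y, uint32_t z)
        {
            return (Part1By2(z) << 2) | (Part1By2(y) << 1) | Part1By2(x);
        }

        struct SamplePoint { uint32_t qx, qy, qz; }; // quantized position

        // Sort sample points along the curve so that spatial neighbors
        // end up close together in the 1D ordering.
        void SortAlongCurve(std::vector<SamplePoint>& points)
        {
            std::sort(points.begin(), points.end(),
                      [](const SamplePoint& a, const SamplePoint& b)
                      { return MortonCode(a.qx, a.qy, a.qz) <
                               MortonCode(b.qx, b.qy, b.qz); });
        }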
  3. If you add a few additional vertices around each dashed line, you can scale them up or down individually by scaling UVs around the centroid of their four surrounding vertices.
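    A minimal sketch of that scaling step (hypothetical function, using the D3DX vector types that come up elsewhere in this thread):

        #include <d3dx9math.h>

        // Scale a dashed-line quad's UVs around the centroid of its four
        // surrounding vertices; scale > 1 grows the dash, < 1 shrinks it.
        void ScaleQuadUVs(D3DXVECTOR2 uv[4], float scale)
        {
            D3DXVECTOR2 centroid = (uv[0] + uv[1] + uv[2] + uv[3]) * 0.25f;
            for (int i = 0; i < 4; ++i)
                uv[i] = centroid + (uv[i] - centroid) * scale;
        }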
  4. DevLog #16 - Keep on keep'n on!

    Oh those trees - so nice and marshmallowy.
  5. Renderer bells & whistles ~ pt. 2

    @Mak: I'd have to slap a displacement map on. The water actually already uses an animated normal map.
  6. Hello. Finally some dynamic indirect lighting in the renderer! (This uses only a single ambient light source: an environment map.) It's fairly low-res, but it's good for things like "walking through a dark cave" or "a giant spaceship hanging overhead".

    On the CPU side, I look for a diffuse occlusion term by 'ray tracing' (the rays are pre-rasterized bit masks I fetch from a LUT) a simplified sphere representation of the scene into a tetrahedral grid of cube maps / bit fields. During rendering, I then construct the occlusion term for a specific vertex, based on its position and orientation in the grid, and use it to interpolate between an indirect term grabbed in screen space and whatever light comes directly from the ambient light source itself.
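    The bit-field part could look something like this; a rough sketch with assumed names and sizes (256 quantized directions, 16 apparent-size bins, 64 bits per probe), not the renderer's actual code:

        #include <cstdint>

        // Pre-rasterized 64-bit coverage masks for a sphere as seen from a
        // probe, indexed by quantized direction and apparent size.
        extern uint64_t g_sphereMaskLUT[256][16];

        // OR all occluder-sphere masks together, then measure how much of
        // the vertex's hemisphere mask is still open to the sky.
        float DiffuseOcclusion(uint64_t hemisphereMask,
                               const int* dirIdx, const int* sizeIdx, int count)
        {
            uint64_t occluded = 0;
            for (int i = 0; i < count; ++i)
                occluded |= g_sphereMaskLUT[dirIdx[i]][sizeIdx[i]];

            const uint64_t open = hemisphereMask & ~occluded;
            int openBits = 0, hemiBits = 0;
            for (int b = 0; b < 64; ++b)
            {
                hemiBits += int((hemisphereMask >> b) & 1);
                openBits += int((open >> b) & 1);
            }
            return hemiBits ? float(openBits) / float(hemiBits) : 1.0f;
        }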
  7. Edge smoothing

    I noticed that too. Most of those irregularities are due to specular aliasing/undersampling. (More rounded corners won't actually help in that regard.)

    Interesting. It's a bit hard to read, but it deals with exactly what I'm trying to do. Thanks. They create a normal vector that's a blend between two weighted smoothed/unsmoothed normals, based on the distance to the nearest (hard) edge.
  8. Edge smoothing

    OK, thanks. I should mention that many of these meshes are auto-generated in large quantities, so whatever I do has to work as a post-process, without having to manually touch up the geometry. The beveler can smooth anything you throw at it, so I'll use that method for now.
  9. Hello. I'm looking to smooth out lighting discontinuities around hard edges when drawing low-resolution geometry.

    I considered blurring normals in screen space (or somehow marking hard edges), but as that seemed both expensive (large kernel sizes etc.) and prone to rendering artifacts, I opted for an edge bevel algorithm that operates on the actual geometry. Essentially, it lays out an additional strip of detail around a hard edge to secure its unwelded normal orientation, then smooths the normals of the vertices on the original edge.

    This properly softens the edges, but it results in almost a 2x increase in vertex count. (It's actually more, since I keep the unbeveled geometry too: lights that render shadow maps can still use it, as I don't displace any vertices on the bevel.)

    I think it's a fair tradeoff, but I still wonder if this can somehow be done without creating any additional geometry. It seems to boil down to knowing the distance to the nearest hard edge from any point (pixel) on a mesh and blending normals based on that. Though I fear that would require either a lot of connectivity information tied to the mesh, or a search in screen space.
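    The blend itself would be the cheap part. A hedged sketch, assuming that distance to the nearest hard edge were somehow available:

        #include <d3dx9math.h>

        // Blend from the smoothed (averaged) normal right at the hard edge
        // toward the faceted (unsmoothed) normal past the falloff distance.
        D3DXVECTOR3 BlendNormal(const D3DXVECTOR3& facetedN,
                                const D3DXVECTOR3& smoothedN,
                                float distToHardEdge, float falloff)
        {
            float t = distToHardEdge / falloff;
            if (t > 1.0f) t = 1.0f;                 // clamp to [0, 1]
            D3DXVECTOR3 n = smoothedN + (facetedN - smoothedN) * t;
            D3DXVec3Normalize(&n, &n);              // renormalize after the lerp
            return n;
        }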
  10. [D3D12] Ping Pong Rendering

      Setting one texture as a shader resource and the other as a render target, rendering, and then swapping the two seems simpler to me than a compute approach. With the correct resource barriers in place, the API guarantees proper synchronization.
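      A minimal sketch of one such pass (assumed setup: both textures start in the pixel-shader-resource state, RTV/SRV descriptors already created, and root parameter 0 is the SRV table):

          #include <d3d12.h>
          #include <utility>

          void PingPongPass(ID3D12GraphicsCommandList* cmd,
                            ID3D12Resource* tex[2],
                            const D3D12_CPU_DESCRIPTOR_HANDLE rtv[2],
                            const D3D12_GPU_DESCRIPTOR_HANDLE srv[2],
                            int& write)               // index written this pass
          {
              const int read = 1 - write;

              // SRV -> RTV transition for the texture we're about to write.
              D3D12_RESOURCE_BARRIER b = {};
              b.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
              b.Transition.pResource   = tex[write];
              b.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
              b.Transition.StateBefore = D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE;
              b.Transition.StateAfter  = D3D12_RESOURCE_STATE_RENDER_TARGET;
              cmd->ResourceBarrier(1, &b);

              cmd->OMSetRenderTargets(1, &rtv[write], FALSE, nullptr);
              cmd->SetGraphicsRootDescriptorTable(0, srv[read]); // sample the other one
              cmd->DrawInstanced(3, 1, 0, 0);                    // fullscreen triangle

              // RTV -> SRV so the next pass (or the final draw) can sample it.
              std::swap(b.Transition.StateBefore, b.Transition.StateAfter);
              cmd->ResourceBarrier(1, &b);

              write = read;                           // swap roles for next pass
          }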
  11. That's the first row:

          [0  0 -1]
           0  1  0
           1  0  0
  12. "D3DXVECTOR3(matWorld._13, matWorld._23, matWorld._33);"   You're grabbing the third column from the matrix there, not the third row.   So the identity matrix transformed by a 1.57 rad rotation around the Y-axis results in: 0 0 [-1] 0 1 [0] 1 0 [0]