Journal of Aardvajk




Planet Engine - Frustrum Culling and Horizon Culling

Posted 23 August 2016 · 1,005 views

I decided to take a step back from the noise and terrain generation and move back towards the actual generation of the mesh at varying levels of detail. That needs work anyway, and it might hopefully make some of the other stuff make a bit more sense.

 

I discovered pretty quickly that just a very basic frustum check works really well in this context, as the video hopefully shows. I'm toggling updating the camera position by pressing a key, so when it is switched off, the planet thinks the camera just stays where it was last, so you can look around and visualise which parts of the mesh are being skipped.

 

 

I am using the noise here just to provide a consistent shade for each of the triangles, to make it a bit more interesting to look at. I'm actually working mainly in wireframe mode, but that tends to make for rubbish-looking videos, so out of concern for you, dear reader, I switched to this display for the sake of this example :)

 

Otherwise, there is nothing particularly clever here. My Frustrum class is made from various bits of code I've found scattered around the internet, with a special-case method added here for checking a triangle.

 

Frustrum::Frustrum(float depth, const Matrix &view, const Matrix &proj)
{
    Matrix localProj = proj;

    // rebuild the projection so its far plane sits at 'depth', keeping the
    // original near plane (recovered from the incoming projection matrix)
    float zMin = -proj._43 / proj._33;
    float r = depth / (depth - zMin);

    localProj._33 = r;
    localProj._43 = -r * zMin;

    // planes are extracted from the combined view * projection matrix
    Matrix matrix = view * localProj;

    // near
    p[0].a = matrix._14 + matrix._13;
    p[0].b = matrix._24 + matrix._23;
    p[0].c = matrix._34 + matrix._33;
    p[0].d = matrix._44 + matrix._43;

    // far
    p[1].a = matrix._14 - matrix._13;
    p[1].b = matrix._24 - matrix._23;
    p[1].c = matrix._34 - matrix._33;
    p[1].d = matrix._44 - matrix._43;

    // left
    p[2].a = matrix._14 + matrix._11;
    p[2].b = matrix._24 + matrix._21;
    p[2].c = matrix._34 + matrix._31;
    p[2].d = matrix._44 + matrix._41;

    // right
    p[3].a = matrix._14 - matrix._11;
    p[3].b = matrix._24 - matrix._21;
    p[3].c = matrix._34 - matrix._31;
    p[3].d = matrix._44 - matrix._41;

    // top
    p[4].a = matrix._14 - matrix._12;
    p[4].b = matrix._24 - matrix._22;
    p[4].c = matrix._34 - matrix._32;
    p[4].d = matrix._44 - matrix._42;

    // bottom
    p[5].a = matrix._14 + matrix._12;
    p[5].b = matrix._24 + matrix._22;
    p[5].c = matrix._34 + matrix._32;
    p[5].d = matrix._44 + matrix._42;

    for(uint i = 0; i < 6; ++i)
    {
        D3DXPlaneNormalize(&p[i], &p[i]);
    }
}

bool Frustrum::contains(const Vec3 &a, const Vec3 &b, const Vec3 &c) const
{
    for(uint i = 0; i < 6; ++i)
    {
        // small negative tolerance so triangles lying right on a plane aren't culled
        const float e = -0.05f;

        // reject only if all three points are behind the same plane
        if(D3DXPlaneDotCoord(&p[i], &a) < e && D3DXPlaneDotCoord(&p[i], &b) < e && D3DXPlaneDotCoord(&p[i], &c) < e)
        {
            return false;
        }
    }

    return true;
}
Then, in my recursive triangle add method in the planet builder, I just start the method with a check to see if the triangle is outside the frustum and return if so.
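In outline, the top of the recursive method ends up looking something like this (just a sketch with illustrative names rather than my exact signatures; midpointOnSphere stands in for whatever subdivision helper is used):

void addRecursive(const Vec3 &a, const Vec3 &b, const Vec3 &c, int depth, const Frustrum &frustrum, VertexBuffer &buffer)
{
    // early out: if the triangle is outside the frustum, skip it (and all of
    // its children) without doing any subdivision at all
    if(!frustrum.contains(a, b, c))
    {
        return;
    }

    if(depth == 0)
    {
        addTriangle(a, b, c, buffer);
        return;
    }

    // otherwise split into four children and recurse
    Vec3 ab = midpointOnSphere(a, b);
    Vec3 bc = midpointOnSphere(b, c);
    Vec3 ca = midpointOnSphere(c, a);

    addRecursive(a, ab, ca, depth - 1, frustrum, buffer);
    addRecursive(ab, b, bc, depth - 1, frustrum, buffer);
    addRecursive(bc, c, ca, depth - 1, frustrum, buffer);
    addRecursive(ab, bc, ca, depth - 1, frustrum, buffer);
}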

 

It's nice because it will early-out very quickly when building the icosahedron. The first twenty calls to the addRecursive method are for the twenty base icosahedron faces, and in the majority of near-surface cases most of these will be outside the frustum, so the entire face will be skipped without any subdivision required.

 

At the moment, because it has no concept of a horizon, the faces on both the near side and the far side are being generated, the far side technically falling inside the frustum, so I guess I need to look at some kind of horizon culling next. I've seen this discussed and implemented on my travels around planet-rendering papers and websites, but have no idea yet how it works or how to implement it.

 

This silly little step forward has made me feel a lot more positive about getting the level of detail working now. Hopefully more interesting progress to follow.

 

[A little later]

 

Wow. I was reading this article about horizon culling and, as is usual for me, was getting completely lost in all the alien (to me) math notation. Then I focused on the following bit of English.

 

So there it is. To determine if the target point is behind the horizon plane, take the dot product of the vector from the viewer to the target with the vector from the viewer to the center of the ellipsoid. If that’s larger than the magnitude squared of the vector from the viewer to the center of the ellipsoid, minus one, then the target is behind the plane. No square roots or trigonometric functions required.

So I then ignored all of the rest of the page and all the notation, and just pasted that into my code as a comment, then translated it into code using my Maths wrapper methods:

 

/*
Take the dot product of the vector from the viewer to the target with the vector from the viewer to the center of the ellipsoid.
If that’s larger than the magnitude squared of the vector from the viewer to the center of the ellipsoid, minus one, then the target is behind the plane.
*/

bool beyondHorizon(const Vec3 &p, const Vec3 &camera, const Vec3 &center)
{
    Vec3 vt = p - camera;      // viewer to target
    Vec3 vc = center - camera; // viewer to centre of the sphere

    float d = dotVectors(vt, vc);

    // the 1.0f is the sphere radius squared (so this assumes a unit-radius planet)
    return d > vectorLengthSq(vc) - 1.0f;
}
Not feeling very optimistic, I then added a very simple check against this for all three points of the triangle in my recursive method, returning without doing anything if all three are beyond the horizon according to this beasty.
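In effect the top of the recursive add now boils down to something like this (again just a sketch with made-up parameter names, not my actual code):

// Sketch: the two rejection tests together. 'camera' and 'center' here are
// illustrative parameters for the camera position and the planet centre.
bool triangleVisible(const Vec3 &a, const Vec3 &b, const Vec3 &c, const Frustrum &frustrum, const Vec3 &camera, const Vec3 &center)
{
    if(!frustrum.contains(a, b, c))
    {
        return false;
    }

    // only reject on the horizon test if all three points are hidden
    return !(beyondHorizon(a, camera, center) && beyondHorizon(b, camera, center) && beyondHorizon(c, camera, center));
}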

 

And, to my amazement, it appears to work perfectly :) I'm rendering with culling disabled and in wireframe at the moment, so it's very easy to see whether the far-side faces are being rendered or not, and it seems they are not.

 

Excellent and unexpectedly simple result. We now have horizon culling appearing to work, subject to some more testing. Happy me!




No idea what I'm doing

Posted 21 August 2016 · 585 views

This entry will probably get edited as I go along this morning. No idea what the hell I'm doing here.

Attached Image

I modified my offline generator to output a range of colours instead of just heights.


QRgb color(GenerateType type, float value)
{
    if(type == GenerateType::Depth)
    {
        // height maps just get a greyscale value
        float v = 255.0f * value;
        return qRgb(v, v, v);
    }

    float v = value * 255.0f;

    if(v < 120.0f) return qRgb(0, 0, v); // low values: blue
    if(v < 130.0f) return qRgb(v, v, 0); // narrow yellow band
    if(v < 180.0f) return qRgb(0, v, 0); // green
    return qRgb(v, v, v);                // highest values: white/grey
}
This is now loaded as the cubemap, and when I generate the planet mesh I'm using the same noise function to generate height offsets for the vertices, not that they seem to match or anything silly like that.

 

void addTriangle(const Vec3 &a, const Vec3 &b, const Vec3 &c, VertexBuffer &buffer, PlanetContext &ctx)
{
    // sample the same noise used for the cube map at each corner
    float an = generateNoise(ctx.noise, normalizeVector(a), 256);
    float bn = generateNoise(ctx.noise, normalizeVector(b), 256);
    float cn = generateNoise(ctx.noise, normalizeVector(c), 256);

    float fac = 0.5f;

    // push each vertex out along its own direction by the noise value
    Vec3 ao = (a * an) * fac;
    Vec3 bo = (b * bn) * fac;
    Vec3 co = (c * cn) * fac;

    Color color = makeColor(255, 255, 255);

    addVertex(a + ao, color, buffer);
    addVertex(b + bo, color, buffer);
    addVertex(c + co, color, buffer);
}
It seems that because I am using Direct3D 9 I can't do a texture fetch inside the vertex shader, so I need to do the offsetting of the mesh on the CPU. In theory I should be getting the same values from the noise method when I call it in the C++ part of the code as I get when generating the colour map, but I can see some suspicious-looking tsunami waves in the ocean, so something isn't right.


I have a feeling I need to go back to the drawing board and start again. I'm fumbling pretty blindly in the dark here.

Maybe I should try and get some 2D terrain working first with a variable level of detail. I don't know. It's good fun trying to figure things out, but I have to accept I don't seem to be moving towards my goal here. Will keep playing and update here if I make any progress. Must look pretty daft from the outside, but I like to post these entries. They provide a good opportunity to stop and take stock of what I have achieved (or failed to achieve).




Yet another Planet entry

Posted 20 August 2016 · 645 views

Okay, so following on from last time's success at creating a wrap-around seamless cube texture from the noise functions:

 

Attached Image

 

Here is the same texture just applied as a diffuse map to my planet at a fixed resolution:

 

Attached Image

 

And this is about as far as I have got.

 

I've been thinking more and more about generating the noise in the shader and believe it is possible in principle. The noise function I have is based upon float[256] and float[256][3] arrays, which I could presumably replicate as a 256 x 1 texture, using the alpha channel for the permutation and the red/green/blue channels for the directional vectors, remembering to convert between 0-1 and 0-255 where required.
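The packing itself should be simple enough; something along these lines, I think (a sketch using raw D3D9 calls, assuming the perm and buffer tables are accessible and skipping error handling):

// Sketch: pack the permutation table into alpha and the three gradient
// components (remapped from [-1, 1] to [0, 255]) into red/green/blue of a
// 256 x 1 A8R8G8B8 texture.
IDirect3DTexture9 *createNoiseTexture(IDirect3DDevice9 *device, const unsigned char perm[256], const float buffer[256][3])
{
    IDirect3DTexture9 *texture = 0;
    device->CreateTexture(256, 1, 1, 0, D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, &texture, 0);

    D3DLOCKED_RECT rect;
    texture->LockRect(0, &rect, 0, 0);

    unsigned char *pixels = static_cast<unsigned char*>(rect.pBits);
    for(int i = 0; i < 256; ++i)
    {
        // A8R8G8B8 is laid out B, G, R, A in memory
        pixels[i * 4 + 0] = static_cast<unsigned char>((buffer[i][2] * 0.5f + 0.5f) * 255.0f); // B = z
        pixels[i * 4 + 1] = static_cast<unsigned char>((buffer[i][1] * 0.5f + 0.5f) * 255.0f); // G = y
        pixels[i * 4 + 2] = static_cast<unsigned char>((buffer[i][0] * 0.5f + 0.5f) * 255.0f); // R = x
        pixels[i * 4 + 3] = perm[i];                                                           // A = permutation
    }

    texture->UnlockRect(0);
    return texture;
}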

 

But I suspect it will be quite a lot of computation per pixel, especially when moving on to turbulence and fractal Brownian motion. I'm also a bit confused about how I then access the noise values on the CPU side. I'd need the heights for things like collision detection and walking on the terrain. Simply replicating the noise method on the CPU is likely to produce quite different values, I suspect, due to floating-point inaccuracy, so I'm a bit perplexed about how this works.

 

I think I need a much better overview of what I'm doing here. I'm sort of grasping in the dark at the moment. It's good in a way, since before I started I didn't even know what I didn't know, if you see what I mean, whereas now I am gradually discovering a list of very specific problems I am facing.

 

Guess that is all for now. Don't have a lot of development time on a Saturday morning and need to get on.




Planet Generation - continued

Posted 17 August 2016 · 843 views

So I've been making some slow progress with my planet rendering. I've hit a bit of a brick wall at the moment, so I thought it was time to post a quick update on where I've got to so far.

 

I was playing around with different noise generation methods and ended up basing one on Sean O'Neil's Sandbox project. It works very nicely.

 

Attached Image

 

I cleaned up and refactored the code into my own style and format. It's actually fairly compact in the end.

 


class Noise
{
public:
    explicit Noise(unsigned int seed);

    float noise(const Vec3 &v) const;

protected:
    float lattice(int ix, float fx, int iy=0, float fy=0, int iz=0, float fz=0, int iw=0, float fw=0) const;

    unsigned char perm[256];
    float buffer[256][3];
};

Noise::Noise(unsigned int seed)
{
    std::srand(seed);

    // fill the gradient table with random unit-length vectors
    for(int i = 0; i < 256; ++i)
    {
        perm[i] = i;
        for(int j = 0; j < 3; ++j)
        {
            buffer[i][j] = (float)random(-0.5, 0.5);
        }

        float magnitude = 0;
        for(int j = 0; j < 3; ++j)
        {
            magnitude += buffer[i][j] * buffer[i][j];
        }

        magnitude = 1 / sqrtf(magnitude);
        for(int j = 0; j < 3; ++j)
        {
            buffer[i][j] *= magnitude;
        }
    }

    // shuffle the permutation table
    for(int i = 0; i < 256; ++i)
    {
        int j = static_cast<int>(random(0, 255));
        std::swap(perm[i], perm[j]);
    }
}

float Noise::noise(const Vec3 &v) const
{
    int n[3];
    float r[3];
    float w[3];

    for(int i = 0; i < 3; ++i)
    {
        n[i] = floor(v[i]);   // integer lattice cell
        r[i] = v[i] - n[i];   // fractional position within the cell
        w[i] = cubic(r[i]);   // smoothed interpolation weight
    }

    // trilinearly interpolate the gradient contributions from the eight cell corners
    float value = lerp(lerp(lerp(lattice(n[0], r[0], n[1], r[1], n[2], r[2]), lattice(n[0]+1, r[0]-1, n[1], r[1], n[2], r[2]), w[0]),
                       lerp(lattice(n[0], r[0], n[1]+1, r[1]-1, n[2], r[2]), lattice(n[0]+1, r[0]-1, n[1]+1, r[1]-1, n[2], r[2]), w[0]),
                       w[1]),
                  lerp(lerp(lattice(n[0], r[0], n[1], r[1], n[2]+1, r[2]-1), lattice(n[0]+1, r[0]-1, n[1], r[1], n[2]+1, r[2]-1), w[0]),
                       lerp(lattice(n[0], r[0], n[1]+1, r[1]-1, n[2]+1, r[2]-1), lattice(n[0]+1, r[0]-1, n[1]+1, r[1]-1, n[2]+1, r[2]-1), w[0]),
                       w[1]),
                  w[2]);

    return clamp(value * 2.0f, -0.99999f, 0.99999f);
}

float Noise::lattice(int ix, float fx, int iy, float fy, int iz, float fz, int iw, float fw) const
{
    int n[4] = { ix, iy, iz, iw };
    float f[4] = { fx, fy, fz, fw };

    // hash the integer lattice coordinates into the permutation table
    int index = 0;
    for(int i = 0; i < 3; ++i)
    {
        index = perm[(index + n[i]) & 0xFF];
    }

    // dot product of the chosen gradient with the fractional offset
    float value = 0;
    for(int i = 0; i < 3; ++i)
    {
        value += buffer[index][i] * f[i];
    }

    return value;
}
Next I moved this into my planet program and used it to set a colour on each of the vertices of the generated sphere.

 

Attached Image

 

The problem I have now is that the resolution is limited to the resolution of the sphere mesh. I can't generate the noise inside a shader, so the only way I can think of to move forward is to use the noise method offline to generate a cubemap, then sample that with the interpolated direction vectors in the pixel shader to spread the noise smoothly across the planet.

 

Maybe this is how this stuff works, but it means I have to generate the entire planet into a cube map at a fixed maximum resolution. I was hoping there was some way I could sample the noise method as I went along, without having to pre-render the results.

 

Am I missing something obvious here? If I generate a cubemap with 1024x1024 faces, say, it imposes quite a strict maximum resolution on my planet surface. How else can I sample the noise method, given a direction vector, inside the pixel shader?

 

Stuck. Any tips appreciated.


[EDIT] So I started trying to generate some offline cube maps and hit another brick wall there trying to get them to seamlessly tile.

Maybe if you have time, you could pop by my Plea For Help Thread in Graphics Programming and Theory and offer me some expertise? Thanks :)

 

[EDIT] Woo hoo, all sorted. I was just mapping the vectors to the pixels wrongly:

 

Attached Image

 

That's the basic noise with turbulence applied, wrapping seamlessly around a cube. Next up, importing this into the planet renderer and sampling it in the shader :)
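For anyone trying the same thing, the mapping that matters is from each face pixel to the direction vector you then feed into the noise; roughly this (a sketch using my maths wrapper names, and the per-face orientations follow the usual D3D cube map convention so may need flipping in other setups):

// Sketch: map pixel (x, y) on cube face 'face' to a direction vector, so the
// noise can be sampled with the same vector the pixel shader will later use.
Vec3 cubeFaceDirection(int face, int x, int y, int size)
{
    // centre of the pixel, remapped to [-1, 1]
    float u = 2.0f * ((x + 0.5f) / size) - 1.0f;
    float v = 2.0f * ((y + 0.5f) / size) - 1.0f;

    Vec3 dir;
    switch(face)
    {
        case 0: dir = Vec3( 1, -v, -u); break; // +X
        case 1: dir = Vec3(-1, -v,  u); break; // -X
        case 2: dir = Vec3( u,  1,  v); break; // +Y
        case 3: dir = Vec3( u, -1, -v); break; // -Y
        case 4: dir = Vec3( u, -v,  1); break; // +Z
        case 5: dir = Vec3(-u, -v, -1); break; // -Z
    }

    return normalizeVector(dir);
}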




Perlin Noise Day (week)

Posted 13 August 2016 · 348 views

Just a quick entry today. Running out of morning time.

Decided I needed to take a break from the level-of-detail-on-a-sphere stuff, so have declared today to be Perlin Noise Day. I will need to be able to generate some kind of Perlin-noise-based cubemap or something for height maps for my planetoids, which is not something I have ever done before, so I need to start from the basics here to get to where I need to be.

If I can get to the point where I can plant a random seed and use it to generate a seamless cube map of Perlin noise, I know how I can then use a shader to translate patches of a subdivided sphere geometry to make the planet geometry. I'm going to start by investigating normal 2D Perlin noise until I'm happy I understand each stage of this.

My Saturday morning schedule only allows for an hour of fun stuff (assuming no work stuff comes in, which it hasn't this morning, yay! :)), so I've just been throwing together a very quick Qt application to host an image of a given dimension, assigning a random value between 0 and 1 to each pixel:

Attached Image

At the moment the value is literally just random, but the image can be of any dimension, and the view, based on a QGraphicsView, scrolls and shows a background and so on, so it will work nicely as a host application for testing out various implementations of Perlin noise generators.
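The guts of it are nothing more than this sort of thing (a bare-bones sketch rather than the actual app code):

// Sketch: fill a QImage with a random 0-1 value per pixel (scaled to a grey)
// and show it in a QGraphicsView, as a host for trying out noise generators.
#include <QApplication>
#include <QGraphicsScene>
#include <QGraphicsView>
#include <QImage>
#include <QPixmap>
#include <cstdlib>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QImage image(512, 512, QImage::Format_RGB32);
    for(int y = 0; y < image.height(); ++y)
    {
        for(int x = 0; x < image.width(); ++x)
        {
            float value = std::rand() / float(RAND_MAX);
            int v = static_cast<int>(value * 255.0f);
            image.setPixel(x, y, qRgb(v, v, v));
        }
    }

    QGraphicsScene scene;
    scene.addPixmap(QPixmap::fromImage(image));

    QGraphicsView view(&scene);
    view.show();

    return app.exec();
}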

What I am hoping is that a single random seed will allow me to generate noise at different levels of scale, so that it can effectively encode an entire planet's geometry to any level of detail from one starting seed. I think this is how this stuff works. It would be great to be able to take a single number and have it effectively encode an entire planet's map geometry at any level of scale.
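As far as I can tell, the trick is just to sum octaves of the same seeded noise at doubling frequencies and halving amplitudes; something like this (a sketch, assuming a noise(p) function returning values roughly in -1 to 1):

float noise(const Vec3 &p); // single-octave seeded noise, assumed to exist

// Sketch: fractal Brownian motion. Each octave doubles the frequency and
// halves the amplitude, so one seed gives detail at every scale you sample.
float fbm(const Vec3 &p, int octaves)
{
    float value = 0.0f;
    float amplitude = 0.5f;
    float frequency = 1.0f;

    for(int i = 0; i < octaves; ++i)
    {
        value += amplitude * noise(p * frequency);
        amplitude *= 0.5f;
        frequency *= 2.0f;
    }

    return value;
}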

Maybe I have it all wrong. Who knows. I like figuring this stuff out all by myself from basics. I'm enjoying the process rather than the end result.

Sorry, not the most exciting entry of all time. Just nice to have these to look back on when I do achieve something cool. What I'm really hoping is that once I start getting somewhere with the planet engine, I'll be able to mix it up with all the work I've done over the last few years on the conventional collision detection and interaction stuff (capsule controller vs convex shapes), so I can put prefabricated buildings and structures on the surface of a Perlin-noise-based planet. I don't want to throw all that away, just want to be able to host it inside huge procedurally generated worlds that I can fly between.

All for now. Thanks for stopping by.

 

(Oh, and to the staff, your strategy worked and I have now whitelisted GameDev on the ad blocker, a perfectly reasonable request, and I hope I'm generating some useful revenue for you now :))







