
# 2008 Retrospective

First of all, Happy New Year 2009!

Looking back at 2008, I can't say I'm particularly happy about how things went. There have been some serious delays on what I intended to achieve, and it's not due to a single reason, but to various causes that accumulated and became critical in 2008.

First of all, back in early 2008 I had an opportunity to sell a license of the engine. Unfortunately, it wasn't ready at that time (and still isn't today): it lacked several features and, more importantly, documentation and tutorials. So I spent two good months reorganizing code, cleaning some modules, commenting the interfaces of the code, and starting some documentation and tutorials. That's not "wasted work" of course, but those two months weren't directly useful to Infinity.

At the same time, I also decided it was time to revamp the website and make one that looked more professional and more complete. After studying all the solutions and getting advice from various people, we went with Joomla, and two people were put in charge of implementing the website: one for setting up Joomla and customizing it to our needs, and one for the layout / design. Long story short, things were delayed, people got busy IRL, and progress stopped. In the end it wasn't until inoX and I got our hands dirty that things started to move at a decent rate. All in all, I would say I spent more or less 3 months on it.

When the new website launched, the web server collapsed: Joomla was consuming a lot more RAM than the old website, and visitors started to get blank pages. As an emergency measure, I rented a dedicated server and started moving files and databases over to it. From the crisis to its resolution, I would say I spent 2 weeks on server issues. At that point, we were already in September 2008.

Next came management issues (which in fact started prior to the server move) within the writing team. Solving them caused a bit of drama, but Vileedge was finally appointed as our new lead storywriter. The writing team also needed reorganizing, and all the docs had to be collected and installed on the new wiki. Since I wanted the wiki to have a special organization, I had to do it myself, and it took more or less 3 additional weeks.

It wasn't until October 2008 that I resumed "serious" work on the game.

Hopefully there won't be as many distractions in 2009 and progress will go much faster this year.

# GPU Terrain Rendering

Since the last dev journal, I continued to work on generating and rendering terrain on the GPU.

Normal mapping is now fully functional, and I'm quite happy with the results. For each terrain node, a temporary heightmap is generated (256^2 for the default quality or 512^2 for the high quality) and gradients are computed to extract normals. Two pixels are reserved for the left / right / top / bottom boundaries, so that seams don't appear between adjacent nodes.
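The gradient-to-normal step can be sketched as follows. This is a minimal NumPy sketch, not the actual GPU shader: the border width matches the two shared pixels described above, but the map size, axis convention and scale factor are illustrative.

```python
import numpy as np

def heightmap_to_normals(height, border=2, scale=1.0):
    """Turn a padded heightmap into a normal map via central differences.

    `height` includes `border` extra pixels on each side, shared with the
    neighbouring nodes so that seams don't appear; only the inner region
    produces output normals.
    """
    # Central-difference gradients over the full padded map.
    dx = (np.roll(height, -1, axis=1) - np.roll(height, 1, axis=1)) * 0.5
    dy = (np.roll(height, -1, axis=0) - np.roll(height, 1, axis=0)) * 0.5
    # Normal = normalize(-dx, -dy, 1/scale), with z pointing "up".
    nz = np.full_like(height, 1.0 / scale)
    n = np.stack([-dx, -dy, nz], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    # Crop the border: those pixels only exist to feed the differences.
    b = border
    return n[b:-b, b:-b]

# A flat heightmap yields straight-up normals everywhere.
normals = heightmap_to_normals(np.zeros((260, 260)), border=2)
```

Because the border pixels come from the same global height function as the neighbour's border, the cropped interiors of two adjacent nodes agree along their shared edge.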

The next step was to reimplement (and improve) diffuse textures. Procedural texturing is still in use as before, with a 256^2 lookup-table texture (inputs are slope and altitude), but instead of giving the ID of the texture layer to use, it now directly gives a blending coefficient for each layer. With a maximum of 16 layers per planet and one layer consuming one channel, it may seem that 4 RGBA lookup textures are needed, but I simply packed them all together into a single 1024x256 texture and sample it 4 times in the diffuse texturing shader.

As a result, it is now possible to blend any number of layers per texel, while the previous method only allowed one.

There are other interesting benefits to this technique: first, aliasing is less pronounced since everything is stored in textures instead of being computed per pixel in the final shader. Terrain transitions are less sharp and flicker less in the distance.

The second important benefit is that the final rendering shader is a lot lighter now; as the diffuse texture is only generated once per node, there's no particular work to do per frame, other than sampling the diffuse texture. Previously, for each frame and each pixel, the whole procedural texturing had to be recomputed. It's a bit as if the textures were caching the results between frames. Of course, a lighter shader means a higher framerate; I don't have hard numbers, but I would say the framerate easily doubled. It's not rare for my 8800 GTX to achieve 80-120 fps at ground level, and 150-300 fps in space.

There is one big drawback to the new texturing algorithm: storing a 256^2 normal map and diffuse texture for each terrain node consumes video memory. A lot of it. In fact, at ground level there can be up to 1000 nodes in memory. If you do the calculations, you'll find that planet textures can quickly fill the whole memory of a 512 MB card, and that's without ships / buildings / other models. That was clearly unacceptable, so I started looking at ways to reduce the resolution of textures when they're far away or seen at a low angle. In practice, it works a bit like mipmapping, but manually updates the resolution of nodes as the camera moves. More textures have to be updated in real time, but it's worth it: a planet at ground level with 256^2 textures now consumes around 150 MB of video memory.
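The back-of-the-envelope arithmetic behind "fills a 512 MB card" looks like this (assuming RGBA8, i.e. 4 bytes per texel, for both maps; these numbers are illustrative, not measured from the engine):

```python
# Video memory budget for per-node terrain textures.
TEXELS    = 256 * 256          # one 256^2 map
BYTES_MAP = TEXELS * 4         # RGBA8: 4 bytes per texel -> 256 KB per map
NODES     = 1000               # nodes resident at ground level

# Each node stores a normal map and a diffuse texture.
total_mb = NODES * 2 * BYTES_MAP / (1024 * 1024)
print(total_mb)  # -> 500.0 MB, enough to fill a 512 MB card on its own
```

Which is why dropping distant or grazing-angle nodes to lower resolutions, mipmap-style, is what brings the figure down to the ~150 MB mentioned above.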

I still have a few optimizations in mind; one of them is to store normal maps in two channels instead of three (note that three channels have the same memory cost as four) and recompute the missing channel in a shader at runtime. This will save 50% of the normal map memory, or 25% of the total video memory for planets, and the visual quality shouldn't be noticeably affected.
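The reconstruction relies on the normal being unit length and facing outward, so the third channel is just a square root away. A minimal sketch of the idea (in Python for clarity; in practice the unpack would live in the shader):

```python
import math

def pack_normal(n):
    """Keep only x and y; z is implied for an outward-facing unit normal."""
    return (n[0], n[1])

def unpack_normal(xy):
    """Reconstruct z = sqrt(1 - x^2 - y^2), valid because |n| = 1 and z >= 0."""
    x, y = xy
    z = math.sqrt(max(0.0, 1.0 - x * x - y * y))
    return (x, y, z)

x, y, z = unpack_normal(pack_normal((0.6, 0.0, 0.8)))
# z comes back as ~0.8, recovered from x and y alone
```

The `max(0.0, ...)` clamp guards against quantization pushing `x^2 + y^2` slightly above 1 when the two channels are stored in 8 bits.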

Precision issues are still here. For this reason I will start moving the geometry generation back to the CPU. On the GPU, there are little gaps between terrain nodes due to the limits of 32-bit floating point precision. GPUs only support fp32, while CPUs support fp64; the fp64 code already exists on the CPU side, but I have to clean and rewrite some interfaces to make it work again. Once that's done, the gaps between nodes should disappear. Normal mapping will stay on the GPU though, as there's no way it can be done fast enough for 256^2 maps (or higher) on the CPU.

Finally, I still have some work to do to minimize the popping and geomorphing bugs.

# Revisiting atmospheric scattering

I'm honestly fed up with atmospheric scattering. I've implemented it two or three times already, but there have always been some problems. Maybe I'm too much of a perfectionist, maybe not. Of course, when I post screenshots I tend to pick the ones that look best, so while it may appear to you that atmospheric scattering was already good before, in reality... it still had some annoying problems.

The next-to-last scattering algorithm suffered from two main problems: sunset colors weren't vivid enough (plus there were some unnatural bands of darkness in the sky), and the haze on distant mountains wasn't blue-ish enough.

In November I reworked the theory of atmospheric scattering from scratch, to make sure I correctly understood all the formulas and algorithms. The base paper I used is the famous "Display of The Earth Taking into Account Atmospheric Scattering" (1993) by Nishita et al. I first blindly implemented the formulas from the paper in a shader, and the results were a bit disappointing. I now had vivid sunsets, but over-saturation on the atmosphere glow, and the blue-ish haze wasn't there either.

As a side note, I wasted almost 2 weeks thanks to ATI drivers. I hit a high number of driver bugs in the shaders that drove me mad. As of today, even the latest Catalyst 8.12 still has the bugs, but at least I've rewritten the shaders to work around them via all sorts of nasty tricks you definitely don't want to hear about.

This weekend, I decided to have a go at re-implementing the algorithm, but step by step, checking every result, making sure it looked good and adjusting parameters in the process. It has given much nicer results so far, as I finally have good sunsets, a good blue-ish haze and no saturation to white on the atmosphere glow from space.

As usual, random pictures:

Wow, your engine keeps on getting better and better. Keep up the good work; can't wait to see the day that version 1.0 is ready and out to the public. :)

Looks great! I think those atmosphere tweaks really look good - I just spent roughly four days trying to understand Sean O'Neil's atmosphere shaders and make them work, and finally just faked some for myself, and they are not anywhere near as accurate as yours.

When you say generating normal maps, does that mean you are combining terrain normal maps with regular texture normal maps? Do you still calculate your inverse texture-space transform matrices on the CPU or on the GPU (for getting the light into texture space for your normal maps)?

You can see pictures at http://alexcpeterson.com

Now how do your algorithms handle city glow? ( http://farm4.static.flickr.com/3168/2620924929_dc72729038.jpg )

Really nice.
Quote:
 I'm honnestly fed up with atmospheric scattering.
Here I laughed at how true it is for me too. My dissatisfaction also comes from the sunset colors, dark sky and ATI drivers. And right now I'm optimizing the engine because it consumes too much GPU memory [grin]

Amazing work!

Quote:
Original post by petrocket Looks great! I think those atmosphere tweaks really look good - I just spent roughly four days trying to understand Sean O'Neil's atmosphere shaders and make them work, and finally just faked some for myself, and they are not anywhere near as accurate as yours.

In those pics, I don't have Mie scattering yet. I still need to optimize the shaders; I just noticed that I have a small performance hit due to the high vertex count (the atmosphere shaders run per vertex). But at the moment I'm focusing on "making it look right", and I've already seen many instructions that can be moved outside the loops.

If you're familiar with the atmospheric scattering theory, to give you an idea, I use 4 samples for the outer integral and 2 samples for the inner integral (so 4*2 = 8 total). It looks rather good for such a low number of samples, IMO.
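The outer/inner sampling structure can be illustrated with a toy single-scattering integrator in the spirit of Nishita et al. This is only a sketch: it uses a flat atmosphere slab instead of spherical geometry, a single wavelength, and illustrative constants, so it is nothing like the actual shader, but the 4-outer x 2-inner nesting is the same.

```python
import math

SCALE_HEIGHT = 8000.0     # metres, exponential density falloff (illustrative)
BETA         = 5.8e-6     # scattering coefficient at sea level (illustrative)

def density(h):
    """Relative air density at altitude h (exponential profile)."""
    return math.exp(-max(h, 0.0) / SCALE_HEIGHT)

def optical_depth(h0, h1, samples):
    """Integrate density along a straight vertical path between two altitudes."""
    total = 0.0
    for i in range(samples):
        t = (i + 0.5) / samples
        total += density(h0 + (h1 - h0) * t)
    return total * abs(h1 - h0) / samples

def in_scatter(view_h0, view_h1, sun_top=60000.0, outer=4, inner=2):
    """Light scattered toward the eye: outer loop over the view ray,
    inner loop (inside optical_depth) toward the sun at each sample."""
    step = (view_h1 - view_h0) / outer
    result = 0.0
    for i in range(outer):
        h = view_h0 + (i + 0.5) * step             # outer sample point
        d_view = optical_depth(view_h0, h, outer)  # eye -> sample
        d_sun  = optical_depth(h, sun_top, inner)  # sample -> sun (inner)
        transmittance = math.exp(-BETA * (d_view + d_sun))
        result += density(h) * transmittance * abs(step)
    return BETA * result

sky = in_scatter(0.0, 60000.0)   # looking up from the ground
```

Even with 4x2 samples the result is smooth, because both integrands are smooth exponentials; that is essentially why such low sample counts can still look good.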

Quote:
 Original post by petrocket When you say generating normals maps, does that mean you are combining terrain normal maps with regular texture normal maps?

In fact, not yet. I'm not sure how I could combine them. To make sense "physically", I should use bump maps instead, and add the bumps to the heightmap before generating the normal map on the GPU. An alternative is to blend the two normal maps (terrain with texture), but I don't think the results would look good.

Quote:
 Original post by petrocket Do you still calculate your inverse texture space transform matrices on the cpu or on the gpu (for getting the light into texture space for your normal maps)?

The tangent space (normal and right vector) is calculated on the CPU for each vertex; the binormal is still computed with a cross product in the shader.
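Reconstructing the binormal in the shader saves one vertex attribute, since the third basis vector is fully determined by the other two. A NumPy stand-in for that shader step (the vectors here are illustrative):

```python
import numpy as np

def binormal(normal, tangent):
    """Third tangent-space basis vector from the two uploaded per vertex."""
    b = np.cross(normal, tangent)
    return b / np.linalg.norm(b)

n = np.array([0.0, 0.0, 1.0])   # surface normal
t = np.array([1.0, 0.0, 0.0])   # tangent / "right" vector
b = binormal(n, t)              # completes the tangent-space basis
```

The normalization guards against the normal and tangent not being exactly perpendicular after interpolation.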

Quote:

That's a completely different technique now, as there's a normal map and a diffuse map for each node. The procedural texturing itself is similar, but separated into many render-to-texture steps, so it's easier to have one step for the diffuse texture generation where you bind 16 diffuse textures and blend them together using the weights stored in the lookup table (still indexed by altitude and slope).

The shader looks like this; it's really nothing original:

    vec4 weights0 = texture2D(layerTableTex, vec2(slope * 0.25 + 0.00, alt2));
    vec4 weights1 = texture2D(layerTableTex, vec2(slope * 0.25 + 0.25, alt2));
    vec4 weights2 = texture2D(layerTableTex, vec2(slope * 0.25 + 0.50, alt2));
    vec4 weights3 = texture2D(layerTableTex, vec2(slope * 0.25 + 0.75, alt2));

    diffuse = texture2D(diffTex0,  diffUV) * weights0.x +
              texture2D(diffTex1,  diffUV) * weights0.y +
              texture2D(diffTex2,  diffUV) * weights0.z +
              texture2D(diffTex3,  diffUV) * weights0.w +
              texture2D(diffTex4,  diffUV) * weights1.x +
              texture2D(diffTex5,  diffUV) * weights1.y +
              texture2D(diffTex6,  diffUV) * weights1.z +
              texture2D(diffTex7,  diffUV) * weights1.w +
              texture2D(diffTex8,  diffUV) * weights2.x +
              texture2D(diffTex9,  diffUV) * weights2.y +
              texture2D(diffTex10, diffUV) * weights2.z +
              texture2D(diffTex11, diffUV) * weights2.w;


The "layerTableTex" is the slope/altitude LUT; since there are 16 layers, it's split into 4 areas (hence the 0.25 offsets in the weights UVs), each area encoding 4 layers into RGBA.

Quote:
 Original post by petrocket You can see pictures at http://alexcpeterson.com

That's really nice, although your planets are not real-sized. Spore has some competition now :)

Have you seen this paper (Precomputed Atmospheric Scattering): http://www-ljk.imag.fr/Publications/Basilic/com.lmc.publi.PUBLI_Article@11e7cdda2f7_f64b69/index_en.html
The results they are getting look amazing.

Quote:
Original post by Ysaneya

Quote:
 Original post by petrocket When you say generating normals maps, does that mean you are combining terrain normal maps with regular texture normal maps?

In fact, not yet. I'm not sure how I could combine them together. To make sense "physically", I should use bump maps instead and add the bumps to the heightmap before generating the normal map on the GPU. An alternative is to blend the two normal maps (terrain with texture) but I don't think the results will look good.

Oh, yeah, I think I meant bump maps, and I wasn't sure how to blend them, but I assumed if anyone could figure it out you probably could (or already had). I had always imagined using texture normal maps for things like sand, rocks and dirt when up close to the surface.

I don't suppose you have in mind a way to minimize the LOD popping due to normals changing as you change LOD levels? I have a hard time with this because it is one thing to have the texture get slightly more refined, and another thing to suddenly see lighting/shading change.

Quote:
Original post by Ysaneya
Quote:
 Original post by petrocket You can see pictures at http://alexcpeterson.com

That's really nice, although your planets are not real sized. Spore has some competition now :)

Thanks! My planets are definitely not on the same scale or as detailed / realistic as yours, though I wish they were that flexible. I've decided to just get it good enough for a size I think is manageable for a single player. And yes, I guess it is obvious that I am definitely influenced by the art in Spore :D

Just so you know Ysaneya, you are like a god to me. I've had to go buy a bucket for me to drool into whenever I see anything Infinity related, as well as a few rolls of duct tape to help keep my jaw off the floor. Seriously, good work!

And how do you pronounce "Ysaneya"?

Seriously, me and my coworkers are having bad trouble talking about your stuff. Couldn't you've picked a name more like Steve or something? What does that name mean anyway?

Everything looks great as usual but I can't help but notice that this game is gonna require a pretty meaty computer. I only hope it will scale well across a number of configurations.

Quote:
 Original post by ndatxcod Everything looks great as usual but I can't help but notice that this game is gonna require a pretty meaty computer. I only hope it will scale well across a number of configurations.
If he runs at a minimum of 80 fps on an 8800/9800, then it ought to be perfectly playable on a $100 Radeon HD 4830, and maybe even on an $80 9600 GSO. Those aren't ridiculous cards to expect a gamer to have, and don't forget that he hasn't finished with optimisations yet...
