Methods for drawing the sky

Started by
16 comments, last by Lightness1024 11 years ago

Yes, my clouds can move at lower resolution; after any movement is done, the high resolution bakes itself into a half cube (the upper sky dome) using tiled rendering.

It's too slow for pure real time: it would drop the system to around 1 or 2 FPS at full resolution, fully real time.

The biggest issue is the number of times the fractal needs to be evaluated per pixel, given that the fractal is built by repeatedly reading a noise texture with UV coordinates that multiply exponentially (octaves). I calculated that at max quality one pixel = 1800 texture reads. Of course it only works this well because the texture is so small it fits in the cache. I would even suggest passing an even smaller noise base through registers (constant buffers); that could help, I guess.
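The octave loop behind that cost can be sketched as follows — a minimal CPU-side sketch in Python, not the original HLSL; the noise table contents, octave count, and gain are illustrative assumptions:

```python
# Tiny tileable "noise texture": a small table sampled with wrapping,
# standing in for the small GPU texture that fits in cache.
NOISE_SIZE = 16
NOISE = [[((x * 374761393 + y * 668265263) ^ 0x5BF03635) % 1024 / 1023.0
          for x in range(NOISE_SIZE)] for y in range(NOISE_SIZE)]

def sample_noise(u, v):
    """Nearest-neighbour sample of the noise table with wrap-around UVs."""
    x = int(u * NOISE_SIZE) % NOISE_SIZE
    y = int(v * NOISE_SIZE) % NOISE_SIZE
    return NOISE[y][x]

def fbm(u, v, octaves=8, lacunarity=2.0, gain=0.5):
    """Fractal sum of noise octaves: each octave multiplies the UV
    frequency (hence the exponentially growing coordinates) and halves
    the amplitude. Each octave costs one texture read, so per-sample
    cost scales linearly with `octaves`."""
    total, amplitude, norm = 0.0, 1.0, 0.0
    for _ in range(octaves):
        total += amplitude * sample_noise(u, v)
        norm += amplitude
        u *= lacunarity
        v *= lacunarity
        amplitude *= gain
    return total / norm  # normalised to [0, 1]
```

With ray-marched clouds the per-pixel cost is roughly march_steps × octaves reads, which is how a figure on the order of 1800 reads per pixel arises.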

That said, in my technique the shader was so complex that on DX9 it was difficult to compile; I would very often exceed the maximum number of registers.

That's because of the presence of loops, and the dependence on variables outside the loops, which creates arrays whose sizes multiply with the nested loops...

Anyway, night, sunset and dawn are handled with a mix of empirical tunings and the effects of aerial perspective (Rayleigh scattering). The empirical part is mainly the intensity of the sunlight, and the sky light that I use as a second source from the top. Normally you would evaluate the irradiance of the sky as a 3D function (encoded in spherical harmonics or a lookup environment map), but a single scalar is more than enough in the sky case because of its uniformity across the hemisphere.
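That two-light setup — a directional sun plus one scalar for the whole sky hemisphere — can be sketched like this (a minimal Python sketch under stated assumptions; the function name and the straight-up sky direction are illustrative, not the original shader):

```python
def shade(normal, sun_dir, sun_intensity, sky_scalar):
    """Two-light model: a directional sun term plus a single scalar
    'sky light' treated as coming from straight up, standing in for a
    full hemispherical irradiance lookup. Vectors are (x, y, z) tuples
    with y up, assumed normalised."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    n_dot_l = max(0.0, dot(normal, sun_dir))            # sun contribution
    n_dot_up = max(0.0, dot(normal, (0.0, 1.0, 0.0)))   # sky-from-top contribution
    return sun_intensity * n_dot_l + sky_scalar * n_dot_up
```

The single `sky_scalar` replaces what would otherwise be an SH or envmap evaluation, which is justified here by the sky's uniformity across the hemisphere.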

The strength of the technique is that a change in light direction really changes the volume impression we get from the clouds. It doesn't use scattering formulas, though; it's already slow enough :) Instead it uses empirical formulas.

About the parameters: there are literally tens of them (~50), but I reduced them to 10 "master" parameters that drive the others through empirical ramps that I adjusted to look "cute". So the user can move the sun, change the time of day, or change the amount of coverage, and in any combination it remains controlled and a good compromise. It took days to adjust.
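The master-parameter idea can be sketched as a set of piecewise-linear ramps: one user-facing parameter drives several derived ones. The control points below are illustrative placeholders, not the author's tuned values:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def ramp(points, t):
    """Piecewise-linear ramp: `points` is a sorted list of (t, value)
    control points; values clamp outside the covered range."""
    if t <= points[0][0]:
        return points[0][1]
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t <= t1:
            return lerp(v0, v1, (t - t0) / (t1 - t0))
    return points[-1][1]

def derive(coverage):
    """One hypothetical master 'coverage' parameter driving two derived
    parameters through hand-adjusted ramps."""
    density   = ramp([(0.0, 0.0), (0.5, 0.3), (1.0, 1.0)], coverage)
    sharpness = ramp([(0.0, 0.9), (1.0, 0.2)], coverage)
    return {"density": density, "sharpness": sharpness}
```

With ~10 such masters each driving several slaves, the ~50 raw parameters stay in a hand-tuned, mutually consistent region no matter how the user combines them.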


Lightness1024:

Most impressive.

About how many man-hours of development time do you estimate have been spent on the system so far?

>> About the parameters: there are literally tens of them (~50), but I reduced them to 10 "master" parameters that drive the others through empirical ramps that I adjusted to look "cute". So the user can move the sun, change the time of day, or change the amount of coverage, and in any combination it remains controlled and a good compromise. It took days to adjust.

Understandable. I had to write a small simulation routine that ran the weather engine for 300 game years to make sure it didn't go into terminal global warming or cooling. That and the world map generator were probably the hardest things to make in the game (well, avian AI has been a pain too). The weather engine required two weeks to balance — and it's only about two screens of code! That weather engine will drive the clouds.

I've been able to glean quite a bit of info from your description: baking, constant limits, all those HLSL issues, etc. Stuff a good engineer ought to keep in mind / look out for when pursuing such a design. Most helpful. Thanks again.

Norm Barrows

Rockland Software Productions

"Building PC games since 1989"

rocklandsoftware.net

PLAY CAVEMAN NOW!

http://rocklandsoftware.net/beta.php

In 2006 I wrote a little weather engine with a small state machine. It could interpolate between sets of uniforms that the artists saved as presets, presented as "weather conditions" in the interface; then a randomizer would build a sequence using a probability function evaluated with a special "distance" function between weather states. (That made some conditions far from others and thus rarely chosen as following each other — e.g. snowy would rarely come right after bright and clear.)
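That distance-weighted transition idea can be sketched as follows — the preset names, parameter space, and weighting function are illustrative assumptions, not the original engine's data:

```python
import random

# Hypothetical weather presets: each is a point in a small parameter
# space (cloud cover, temperature), standing in for a saved uniform set.
PRESETS = {
    "bright_clear": (0.05, 0.9),
    "overcast":     (0.80, 0.5),
    "rainy":        (0.90, 0.4),
    "snowy":        (0.85, 0.1),
}

def distance(a, b):
    """'Distance' between two weather states in parameter space."""
    return sum((x - y) ** 2 for x, y in zip(PRESETS[a], PRESETS[b])) ** 0.5

def next_state(current, rng=random):
    """Pick the next condition with probability decreasing with distance,
    so e.g. snowy rarely follows bright_clear."""
    candidates = [s for s in PRESETS if s != current]
    weights = [1.0 / (0.1 + distance(current, s)) for s in candidates]
    return rng.choices(candidates, weights=weights, k=1)[0]
```

Interpolating the uniform sets between the chosen states over time then gives smooth visual transitions between conditions.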

It was during an internship at Etranges Libellules. The 6-year non-disclosure has expired, so I can say all I want :)

Here's a picture:

[shot2-soir.png]

In terms of rendering it is super primitive: Perlin noise for the clouds, applied on a single flat layer, with a smoothstep (Hermite) applying a contrast curve to the cloud's alpha to vary the density. All of that is driven by the weather engine — but like I said, a very small and unimpressive one.
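The smoothstep contrast trick can be sketched like this; the edge mapping from density to threshold is an illustrative assumption, not the original engine's tuning:

```python
def smoothstep(edge0, edge1, x):
    """Hermite interpolation, as in the HLSL/GLSL smoothstep intrinsic:
    clamps x to [edge0, edge1] and eases with 3t^2 - 2t^3."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def cloud_alpha(noise_value, density):
    """Contrast curve on the cloud alpha: higher density lowers the
    threshold, so a wider band of noise values reads as cloud."""
    threshold = 1.0 - density
    return smoothstep(threshold - 0.1, threshold + 0.1, noise_value)
```

Animating `density` from the weather engine sweeps the threshold through the fixed noise layer, making clouds thicken or dissolve without regenerating the noise.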

But at the time I was super happy with my sky dome technique. It worked without any shader: just a few dome layers rendered with additive blending, with the color gradients adjusted using reference photos at different moments of the day. And yay, no scattering cost to pay! But of course, no fog/aerial perspective on objects...

For the aforementioned, much more recent and better-looking technique, the effort would be around 50 days * 8 hours (so ~400 man-hours).

>> But at the time I was super happy of my sky dome technique, it worked without any shader, just a few layers of dome layers rendered with additive blending, and the correct color gradient adjustments with help of reference photos at different moments of the day, and yay ! no cost of scattering to pay. but of course, no fog/aerial perspective on objects...

The gradient effects are excellent for no shaders.

>> For the aforementioned, and also much more recent technique and better looking, the man hour would be around 50 days * 8 hours. (so ~400 man hours.)

That would be about a month at my current schedule — probably more like 6 weeks in reality. More time than I ought to be spending on just clouds; I still have to do the final models and animations for 100-150 animals, and all the audio. The title has been in development for about 13 months now, and I hope to have it completed in another 2 or 3 months. I'm doing the final graphics last, hence my interest in techniques for drawing "final graphics" clouds (as opposed to placeholders).

>> In 2006 I wrote a little weather engine with a little state machine, and it was capable of interpolating between sets of uniforms that the artists could save as presets and presented as "weather condition" in the interface, then a randomizer would make a sequence with a probability function that was evaluated using a special "distance" function between the weather state. (that made that some condition were far from other and thus rarely chosen as following each other. like snowy would rarely come after bright clear)
>> It was an internship at Etranges Libellules. the 6 years non disclosure is expired so I can say all I want

In Caveman v3.0, the weather engine models the rate of change of barometric pressure, warming from daylight, and warming from time of year. From these it derives barometric pressure, cloud cover, wind speed, wind direction, precipitation, water table, and flooding.
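A derivation like that could be sketched as below. This is a hypothetical illustration in the spirit described — the formulas, constants, and function name are all assumptions, not Caveman v3.0's actual model:

```python
import math

def weather_step(pressure, dp_dt, hour, day_of_year):
    """Hypothetical sketch: derive weather variables from barometric
    pressure, its rate of change, warming from daylight, and warming
    from time of year. All constants are illustrative."""
    pressure += dp_dt                                    # integrate pressure trend
    day_warm = math.sin(math.pi * hour / 24.0)           # warming from daylight
    season_warm = math.sin(2 * math.pi * (day_of_year - 80) / 365.0)  # time of year
    temperature = 10.0 + 15.0 * season_warm + 8.0 * day_warm
    # Low pressure -> more cloud cover, clamped to [0, 1].
    cloud_cover = max(0.0, min(1.0, (1013.0 - pressure) / 30.0))
    # Falling pressure plus thick cloud -> precipitation.
    precipitation = cloud_cover > 0.7 and dp_dt < 0
    return {"pressure": pressure, "temperature": temperature,
            "cloud_cover": cloud_cover, "precipitation": precipitation}
```

Driving a few primary quantities and deriving the rest is what makes a two-screens-of-code engine feasible while still producing plausible long-run behaviour.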


The Dobashi technique responds to humidity. I implemented a prototype of it in ~10-15 hours, with cube rendering (one cube per voxel) — very slow and ugly. But the technique is good at generating interesting evolutions and shapes, and it's totally controllable. Local humidity, pressure, temperature... can all be fed from your weather engine.
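The core of Dobashi et al.'s method is a boolean cellular automaton over three voxel fields: humidity, phase-transition activation, and cloud. The transition rules below follow the published automaton; the grid representation and neighbourhood are a minimal sketch, and the extinction/regeneration probabilities of the full paper are omitted:

```python
def dobashi_step(hum, act, cld):
    """One step of the Dobashi-style cloud cellular automaton over 3D
    boolean grids (nested lists indexed [z][y][x]):
      hum' = hum AND NOT act        (humidity consumed by activation)
      act' = NOT act AND hum AND f  (activation spreads from neighbours)
      cld' = cld OR act             (activation becomes cloud)"""
    nz, ny, nx = len(hum), len(hum[0]), len(hum[0][0])

    def f_act(z, y, x):
        # True if any face neighbour is currently activated.
        for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            z2, y2, x2 = z + dz, y + dy, x + dx
            if 0 <= z2 < nz and 0 <= y2 < ny and 0 <= x2 < nx and act[z2][y2][x2]:
                return True
        return False

    new_hum = [[[hum[z][y][x] and not act[z][y][x] for x in range(nx)]
                for y in range(ny)] for z in range(nz)]
    new_cld = [[[cld[z][y][x] or act[z][y][x] for x in range(nx)]
                for y in range(ny)] for z in range(nz)]
    new_act = [[[not act[z][y][x] and hum[z][y][x] and f_act(z, y, x)
                 for x in range(nx)] for y in range(ny)] for z in range(nz)]
    return new_hum, new_act, new_cld
```

Because humidity is just a boolean field you seed, it is natural to feed it (plus the paper's probabilistic extinction terms) from a weather engine's local humidity, pressure and temperature — which is what makes the technique so controllable.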

There is a journal from Ysaneya (Flavien Brebion) on this technique on GameDev, in an old log entry. It was also shown off at GDC 2008, and there is a demo here:

http://www.simul.co.uk/truesky/multicore

It totally uses Dobashi (a 13-year-old paper), and frankly it is barely impressive once you know the technique. But the rendering is innovative and there are some ideas to be taken from it.

The time I spent on the fractal technique included a lot of fine tuning, polishing, optimizing, and hardening for broad use cases. That may not count, depending on the professionalism of the project. In any case, the man-hours can be drastically reduced with a development mindset focused on time rather than quality.

>> The time I took to make the fractal technique was a lot of fine tuning, polishing, optimizing and robustification towards large use cases. This may not come into account depending on the professionalism of the project. In any case, the man hour can totally be drastically reduced using a development spirit focused on time rather than quality.


Instead of building the best graphics engine I can and then squeezing as much game as I can into the remaining clock cycles, I build the game first, then squeeze the best graphics I can into the remaining clock cycles — a "GAMES are for playing, PICTURES are for looking at" approach. My games usually have complex scenes and a lot of simulation going on behind the graphics (think flight sims and Total War vs Unreal Tournament). So much so that (hold on to your hats, folks...) I actually run them at 15 fps instead of 30 fps, so I get 66 milliseconds per frame to do stuff instead of just 33. Way back when (1980s), when it came time to decide on a frame rate, testing showed 15 fps to be the slowest frame rate that didn't produce input lag.

My life is an exercise in trying to shoehorn too much game into too little computer. I have a saying: "Put a Cray on every user's desk, and I'll build you a REAL game". In my current project, I can't even use skinned-mesh bone animation; it's too slow for the number of targets I need to draw (I'm shooting for 150-200 at once; the original version from 2000 could do 50 targets in real-time FPS combat with little or no slowdown). Because of all this, I usually have to settle for graphics that require less horsepower than state-of-the-art graphics do. I simply don't have the clock cycles to spare, unless I raise the system requirements (which I try to keep as low as possible for maximum compatibility). So usually I'll do the best I can with the clock cycles that can be spared — hence my question about drawing times. Development time is seldom an issue. I do the graphics last so they don't become obsolete while I'm doing the rest.

I was about to license Rend386 when Microsoft bought the company, took it off the market, broke it, then released it as DirectX 1.0 a year later. This forced me to write my own perspective-correct texture-mapped poly engine. It took some time, but it was what was required in the way of graphics code, so I learned how to do it, and did it. Just like everything else. And today I learn clouds! <g>


Lightness1024:

So I take it that your overall approach is to procedurally texture-map the skybox: the textures are generated images of 3D clouds, and you update the skybox at about 2 fps, correct?

And the trick is the algorithm used to generate the textures, eh?

Cloud motion in your system is motion of the 3D images within the textures, not the textures themselves moving, correct?

So the skybox is like a big movie screen for displaying your 3D cloud skyscapes.


Yes, you are correct :)

This topic is closed to new replies.
