Nature in computer graphics

Published September 07, 2007 by Shaun Kichenbrand, posted by Myopic Rhino

Abstract

One of the largest areas of research in computer graphics deals with natural phenomena. Almost every computer game, animated film and physics application simulates nature in one form or another. This paper aims to educate the reader on the different facets of nature that need to be taken into account and summarises recent developments concerning each aspect. These facets include water, rain, clouds and mirages, amongst others.

Introduction

In the dawn of computer graphics, little attention was given to nature because it was simply too complex to simulate on any machine. Nature is too multi-dimensional for us to fully grasp, whereas manmade objects were designed with machines and can therefore be drawn with machines. But as our understanding of nature and the computing power available to us grow, we are finding ways to simulate the vastness and perfect chaos of nature. All of this is achieved through models that simplify nature, and when using these models it is important to remember that the only constraint we place on them is that the result should seem real. The intricacies of nature manifest themselves in various places such as fire, illumination, water, weather and wind. This paper does not deal with fire or wind, as the former is a vast topic on its own and the latter cannot truly be rendered per se. The structure of the paper is as follows: in Section 2 we discuss trees. Section 3 looks at water and the way it behaves. This sets the basis for considering weather effects, the subject of Section 4, followed by atmospheric phenomena in Section 5. Finally, we draw conclusions in Section 6 and consider future efforts.

Trees

Excluding water, trees are probably the most intricate objects to create realistically. Let us look at the three main tasks involved in creating trees.

Modeling

The tree modeling process can be subdivided into modeling the trunk and modeling the leaves. The trunk, including all the branches, may be modeled with one of three methods: cellular automata, fractals or L-systems. L-systems, as introduced by Lindenmayer[1], are probably the most widely used today. The two major methods of modeling leaves are spline definition and polygon definition. Spline definition, as in Lintermann et al.[2], allows easier definition of curved surfaces but lacks the jagged edges that can be produced with polygons, as in Deussen et al.[3].
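To make the rewriting idea behind L-systems concrete, here is a minimal string-rewriting sketch. The axiom, production rule and turtle alphabet are illustrative assumptions, not taken from [1]: F means draw forward, + and - turn the turtle, and the brackets push and pop the turtle state to start and end a branch.

```python
# Minimal L-system string-rewriting sketch: the axiom is rewritten in
# parallel for a fixed number of iterations, producing a turtle-graphics
# command string for a simple branching skeleton.

def expand_lsystem(axiom, rules, iterations):
    """Apply the production rules to every symbol in parallel, per iteration."""
    current = axiom
    for _ in range(iterations):
        current = "".join(rules.get(symbol, symbol) for symbol in current)
    return current

if __name__ == "__main__":
    rules = {"F": "F[+F]F[-F]F"}  # a common illustrative bracketed rule
    print(expand_lsystem("F", rules, 2))  # the string a turtle renderer would draw
```

A turtle interpreter then walks the resulting string to emit the trunk and branch geometry.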

Rendering

The most important part of rendering trees is illuminating them. This is also the most intricate facet, on account of the countless thousands of leaves and the ridged textures of bark. María et al.[4] propose a polar-plane based method which delivers excellent results at relatively low computational cost. The sky is rendered as a triangle-mesh hemisphere, as in Preetham et al.[5], with a light source placed at each vertex. A subset of leaf orientations and their maximum illuminations are pre-computed, and orientations that have not been calculated are estimated by Shepard-like interpolation of the four closest neighbors. The luminance of a leaf depends on its maximum luminance, calculated by establishing which sky-sector projects onto that orientation, and on an occlusion factor, calculated by finding the amount of light that is blocked by other leaves.
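The Shepard-like interpolation step can be illustrated with a small inverse-distance-weighting sketch; the function below is a generic formulation, and the sample layout, distance metric and power parameter are assumptions rather than the authors' implementation.

```python
import math

def shepard_interpolate(query, samples, power=2.0):
    """Inverse-distance-weighted (Shepard) estimate at `query` from
    (position, value) pairs, e.g. pre-computed leaf orientations and
    their maximum illuminations."""
    weights = []
    total = 0.0
    for position, value in samples:
        d = math.dist(query, position)
        if d == 0.0:
            return value  # the query matches a pre-computed orientation exactly
        w = 1.0 / d ** power
        weights.append((w, value))
        total += w
    return sum(w * v for w, v in weights) / total
```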

Animation

Basic tree animation is simulated by modeling a branch and its leaves as a billboard that sways according to external forces. This method, while physically incorrect, fools the eye extremely well from a medium distance and still looks relatively realistic from up close.
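As a rough illustration of the billboard approach (an assumption about a typical implementation, not a method from a cited paper), the sway can be driven by a phase-shifted sine per branch and applied only to the upper vertices of the billboard:

```python
import math

def sway_offset(time_s, phase, amplitude=0.05, frequency=0.8):
    """Horizontal displacement for the top vertices of a branch billboard;
    the bottom vertices stay fixed so the branch appears to pivot at its base."""
    return amplitude * math.sin(2.0 * math.pi * frequency * time_s + phase)
```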

Water

Water, as stated above, is the most difficult natural object to handle. Not only does it move in extremely complex ways but it also refracts and reflects light both internally and externally.

Early Methods

The earliest attempts at rendering water used bump mapping (Blinn[6]), height maps created by perturbing flat surfaces (Schachter[7]) and ray tracing (Newell[8]). These early techniques all lacked two important abilities of water and, more specifically, of waves: they could not interact or cast shadows. Later, these techniques were enhanced by the introduction of particle systems (Reeves[9]) and the development of algorithms to simulate the interaction between liquids and solids.

Recent Methods

The introduction of reflection, refraction and caustics algorithms in the early nineties finally completed the basis for rendering water. Recent works combine particles and textures with fast solvers for the differential equations governing fluid motion. Premoze et al.[10] also include the whitecaps that form where waves break.

Rain

Rain rendering methods fall into two main categories: particle-based and physically-based. The first yields higher frame rates whereas the second aims at physical correctness. Rousseau et al.[11] introduce a method to realistically render rain in real time. The shape of a raindrop (in polar coordinates) is given by:

r(θ) = a(1 + C0·cos(0θ) + ... + C10·cos(10θ)),

where Cn is the shape coefficient for the nth cosine distortion (which depends on the drop radius), obtained from Chuang et al.[12], and θ is the polar elevation from the center of the drop. The speed of a raindrop, obtained by interpolating data from [12], may be estimated rather accurately by:

s(r) = 0.35r^3 - 3.2r^2 + 9.5r - 0.1,

where r is the radius of the drop and s is given in m/s. Rousseau et al. also calculate the Fresnel factor to show that only the outermost 10% of a raindrop will show any reflections; their method therefore neglects this computation except near light sources. The drop is mapped with a texture that has been captured from the screen, flipped and distorted.
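The two formulas above can be evaluated directly. In the sketch below the shape coefficients are placeholders (the real, radius-dependent values come from [12]), and the radius in the speed fit is assumed to be in millimetres, since the unit is not stated:

```python
import math

SHAPE_COEFFS = [0.0] * 11  # placeholders for C0..C10; radius-dependent in [12]

def drop_shape(theta, a, coeffs=SHAPE_COEFFS):
    """Distance from the drop centre at polar elevation `theta` (radians)."""
    return a * (1.0 + sum(c * math.cos(n * theta) for n, c in enumerate(coeffs)))

def terminal_speed(radius_mm):
    """Terminal speed in m/s from the cubic fit quoted above."""
    r = radius_mm
    return 0.35 * r ** 3 - 3.2 * r ** 2 + 9.5 * r - 0.1
```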

Clouds

Another integral part of any natural scene is the presence of clouds. In Liao et al.[13] a method is proposed that uses cellular automata, as in Dobashi et al.[14], modified to allow simulation at run-time.

Modeling

Clouds can be modeled using particle systems[9], metaball volumes[14] or image-based modeling[15]. The metaball approach is used in [13].

Rendering

The two common techniques for rendering clouds are ray tracing (Kajiya et al.[16]) and procedural texturing (Ebert et al.[17]), but both are time consuming. Most rendering methods use two-pass schemes in which illumination and light dynamics are taken into account on the first pass and the final image is created on the second. In [13] a preprocessing scheme is used to compute shadow relation tables (SRT) and a metaball lighting texture database (MLTDB). The clouds are finally rendered in back-to-front order by traversing an octree. The texture for each voxel is obtained by multiplying the appropriate entry from the MLTDB by the voxel density.

Animation

The clouds in [13] are animated with algorithms that calculate cloud-to-vapor and extinction probabilities. The main part of the simulation relies on the rules defined in the cellular automaton.
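As a sketch of what such cellular-automaton rules look like, a single cell can be updated as follows. The boolean fields and transitions follow the commonly cited Dobashi-style formulation, and the probabilities are invented for illustration, not values from [13] or [14]:

```python
import random

def step_cell(hum, act, cld, act_neighbors, p_ext=0.05, p_hum=0.05, p_act=0.01):
    """One update of a cloud-automaton cell. hum/act/cld are booleans
    (humidity, activation, cloud); act_neighbors are the neighbours' act flags."""
    grow = any(act_neighbors)
    new_hum = hum and not act
    new_cld = cld or act
    new_act = (not act) and hum and grow
    # Stochastic extinction and re-seeding keep the cloud field evolving.
    if random.random() < p_ext:
        new_cld = False
    if random.random() < p_hum:
        new_hum = True
    if random.random() < p_act:
        new_act = True
    return new_hum, new_act, new_cld
```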

Atmospheric phenomena

Although nature contains a theoretically infinite number of phenomena, we choose to focus only on those that occur due to the non-homogeneous nature (varying refraction indices) of the atmosphere. Gutierrez et al.[18] discuss several of these: inferior mirages, superior mirages, the Vikings' end of the world, the green flash, the Fata Morgana and the Novaya Zemlya effect. To render these effects properly, [18] considers three things. Firstly, Fermat's principle is used to describe the path of light through the atmosphere, estimated with fast numerical methods. Secondly, an accurate model of the atmosphere is used (the US Standard Atmosphere 1976 temperature profile). Finally, the APM, developed for [18], recreates the conditions necessary for the effects to occur.

Simulation

Gutierrez et al.[18] describe three underlying parts of the simulation algorithm.

Light Trajectory

This is calculated by extensive use of Snell's law:

n1·sin(θ1) = n2·sin(θ2),

where ni is the index of refraction of medium i and θi is the angle of incidence in medium i. The index of refraction, in turn, is calculated with:

ni = c / vi,

where vi is the speed of light in medium i and c is the speed of light in a vacuum.
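A minimal sketch of how a light path bends when stepped through discrete atmospheric layers with Snell's law is shown below. The layer indices are made-up illustrative values, and [18] actually integrates Fermat's principle with fast numerical methods rather than stepping through layers:

```python
import math

def refract_angle(n1, n2, theta_i):
    """Angle of refraction from Snell's law; returns None when the ray
    is totally internally reflected."""
    s = n1 * math.sin(theta_i) / n2
    if abs(s) > 1.0:
        return None
    return math.asin(s)

# Trace a near-horizontal ray upward through layers whose index of
# refraction decreases slightly with height (illustrative values only).
layers = [1.000293, 1.000290, 1.000287, 1.000284]
theta = math.radians(80.0)
for n1, n2 in zip(layers, layers[1:]):
    theta = refract_angle(n1, n2, theta)
    if theta is None:  # ray bent back towards the ground
        break
```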

Accurate atmospheric model

The 1976 US Standard Atmosphere[19] is used, which defines average temperature and pressure at different altitudes. Gutierrez et al.[18] obtain the refraction indices by first calculating the density:

ρ(h) = P(h)·M / (R·T(h)),

where P is pressure, M is the mean molecular mass, R is the gas constant and T is temperature. The refraction index is then calculated with the Gladstone-Dale[20] formula:

n(h, λ) = ρ(h)·(n(λ) - 1) + 1,

where λ is the wavelength of the light.
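These two relations can be combined in a few lines. The gas constant and molar mass below are standard values, while the reference density and sea-level index used to normalise the Gladstone-Dale scaling are assumptions of this sketch rather than the exact formulation in [18]:

```python
R = 8.314462       # J/(mol*K), universal gas constant
M_AIR = 0.0289644  # kg/mol, mean molar mass of dry air

def air_density(pressure_pa, temperature_k):
    """rho = P * M / (R * T) for an ideal gas."""
    return pressure_pa * M_AIR / (R * temperature_k)

def refraction_index(rho, rho_ref=1.225, n_ref=1.000293):
    """Gladstone-Dale style scaling of (n - 1) with density, normalised
    against a reference (sea-level) density."""
    return (rho / rho_ref) * (n_ref - 1.0) + 1.0
```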

De-standardization

Finally, [18] obtains a de-standardized version of the atmosphere by using inversion layers, hot spots and noise grids.

Rendering

The combination of all these elements has allowed [18] to produce exceptionally good results with respect to realism and physical correctness.

Conclusion

This paper has presented the reader with a selection of issues that need to be taken into account when realistically rendering natural scenes. Using some sophisticated models it is indeed possible to recreate natural scenes rather accurately. In summary, this paper has educated the reader on what needs to be done rather than on how to do it.

Future Work

A lot of research is currently going into multiprocessor support for image generation, which is already exploited to some degree in [11]. Concerning raindrops, work continues on the simulation of drop collision and reflection. In tree rendering, much work is still being devoted to the correct modeling of leaves and the dynamics of light through their semi-opaque surfaces. On the subject of atmospheric phenomena, research is being done on the independence of refraction indices to aid in animating the effects. Finally, water dynamics are still a long way from perfect, even though much progress is being made. Perhaps we should just accept that nature was never truly meant to be fully understood? Or perhaps tomorrow will hold another revelation and another, faster implementation.

References

[1] Lindenmayer A. Developmental systems without cellular interaction, their languages and grammars, Parts i and ii. Journal of Theoretical Biology 1971;30:455-84.

[2] Lintermann B, Deussen O. Interactive modeling of plants. IEEE Computer Graphics and Applications 1999;19(1): 56-65.

[3] Deussen O, Colditz C, Stamminger M, Drettakis G. Interactive visualization of complex plant ecosystems. In: Proceedings of the IEEE visualization conference, IEEE; 2002.

[4] María J. Vincent, Vicente Rosell, Roberto Vivó. A polar-plane based method for natural illumination of plants and trees. Computers & Graphics 2005;29:203-208.

[5] Preetham J, Shirley P, Smits B. A practical analytic model for daylight. In: Computer Graphics Proceedings 1999, SIGGRAPH'99, p. 91-100.

[6] J.F. Blinn, Simulation of wrinkled surfaces, in: Proceedings of SIGGRAPH'78, Comput. Graph. 12 (3) (1978) 286-292.

[7] B. Schachter, Long crested wave models, Comput. Graph. Image Process. 12 (1980) 187-201.

[8] M.E. Newell, The Utilization of Procedure Models in Digital Image Synthesis, Ph.D. Thesis, University of Utah, Salt Lake City, UT, 1975.

[9] W.T. Reeves, Particle systems--a technique for modeling a Class of fuzzy objects, in: Proceedings of SIGGRAPH'83, Comput. Graph. 17 (3) (1983) 359-376;W.T. Reeves, Particle systems--a technique for modeling a class of fuzzy objects, ACM Trans. Graph. 2 (2) (1983) 91-108.

[10] S. Premoze, M. Ashikhmin, Rendering natural waters, in: Proceedings of Pacific Graphics'00, 2000, pp. 23-30; S. Premoze, M. Ashikhmin, Rendering natural waters, Comput. Graph. Forum 20 (4) (2001) 189-199.

[11] Pierre Rousseau, Vincent Jolivet, Djamchid Ghazanfarpour. Realistic real-time rain rendering. Computers & Graphics 2006;30:507-518.

[12] Ross ON. Optical remote sensing of rainfall micro-structures. Master's thesis, Freie Universität Berlin; 2000. In partnership with University of Auckland.

[13] Horng-Shyang Liao, Tan-Chi Ho, Jung-Hong Chuang, Cheng-Chung Lin. Fast rendering of dynamic clouds. Computers & Graphics 2005;29:29-40.

[14] Dobashi Y, Kaneda K, Yamashita H, Okita T, Nishita T. A simple, efficient method for realistic animation of clouds. In: Proceedings of the SIG-GRAPH'00, 2000. p. 19-28.

[15] Dobashi Y, Nishita T, Yamashita H, Okita T. Modeling of clouds from satellite images using metaballs. In: Proceedings of the sixth Pacific conference, 1998. p. 53-60.

[16] Kajiya, JT, Herzen BPV. Ray tracing volume densities. In: Proceedings of the SIGGRAPH'84, 1984. p. 165-74.

[17] Ebert DS. Procedural volumetric cloud modeling and animation. SIGGRAPH'00 Course Notes 2000;25(5):1-55.

[18] Diego Gutierrez, Francisco J. Seron, Adolfo Munoz, Oscar Anson. Simulation of atmospheric phenomena. In: Computers & Graphics

[19] USGPC, US Standard Atmosphere. United State Government Printing Office, Washington, DC; 1976.

[20] Gladstone J, Dale J. On the influence of temperature on the refraction of light. Philosophical Transactions 1858;148:887.
