Cloud billowing and texture generation

I've included a demo exe that shows the cloud + sky system running. It's only a sky dome with a cloud texture plastered on it; the cloud texture is generated at run time using noise and hardware smoothing.

The first issue is the skydome. Usually when I think of billowing, I think of interpolating between two cloud textures. But the sky dome I implemented has screwed-up texture coordinates at the north and south poles (when you start the exe, north is directly in front of you), and it actually looked like the clouds were changing shape because the texture coordinates are very rugged in those areas. So I was wondering whether anyone has tried applying that kind of texture-coordinate ruggedness to an entire skydome and how you did it, or whether anyone has an algorithm for doing it. The current skydome creation code looks like this:

        int idx = 0;

        for( int phi = 0; phi <= 90 - mDPhi; phi += mDPhi ) 
        {
            for( int theta = 0; theta <= 360 - mDTheta; theta += mDTheta ) 
            {
                // Sines and cosines for the four corners of this quad.
                float sinPhi = std::sin( D3DXToRadian( phi ) );
                float cosPhi = std::cos( D3DXToRadian( phi ) );

                float sinTheta = std::sin( D3DXToRadian( theta ) );
                float cosTheta = std::cos( D3DXToRadian( theta ) );

                float sinPhiPD = std::sin( D3DXToRadian( phi + mDPhi ) );
                float cosPhiPD = std::cos( D3DXToRadian( phi + mDPhi ) );

                float sinThetaPD = std::sin( D3DXToRadian( theta + mDTheta ) );
                float cosThetaPD = std::cos( D3DXToRadian( theta + mDTheta ) );

                verts[idx].x = mRadius * sinPhi * cosTheta;
                verts[idx].y = mRadius * sinPhi * sinTheta * yscale;
                verts[idx].z = mRadius * cosPhi;

                {
                    // Spherical mapping: wrap the texture around the dome
                    // using the normalized vertex direction.
                    D3DXVECTOR3 v( verts[idx].x, verts[idx].y, verts[idx].z );
                    D3DXVec3Normalize( &v, &v );

                    verts[idx].tu0 = hTile * (float)( std::atan2( v.x, v.z ) / (D3DX_PI * 2) ) + 0.5f;
                    verts[idx].tv0 = vTile * (float)( std::asin( v.y ) / D3DX_PI ) + 0.5f;
                }

                idx++;

                verts[idx].x = mRadius * sinPhiPD * cosTheta;
                verts[idx].y = mRadius * sinPhiPD * sinTheta * yscale;
                verts[idx].z = mRadius * cosPhiPD;
                
                {
                    D3DXVECTOR3 v( verts[idx].x, verts[idx].y, verts[idx].z );
                    D3DXVec3Normalize( &v, &v );

                    verts[idx].tu0 = hTile * (float)( std::atan2( v.x, v.z ) / (D3DX_PI * 2) ) + 0.5f;
                    verts[idx].tv0 = vTile * (float)( std::asin( v.y ) / D3DX_PI ) + 0.5f;
                }

                idx++;

                verts[idx].x = mRadius * sinPhi * cosThetaPD;
                verts[idx].y = mRadius * sinPhi * sinThetaPD * yscale;
                verts[idx].z = mRadius * cosPhi;
                
                {
                    D3DXVECTOR3 v( verts[idx].x, verts[idx].y, verts[idx].z );
                    D3DXVec3Normalize( &v, &v );

                    verts[idx].tu0 = hTile * (float)( std::atan2( v.x, v.z ) / (D3DX_PI * 2) ) + 0.5f;
                    verts[idx].tv0 = vTile * (float)( std::asin( v.y ) / D3DX_PI ) + 0.5f;
                }

                idx++;

                if( phi > -90 && phi < 90 ) 
                {
                    verts[idx].x = mRadius * sinPhiPD * cosThetaPD;
                    verts[idx].y = mRadius * sinPhiPD * sinThetaPD * yscale;
                    verts[idx].z = mRadius * cosPhiPD;
                    
                    {
                        D3DXVECTOR3 v( verts[idx].x, verts[idx].y, verts[idx].z );
                        D3DXVec3Normalize( &v, &v );

                        verts[idx].tu0 = hTile * (float)( std::atan2( v.x, v.z ) / (D3DX_PI * 2) ) + 0.5f;
                        verts[idx].tv0 = vTile * (float)( std::asin( v.y ) / D3DX_PI ) + 0.5f;
                    }

                    idx++;
                }
            }

            // The reason for this loop is explained in the skydomes pdf at spheregames.com
            for( int i = 0; i < mNumVerts - 2; ++i ) 
            { 
                if( verts[i + 0].tu0 - verts[i + 1].tu0 > 0.9f ) verts[i + 1].tu0 += 1.0f;
                if( verts[i + 1].tu0 - verts[i + 0].tu0 > 0.9f ) verts[i + 0].tu0 += 1.0f;
                if( verts[i + 0].tu0 - verts[i + 2].tu0 > 0.9f ) verts[i + 2].tu0 += 1.0f;
                if( verts[i + 2].tu0 - verts[i + 0].tu0 > 0.9f ) verts[i + 0].tu0 += 1.0f;
                if( verts[i + 1].tu0 - verts[i + 2].tu0 > 0.9f ) verts[i + 2].tu0 += 1.0f;
                if( verts[i + 2].tu0 - verts[i + 1].tu0 > 0.9f ) verts[i + 1].tu0 += 1.0f;
                if( verts[i + 0].tv0 - verts[i + 1].tv0 > 0.8f ) verts[i + 1].tv0 += 1.0f;
                if( verts[i + 1].tv0 - verts[i + 0].tv0 > 0.8f ) verts[i + 0].tv0 += 1.0f;
                if( verts[i + 0].tv0 - verts[i + 2].tv0 > 0.8f ) verts[i + 2].tv0 += 1.0f;
                if( verts[i + 2].tv0 - verts[i + 0].tv0 > 0.8f ) verts[i + 0].tv0 += 1.0f;
                if( verts[i + 1].tv0 - verts[i + 2].tv0 > 0.8f ) verts[i + 2].tv0 += 1.0f;
                if( verts[i + 2].tv0 - verts[i + 1].tv0 > 0.8f ) verts[i + 1].tv0 += 1.0f; 
            }
        }
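For comparison, one commonly suggested way to dodge the pole problem entirely is to skip the atan2/asin mapping and project the cloud texture straight down onto the dome instead. The following is just a minimal sketch of that idea, not something from the demo; it assumes the same vertex layout, and that z is the dome's up axis, which is what the phi/theta construction above produces.

    #include <d3dx9.h>

    // Hypothetical alternative to the UV blocks above: planar ("top down") projection.
    // u/v come from the x/y ground-plane position instead of the direction angles,
    // so there is no pinch or seam at the zenith. hTile/vTile tile the clouds as before.
    inline void PlanarCloudUV( const D3DXVECTOR3& pos, float radius,
                               float hTile, float vTile, float& tu, float& tv )
    {
        tu = hTile * ( pos.x / ( 2.0f * radius ) + 0.5f );
        tv = vTile * ( pos.y / ( 2.0f * radius ) + 0.5f );
    }

With a mapping like this the wrap-around fix-up loop above isn't needed either; the trade-off is that the texture stretches near the horizon, which is usually hidden with fog or by fading the clouds out toward the horizon.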
I got the dome-generation code above from a paper I found on the net; you can get it at this site. Anyway, how would I get the rugged effect and fix the insane amount of texture-coordinate screwiness at the poles? (You'll know what I mean once you run the demo app.)

The other issue is the cloud texture you see in the program. The way I generate it is as follows:

- Make N noise textures, each a power of two larger in size than the last.
- Fill each texture with colors between MinIntensity and MaxIntensity ( RGBA[diff,diff,diff,diff] ).
- Use the hardware to smooth each texture a bit onto another texture surface. SmoothingAmount determines how much bigger to blow up each noise texture onto itself (2 means no blow-up).
- Render each noise texture onto a final texture. The largest noise texture is rendered first with BlendAmount of alpha, then the next smaller one is rendered onto the previous one with BlendAmount of alpha, and so on.
- Take the final texture and render it onto itself using a MODULATE argument, so the texture is effectively squared.
- Make a quad with a grey color ( RGBA(128,128,128) ) and render the squared texture onto yet another texture with a SUBTRACT operation between the squared texture's color and the quad's vertex color.

I end up with three textures that I use: the NormalTexture, the SquaredTexture, and the SubtractedTexture. The textures don't usually turn out very nice, though; they look very blurry and bloated. Are there any steps I can add to or remove from this process to improve them? Also, these are the parameters used to initialize the cloud texture you see in the demo app:

 tex.Init( GetGfxPtr(),
           5,        // num octaves
           512,      // texture size
           0xffffff, // cloud color
           2.0f,     // amount of smoothing
           120,      // blend amount (how much of the previous octave is applied to the next)
           1,        // minimum pixel color (noise frequency minimum)
           100,      // max pixel color (noise frequency maximum)
           1 );      // use source color? (renders with SRCCOLOR if true and DESTCOLOR if false)
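
On the blurriness: the modulate-and-subtract passes are doing roughly the same job as the usual exponential "cloud cover / sharpness" curve, and applying that curve directly to the blended octaves tends to give much crisper results. Here is a minimal CPU-side sketch of that remapping (cover and sharpness are illustrative names, not parameters of the Init call above):

    #include <algorithm>
    #include <cmath>
    #include <vector>

    // Remap summed octave noise into a cloud layer with the classic
    // "cover / sharpness" exponential curve. 'noise' holds one byte per texel
    // (the blended octaves). cover removes thin haze; sharpness (< 1) controls
    // how hard the cloud edges are.
    std::vector<unsigned char> ShapeClouds( const std::vector<unsigned char>& noise,
                                            float cover = 80.0f, float sharpness = 0.95f )
    {
        std::vector<unsigned char> out( noise.size() );
        for( size_t i = 0; i < noise.size(); ++i )
        {
            float c = std::max( 0.0f, (float)noise[i] - cover );          // subtract the cover
            float density = 255.0f * ( 1.0f - std::pow( sharpness, c ) ); // exponential falloff
            out[i] = (unsigned char)density;
        }
        return out;
    }

Raising cover clears more of the sky; pushing sharpness toward 1 gives soft, hazy clouds, while lower values give hard-edged puffs.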

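As for the billowing mentioned at the top: interpolating between two cloud textures works regardless of how the dome is mapped. Assuming a Direct3D 9 fixed-function setup (pDevice, texA and texB are placeholders here, not objects from the demo), the crossfade can be done in a single pass using the texture factor:

    #include <d3d9.h>

    // Crossfade two cloud textures over the same UV set in one pass.
    // t in [0,1] is the interpolation factor; at t = 0 you see texA, at t = 1 texB.
    void SetCloudBlend( IDirect3DDevice9* pDevice,
                        IDirect3DTexture9* texA, IDirect3DTexture9* texB, float t )
    {
        DWORD alpha = (DWORD)( t * 255.0f );
        pDevice->SetRenderState( D3DRS_TEXTUREFACTOR, D3DCOLOR_ARGB( alpha, 255, 255, 255 ) );

        // Stage 0: just pass the first cloud texture through.
        pDevice->SetTexture( 0, texA );
        pDevice->SetTextureStageState( 0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1 );
        pDevice->SetTextureStageState( 0, D3DTSS_COLORARG1, D3DTA_TEXTURE );
        pDevice->SetTextureStageState( 0, D3DTSS_ALPHAOP,   D3DTOP_SELECTARG1 );
        pDevice->SetTextureStageState( 0, D3DTSS_ALPHAARG1, D3DTA_TEXTURE );

        // Stage 1: lerp between stage 0's result and the second texture,
        // using the alpha of the texture factor set above as the blend weight.
        pDevice->SetTexture( 1, texB );
        pDevice->SetTextureStageState( 1, D3DTSS_TEXCOORDINDEX, 0 ); // reuse the same UVs
        pDevice->SetTextureStageState( 1, D3DTSS_COLOROP,   D3DTOP_BLENDFACTORALPHA );
        pDevice->SetTextureStageState( 1, D3DTSS_COLORARG1, D3DTA_TEXTURE );
        pDevice->SetTextureStageState( 1, D3DTSS_COLORARG2, D3DTA_CURRENT );
        pDevice->SetTextureStageState( 1, D3DTSS_ALPHAOP,   D3DTOP_BLENDFACTORALPHA );
        pDevice->SetTextureStageState( 1, D3DTSS_ALPHAARG1, D3DTA_TEXTURE );
        pDevice->SetTextureStageState( 1, D3DTSS_ALPHAARG2, D3DTA_CURRENT );

        pDevice->SetTextureStageState( 2, D3DTSS_COLOROP, D3DTOP_DISABLE );
    }

Animate t from 0 to 1, then swap texB into texA, generate a fresh noise texture for texB, and reset t; the clouds keep slowly changing shape without any UV tricks.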
| TripleBuffer Software |
| Plug-in Manager :: System Information Class :: C++ Debug Kit :: DirectX Tutorials :: Awesome Books |
aliak.net

