Pikkolini

Members
  • Content count

    20
  • Joined

  • Last visited

Community Reputation

150 Neutral

About Pikkolini

  • Rank
    Member
  1. Dynamic Sky Dome

    My sky dome is of course in a static position relative to the camera :wink: The problem is that when I look straight up, the clouds appear very big. As my view moves toward the horizon, the clouds get smaller due to perspective. Now, when I create my texture, I create it so that I can do a normal UV mapping on a sphere. The problem with that is that I don't get any perspective: the clouds are always the same size, regardless of their position on the dome, and it looks like my clouds are crashing vertically into the ground. My question is: how can I achieve correct perspective?
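A common way to get that perspective (my suggestion, not an answer taken from this thread): instead of UV-mapping the dome directly, intersect each view direction with a virtual flat cloud plane above the camera and sample the cloud texture at the intersection point. The division by the ray's vertical component then makes clouds shrink toward the horizon automatically. A minimal Python sketch of just the mapping; `plane_height` and `scale` are made-up illustrative values:

```python
def cloud_uv(direction, plane_height=1000.0, scale=1.0 / 8000.0):
    """Map a normalized view direction (y up) on the sky dome to
    texture coordinates on a virtual flat cloud plane plane_height
    above the camera.  The division by the vertical component makes
    t grow toward the horizon, so clouds there cover a larger area
    of the texture and appear smaller, giving correct perspective."""
    x, y, z = direction
    y = max(y, 1e-4)        # clamp to avoid division by zero at the horizon
    t = plane_height / y    # distance along the ray to the cloud plane
    return (x * t * scale, z * t * scale)

# straight up: small uv footprint, clouds look large overhead
print(cloud_uv((0.0, 1.0, 0.0)))
# near the horizon: t explodes, uv moves fast, clouds compress
print(cloud_uv((0.99, 0.05, 0.0)))
```

In a fragment shader this collapses to one line per fragment, along the lines of `uv = dir.xz * (planeHeight / max(dir.y, eps)) * scale`.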
  2. Dynamic Sky Dome

    Hey everyone,   I want to create a sky dome with a dynamically generated cloud texture. Creating a planar cloud structure and mapping it correctly onto a sphere isn't the problem; the real issue is that my clouds point vertically down at the horizon. I'm missing the effect of clouds getting smaller in the distance (in the case of a sky dome, this means nearer to the horizon). I tried flattening the dome so it's no longer a sphere, and it looked a bit better, but this is still not a satisfying solution. So I think I have to change my texture in some way, but I don't know how. Any ideas? My current cloud texture is generated with fractal Brownian motion.   Thanks in advance!
  3. Sorry for the late response, but I was busy with holidays :D First of all I tried the blending method, and it looked awful. [attachment=31985:clouds_blended.png] So I tried the method with the higher-dimensional torus as planned. For that I used the 6D simplex noise of the Accidental Noise Library. The results look good enough, but somehow I get some strange artifacts: regular lines at a 45° angle. I suppose the noise method is responsible for that, but I can't work out how, since the artifacts are hard to see. I attached four screenshots which show my problem fairly well. I hope someone has an idea what exactly causes this. If you wish to see some code, just say so. [attachment=31981:clouds_artifacts1.png][attachment=31982:clouds_artifacts2.png][attachment=31983:clouds_artifacts3.png][attachment=31984:clouds_artifacts4.png]
  4. Thank you for the awesome replies :D I think I will try the higher-dimensional approach first. Just out of curiosity, do you know any other 5D or 6D implementations of simplex noise for C/C++? I couldn't find any with Google, not even the one you posted in this thread.
  5. Hello everyone,   I have a library which provides functions for 1/2/3/4D simplex noise. With that I successfully created a tileable 2D cloud (fBm) texture (see How do you generate tileable Perlin noise?). Now I want to create a tileable 3D cloud volume. It only has to be tileable in two dimensions, of course. How can I achieve this? Do I need noise of a higher dimension than 4D?
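For reference, here is the usual construction behind the higher-dimensional trick (a sketch of the coordinate mapping only; the actual 5D/6D noise call, e.g. from a library like ANL, is left out). Every axis that should tile is mapped onto a circle, so a 3D volume that tiles in x and y needs two circles plus the free z axis, i.e. 5D noise (6D also works, with one redundant dimension). The parameter names below are illustrative:

```python
import math

def torus_coords(x, y, z, period_x=1.0, period_y=1.0, radius=1.0):
    """Map a point that should tile in x and y onto 5-D 'torus'
    coordinates: each periodic axis becomes a circle, and the free
    axis (z) passes straight through.  Feeding these coordinates to
    any 5-D (or 6-D) noise function makes the result seamless in x
    and y by construction, since the input itself is periodic."""
    ax = 2.0 * math.pi * x / period_x
    ay = 2.0 * math.pi * y / period_y
    return (radius * math.cos(ax), radius * math.sin(ax),
            radius * math.cos(ay), radius * math.sin(ay),
            z)

# the mapping is exactly periodic: x and x + period give identical coords
a = torus_coords(0.25, 0.6, 0.3)
b = torus_coords(1.25, 0.6, 0.3)
print(all(abs(p - q) < 1e-9 for p, q in zip(a, b)))
```

The seam disappears because the noise never sees the texture border at all, only a closed loop through its input space.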
  6. Thanks for the answer. It really cleared up a few things for me and I was able to continue with this project. I have now finished the creation of the height field and, surprisingly, it really looks like ocean waves, and the waves move in the right direction :D But I still have one problem which I couldn't solve after a few hours of debugging, and I still don't know where to look for its origin. The amplitudes of my waves are way too high. If I use the proposed values of L=2000 and V=600, my amplitudes go up to ~10000. If I change the constant A in the spectrum equation to 0.0001, my amplitudes still go up to ~1000. I really don't know why my waves are so huge. Does anyone have an idea where to search for the error, or what could cause it?
  7. Hello everyone,   I'm currently trying to implement an FFT wave simulation as proposed in Jerry Tessendorf's paper "Simulating Ocean Water". I read a lot of papers, articles, tutorials, etc. but never understood the whole process, so I just started working on my own implementation, hoping to gain a better understanding along the way. Well, that didn't work out so well, so I hope you can help me with a few questions. For simplicity I will always refer to this blog post in my questions.
    1. What is the big L in the equation for k? The article just says "size of the height field", but what exactly does this mean?
    2. The sum sign in the height equation is where I use my FFT, right?
    3. h_tilde and exp(ikx) are both complex numbers, but my height should obviously be a real number. Do I just use the real part of the equation in the end, or is there anything else I don't understand?
    These three questions should be enough. I promise I have more once these are answered. Thank you in advance!
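To make the three questions concrete, here is a tiny self-contained sketch (my own illustration, not code from the paper or the blog). L is the side length of the ocean patch in world units, so the wavevector for integer index n is k = 2πn/L; the sigma is exactly the sum that an inverse FFT evaluates (written as a naive loop below for clarity); and the height comes out real, so taking the real part is safe, provided the spectrum obeys the conjugate symmetry h̃(-k) = conj(h̃(k)):

```python
import cmath
import math
import random

def height_field(h_tilde, L, N=4):
    """Evaluate h(x) = sum_k h_tilde(k) * exp(i k . x) on an N x N patch.

    h_tilde: dict mapping integer indices (n, m) -> complex amplitude.
    L is the 'big L': the patch size in world units, so the wavevector
    is k = (2*pi*n/L, 2*pi*m/L).  The sum over k is exactly what an
    inverse FFT computes; here it is a naive double loop for clarity.
    The imaginary parts cancel (up to rounding) when h_tilde satisfies
    h_tilde(-k) = conj(h_tilde(k)), so the real part is the height."""
    heights = []
    for j in range(N):
        for i in range(N):
            x = (i * L / N, j * L / N)
            h = 0j
            for (n, m), amp in h_tilde.items():
                k = (2.0 * math.pi * n / L, 2.0 * math.pi * m / L)
                h += amp * cmath.exp(1j * (k[0] * x[0] + k[1] * x[1]))
            heights.append(h)
    return heights

# build a tiny spectrum with the required symmetry h(-k) = conj(h(k))
random.seed(1)
spec = {}
for n, m in [(1, 0), (0, 1), (1, 1)]:
    a = complex(random.gauss(0, 1), random.gauss(0, 1))
    spec[(n, m)] = a
    spec[(-n, -m)] = a.conjugate()

hs = height_field(spec, L=2000.0)
print(max(abs(h.imag) for h in hs) < 1e-9)  # heights are real
```

Note the sketch has no physics in it (no Phillips spectrum, no dispersion); it only demonstrates the roles of L, the sum, and the real part.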
  8. Your second assumption fits very well; sorry for my incomprehensible wording. I already tried all your ideas and they help a bit, but they don't solve my biggest problem. The dotted pattern is created by neighboring pixels of which some rays hit the depth buffer and others don't. It doesn't matter how I do the color sampling for the pixels whose ray hits when the neighboring pixels have nothing to sample.
  9. Hey, I implemented a 2D ray tracer for screen-space reflections in order to give a water surface a more realistic look. For performance reasons I only use one ray with 16 samples. This inevitably leads to a dotted pattern on the surface. The roughness of the surface slightly hides this problem, but not enough. Now here is my constraint: I only want to use one render pass, so no post-process blur. Are there any techniques to reduce the dotted pattern in this case?
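One common remedy under that single-pass constraint (my suggestion, not something proposed in this thread) is to jitter each ray's start offset per pixel with a cheap screen-space noise such as interleaved gradient noise. The undersampling error then turns into unstructured grain instead of a regular dotted pattern, and surface roughness masks grain much better than banding. The constants below are the published IGN magic numbers; the shader-side use would be something like `rayStart += rayDir * stepSize * ign(gl_FragCoord.xy)`:

```python
import math

def interleaved_gradient_noise(px, py):
    """Jimenez's interleaved gradient noise: a cheap, stable
    per-pixel pseudo-random value in [0, 1) computed only from
    the integer pixel coordinates, with good spread between
    neighbouring pixels."""
    f = 0.06711056 * px + 0.00583715 * py
    return (52.9829189 * (f - math.floor(f))) % 1.0

# neighbouring pixels get well-spread offsets instead of one shared one
vals = [interleaved_gradient_noise(x, y) for y in range(4) for x in range(4)]
print(min(vals), max(vals))
```

Because the function is pure arithmetic on the pixel coordinate, it costs almost nothing per fragment and needs no extra texture or pass.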
  10. Hello everyone!   I recently implemented an atmosphere renderer using Nishita's method as described in this blog. Everything looks gorgeous, but there is one thing which puzzles me. The colors of the environment are all calculated just fine, but I always assume that the sun's color is pure white. This works fine during the day, but during sunsets or sunrises my assumption leads to bright white objects while the sky is red. So I need to adjust the sun color with respect to the position of the sun. Can I somehow use Nishita's atmospheric scattering algorithm to calculate the light color? Or are there other fast methods to calculate a realistic light color?   Thanks in advance!
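Regarding computing the sun color from the same scattering setup (a sketch under my own assumptions, using the common sea-level Rayleigh coefficients and ignoring Mie for brevity): the direct sun color is essentially the transmittance of the atmosphere along the observer-to-sun ray, i.e. Beer's law applied to the accumulated optical depth, which a Nishita-style renderer already computes for its in-scattering integral:

```python
import math

# Rayleigh scattering coefficients at sea level (per metre, RGB) and
# the geometry constants typical Nishita-style implementations use
BETA_R = (5.8e-6, 13.5e-6, 33.1e-6)
EARTH_R = 6360e3       # planet radius
ATMO_R = 6420e3        # atmosphere top
H_R = 8000.0           # Rayleigh scale height

def sun_transmittance(sun_elevation_rad, samples=64):
    """Attenuation of direct sunlight reaching a ground observer.

    March from the observer toward the sun, accumulate the optical
    depth of the exponential density profile, then apply Beer's law
    per colour channel.  A low sun means a long slant path, so blue
    is attenuated far more than red and the result turns orange."""
    origin = (0.0, EARTH_R)
    d = (math.cos(sun_elevation_rad), math.sin(sun_elevation_rad))
    # ray/sphere intersection with the atmosphere top (origin inside)
    b = origin[0] * d[0] + origin[1] * d[1]
    c = origin[0] ** 2 + origin[1] ** 2 - ATMO_R ** 2
    t_exit = -b + math.sqrt(b * b - c)
    ds = t_exit / samples
    depth = 0.0
    for i in range(samples):
        t = (i + 0.5) * ds
        p = (origin[0] + t * d[0], origin[1] + t * d[1])
        h = math.hypot(p[0], p[1]) - EARTH_R
        depth += math.exp(-h / H_R) * ds
    return tuple(math.exp(-beta * depth) for beta in BETA_R)

noon = sun_transmittance(math.radians(90.0))    # close to white
sunset = sun_transmittance(math.radians(2.0))   # red kept, blue nearly gone
print(noon)
print(sunset)
```

Multiplying the shading of lit objects by this tuple instead of pure white should fix the bright-white-objects-at-sunset issue, since the sun color and the sky color then come from the same atmosphere model.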
  11. Lighting of 2D clouds

    Thanks for the link, but I decided to try it on my own first after I got an idea. My idea is that the alpha value of a pixel also represents the height of the cloud at this pixel. With this assumption I tried to raycast this fake volume like I did with 3D clouds. You can't transfer the code 1:1, of course, but this is my result:

    #version 440

    out vec4 fragColor;

    layout(binding = 0) uniform sampler2D cloudTexture;

    uniform vec3 lightDirection;
    uniform int numSamples;
    uniform float maxHeight;
    uniform float emptiness;
    uniform float absorption;

    in vec2 texCoords;

    const float maxDist = 1.73205080757; // sqrt(3)

    void main()
    {
        vec3 rayStart = vec3(texCoords, 0);
        //float t = -(dot(rayStart, planeNormal) - maxHeight) / dot(lightDirection, planeNormal);
        float t = maxHeight / lightDirection.z;
        vec3 rayStop = rayStart + t * lightDirection;

        float travel = distance(rayStart, rayStop);
        float stepSize = maxDist / float(numSamples);
        vec3 Step = normalize(rayStop - rayStart) * stepSize;
        vec3 pos = rayStart;
        float T = 1.0;

        for (int i = 0; i < numSamples && travel > 0.0; ++i, pos += Step, travel -= stepSize)
        {
            float texData = texture(cloudTexture, pos.xy).a;
            float alpha = (texData - emptiness) / emptiness;
            float height = alpha * maxHeight;
            if (pos.z > height)
                continue;
            T *= 1.0 - absorption * stepSize;
            if (T <= 0.01)
                break;
        }

        float texData = texture(cloudTexture, texCoords).a;
        float alpha = clamp((texData - emptiness) / emptiness, 0, 1);
        fragColor.rgb = vec3(T);
        fragColor.a = alpha;
    }

    The problem is, this doesn't work. Here are some screenshots which demonstrate my problem (for demonstration purposes the alpha value is always 1 and the absorption factor is way above useful values):   Light direction is parallel to the plane normal. Looks boring but OK: [attachment=26349:cloud1.PNG]   Light direction is slightly rotated. Starts to look unrealistic: [attachment=26350:cloud2.PNG]   Light direction is heavily rotated. Doesn't look like anything anymore: [attachment=26351:cloud3.PNG]   Light direction is nearly orthogonal to the plane normal. This is not what I wanted: [attachment=26352:cloud4.PNG]   So far I haven't found a solution to this problem. Any ideas, or is my approach totally wrong?
  12. Hello everyone,   I had the idea to make a simple cloud layer by rendering a 2D Perlin noise texture and lighting it a bit. To my surprise I couldn't find any resources on how to do the lighting of 2D Perlin noise so that it looks like a cloud layer. I tried the standard lighting with the help of a normal map, but it looked awful. Then I tried a few things like dividing the light factor by the alpha value etc., but nothing looked like a real cloud layer. Does anyone have an idea how to do this? Finding resources for the lighting of 3D Perlin noise was a lot easier.
  13. Lighting of a water surface

      Thanks for this nice piece of code! But I have some problems making it work.

      //Flip normal, dot with LightDir, so flanks which get lit from behind receive some light
      float LD0 = max(0.0, dot(vec3(-1.0,-1.0,1.0)*N, LightDir));

      All my directions are transformed to camera space, so I assume that I have to multiply the flip vector with the ModelViewMat, right? My current attempt:

      //Flip normal, dot with LightDir, so flanks which get lit from behind receive some light
      vec3 flip = (ModelView * vec4(-1.0,-1.0,1.0,0.0)).xyz;
      float LD0 = max(0.0, dot(flip*N, LightDir));

      ______________________________________________________________

      //How much do we look into the light direction?
      float LD1 = max(0.0, dot(-eye, light) * 0.6 + 0.4);

      This value is very often zero. These are the values of LD1 if the sun is in front of me: [attachment=25966:sun_front.PNG] at its highest point: [attachment=25967:sun_top.PNG] behind me: [attachment=25965:sun_back.PNG] Since you multiply LD1 with the other values, the function returns a value near zero in most cases. How do you use the result of this function? My current attempt is

      vec3 color = fresnel * reflectionColor + invfresnel * transmissionColor * computeSSS() + specularLight * pow(cosAlpha, 96);

      but this leads to a very black water surface and doesn't look nice. Screenshot with the attempt to use your function: [attachment=25969:withSSS.PNG] Screenshot before the attempt: [attachment=25968:withoutSSS.PNG]

      return saturate(LD3*LD3*4.0+LD1*0.125);

      OpenGL doesn't have a saturate function. Is this line intended to do the same as

      return clamp(LD3*LD3*4.0+LD1*0.125, 0.0, 1.0);

      ?
  14. Hello everyone,   I want to simulate realistic-looking lighting of a water surface. The normal for each pixel, the light direction and the eye direction are all given. I already implemented the specular part of the lighting successfully, and now I am struggling with the "diffuse" part. If I use the standard method for diffuse lighting, the water surface looks way too dark. I think it is because light in the water is scattered and not just reflected. Now I am searching for a method that only darkens the flank of a wave whose normal is turned away from the light direction. Do you have any keywords or ideas?
  15. Clipping on a wavy surface

    First of all, here are two screenshots to show my problem: [sharedmedia=gallery:images:6130] [sharedmedia=gallery:images:6131]   In the first screenshot the wave is above the clipping plane; in the second it is below it. My general question would be: how can I make it look right? I know that I have to translate the reflection texture to make it flush with the object, and I know that the reflection is either clipped too much or too little. My initial question is: how can I solve the clipping problem? I continued working on this and am now at the following point: I tried to render the surface from the view of the reflection camera and test the depth of this image against the depth of the reflection texture. But what should I do if the depth test fails? There is no way for the shader to find out which color should be used as a replacement. Is another object behind the clipped object, or just the background? I don't know. I hope I could make the topic a bit clearer and REALLY hope that somebody has an idea.