
Piyush Verma

  1. Piyush Verma

    Texture Masking for Pseudo-Lens Flares

    Thanks a lot for explaining it in detail. I think I should be able to implement it based on the info you provided. I was actually planning on implementing something that is used in games/engines currently and would be a bit more flexible. That said, I think both techniques have their own uses depending on the scenario, so I'll probably implement both eventually. Thanks again for putting in the time to explain the solutions.
  2. Piyush Verma

    Texture Masking for Pseudo-Lens Flares

    @Hodgin I'm not really aware of how SSBOs work yet, but I'll definitely take a look at that. Regarding the other approach you mentioned, wouldn't it be expensive to calculate the vector between the sun and the center of the camera and then render sprites based on that vector? Also, would using a uniform buffer object for the sprites help with performance?

    Also, if I understand the GPU technique correctly: we find the bright spots and calculate the flare geometry (positions) in a shader, write that data into the SSBO, and then render the flares from it? Please feel free to correct me if I've interpreted your solution incorrectly.

    Thanks a lot for the suggestions. :)
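    For reference, the SSBO idea described above might look roughly like this in GLSL. This is a sketch with my own names (FlarePositions, flareCount, flarePos are assumptions, not from the thread): one pass appends detected bright-spot positions to a buffer, and the flare draw reads them back without any CPU round trip.

    ```glsl
    // Hypothetical sketch of the SSBO approach (requires GLSL 4.30+).
    // Pass 1: a bright-pass shader appends detected spots to this buffer.
    layout(std430, binding = 0) buffer FlarePositions {
        uint flareCount;     // atomically incremented as spots are found
        vec2 flarePos[64];   // screen-space UVs of detected bright spots
    };

    // Inside the bright-pass shader:
    // if (luminance > threshold) {
    //     uint i = atomicAdd(flareCount, 1u);
    //     if (i < 64u) flarePos[i] = screenUV;
    // }

    // Pass 2: the flare sprite's vertex shader reads flarePos[gl_InstanceID]
    // and places each instance along the line from the spot through screen
    // center, so the positions never leave the GPU.
    ```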
  3. I've implemented John Chapman's pseudo lens flare in my OpenGL project and the result looks somewhat like this:

    What I'm trying to figure out is how I can use a simple hexagonal or circular texture to mask each lens flare "ghost" to make it look somewhat more like this:

    Here's a fragment shader snippet where I'm calculating the lens flare ghosts and adding them to the sun shafts:

    vec3 texture2DDistorted(sampler2D Texture, vec2 TexCoord, vec2 Direction, vec3 Distortion)
    {
        return vec3(
            texture2D(Texture, TexCoord + Direction * Distortion.r).r,
            texture2D(Texture, TexCoord + Direction * Distortion.g).g,
            texture2D(Texture, TexCoord + Direction * Distortion.b).b
        );
    }

    // Calculate lens flare ghosts (this is inside main())
    texCoord = vec2(1.0) - uv;
    vec2 texelSize = 1.0 / vec2(textureSize(lightScene, 0));
    vec3 Distortion = vec3(-texelSize.x * distortion, -texelSize.y * distortion, texelSize.x * distortion);
    vec2 ghostVec = (vec2(0.5) - texCoord) * dispersal;
    direction = normalize(ghostVec);
    vec3 result = vec3(0.0);
    for (int i = 0; i < ghosts; i++)
    {
        vec2 offset = fract(texCoord + ghostVec * float(i));
        float weight = length(vec2(0.5) - offset) / length(vec2(0.5));
        result += texture2DDistorted(lightScene, offset, direction, Distortion) * weight;
    }
    // Radial gradient of 1D rainbow color texture
    result *= texture(lensColor, length(vec2(0.5) - texCoord) / length(vec2(0.5))).rgb;

    I add the result to the final color value in the shader. Is there a way to mask the color value of these ghosts with the hexagon texture so that they look shaped more like an actual lens flare rather than just blurred-out blobs? I feel like it should be pretty straightforward, but at the same time I'm pretty much stumped about how to do it.
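    One hedged idea (my own sketch, not Chapman's method): since each ghost is just a re-sampled copy of the bright pass, you can stamp the hexagon over the bright pass itself, centered on the sun's known screen position, and every ghost then inherits the hexagon shape instead of the raw blob. The names hexMask, sunScreenPos, and maskRadius below are assumptions.

    ```glsl
    // Sketch: shape the bright pass with a hexagon sprite centered on the sun,
    // so the ghost loop produces hexagonal ghosts automatically.
    // 'hexMask', 'sunScreenPos' and 'maskRadius' are hypothetical names.
    uniform sampler2D hexMask;   // white hexagon on black, centered in the texture
    uniform vec2 sunScreenPos;   // sun position in [0,1] screen UVs
    uniform float maskRadius;    // half-size of the hexagon in screen UVs

    vec3 sampleBright(sampler2D src, vec2 uv) {
        // Map the neighborhood of the sun into the mask's [0,1] UV space.
        vec2 maskUV = (uv - sunScreenPos) / (2.0 * maskRadius) + vec2(0.5);
        float shape = texture(hexMask, clamp(maskUV, vec2(0.0), vec2(1.0))).r;
        return texture(src, uv).rgb * shape;
    }
    // Then use sampleBright(lightScene, ...) in place of the raw
    // texture2D(lightScene, ...) lookups inside texture2DDistorted.
    ```

    If there can be multiple bright sources, this per-source stamping no longer works, which is where the sprite-based approach discussed above becomes more attractive.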
  4. Piyush Verma

    Triangle Strip and Perlin Noise.

    Sorry about the late reply as well. So you're saying my first and last vertices would actually be dummy values. When the first line-adjacency primitive is sent to the geometry shader, it arrives as [v0, v1, v2, v3] where v0 is the dummy. Is that correct?

    Also, I looked a bit into Perlin noise for vertices. What I couldn't figure out is how to animate the vertices with the noise so that it looks smooth. For that, whatever noise is at v0 in the current frame, I want the same noise to be at v1 the next frame, v1's noise to go to v2, and so on, with the start and end vertices getting new noise values each frame. That way I think it would look like a smooth animation.
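    The "v0's noise moves to v1 next frame" behavior described above falls out naturally if, instead of storing noise per vertex, you sample a noise function at a sliding offset in the vertex shader. A sketch under assumptions: snoise stands in for any GLSL simplex/Perlin implementation you compile in, and bandParam is a hypothetical per-vertex parameter along the strip.

    ```glsl
    // Vertex-shader sketch: animate band vertices by sliding a noise window.
    // 'snoise' is any GLSL simplex-noise implementation linked into the shader;
    // 'bandParam' is a per-vertex [0,1] parameter along the strip (hypothetical).
    uniform mat4 mvp;
    uniform float time;
    uniform float noiseScale;   // spatial frequency of the wobble
    uniform float amplitude;    // displacement distance

    in vec3 position;
    in vec3 normal;
    in float bandParam;

    void main() {
        // Sampling at (bandParam * noiseScale - time) makes the pattern travel
        // along the strip: the value near v0 this frame appears near v1 later,
        // so no per-vertex noise upload is needed.
        float n = snoise(vec2(bandParam * noiseScale - time, 0.0));
        vec3 displaced = position + normal * n * amplitude;
        gl_Position = mvp * vec4(displaced, 1.0);
    }
    ```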
  5. Piyush Verma

    Triangle Strip and Perlin Noise.

    Thanks for replying! I think I get it now about moving the bands along the earth. About the lines_adjacency primitive, I haven't used it before, so I have very little experience with it. Do I have to specify the adjacent vertices in the point list through an index buffer, or, by specifying GL_LINES_ADJACENCY, will OpenGL itself give me 4 vertices per primitive?
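    For reference, with GL_LINES_ADJACENCY each primitive consumes four consecutive vertices from the draw (or from the index buffer if you use one), and the geometry shader receives them as gl_in[0..3], where gl_in[0] and gl_in[3] are the adjacency vertices. A minimal sketch (halfWidth is a hypothetical uniform):

    ```glsl
    // Geometry shader consuming line-adjacency primitives: gl_in[1] and gl_in[2]
    // are the segment endpoints; gl_in[0] and gl_in[3] are the neighbors,
    // available e.g. for mitering the band edges.
    layout(lines_adjacency) in;
    layout(triangle_strip, max_vertices = 4) out;

    uniform float halfWidth;   // hypothetical band half-width in clip space

    void main() {
        vec4 a = gl_in[1].gl_Position;   // segment start
        vec4 b = gl_in[2].gl_Position;   // segment end

        // Perpendicular in screen space; a quad is emitted for the segment.
        vec2 dir  = normalize(b.xy / b.w - a.xy / a.w);
        vec4 side = vec4(-dir.y, dir.x, 0.0, 0.0) * halfWidth;

        gl_Position = a - side; EmitVertex();
        gl_Position = a + side; EmitVertex();
        gl_Position = b - side; EmitVertex();
        gl_Position = b + side; EmitVertex();
        EndPrimitive();
    }
    ```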
  6. Piyush Verma

    Triangle Strip and Perlin Noise.

    Also, from what I understand, geometry shaders see the data of a single triangle, not the whole mesh. Is it possible to calculate or manipulate vertex normals in the geometry shader for smooth shading, like shown in the image in the OP?
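    For context on what a geometry shader can see (a sketch, not a full solution): with a plain triangles input you only get the one primitive, which is enough to compute a flat face normal; true smooth shading needs neighbor information, e.g. via triangles_adjacency or normals precomputed in the vertex data. The worldPos varying below is a hypothetical per-vertex output from the vertex shader.

    ```glsl
    // Sketch: flat face normal from a single-triangle geometry shader input.
    layout(triangles) in;
    layout(triangle_strip, max_vertices = 3) out;

    in vec3 worldPos[];   // hypothetical world-space position from the VS
    out vec3 faceNormal;

    void main() {
        vec3 n = normalize(cross(worldPos[1] - worldPos[0],
                                 worldPos[2] - worldPos[0]));
        for (int i = 0; i < 3; i++) {
            faceNormal = n;   // same normal for all 3 vertices => flat shading
            gl_Position = gl_in[i].gl_Position;
            EmitVertex();
        }
        EndPrimitive();
    }
    ```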
  7. Piyush Verma

    Triangle Strip and Perlin Noise.

    I think I get it, more or less. About moving the bands along the earth: do you mean that as the earth rotates, the bands should rotate around the poles relative to it too?
  8. Hi folks,

    I am working on an Earth simulation in OpenGL/GLSL and I've gotten pretty far so far. I was trying to find a way to render aurora borealis at the north and south poles of my planet, and I found this blog where the author has done something similar to what I've been trying to do. Specifically, I'm trying to imitate the following image:

    In his version of the aurora borealis, he creates 2D bands (most probably using triangle strips) and animates them in real time using Perlin noise. I have been trying to achieve this, but the only way I can think of is to:

    1. Generate the mesh once for a band.
    2. Generate noise each frame for each band and update the vertices.
    3. Generate normals for each band using the updated vertices.
    4. Send the new vertices and normals to the GPU via an OpenGL call.
    5. Rinse and repeat.

    This method seems super expensive. I was hoping someone could suggest a way to introduce the noise to the vertices inside the vertex shader itself and update the normals based on the noise we just introduced.

    I apologize that I don't have any code to show for the bands right now, since I am still brainstorming the topic and trying to come up with an optimal solution to the problem.

    I could use some ideas and would really appreciate them.
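    The per-frame CPU steps listed above can be moved to the GPU. A sketch under assumptions (not from the blog; snoise stands in for any GLSL noise implementation you compile in, and the band is treated roughly as a height field over its rest surface): displace in the vertex shader and rebuild the normal from finite differences of the same noise field.

    ```glsl
    // Vertex-shader sketch: displace band vertices with noise and approximate
    // the new normal with finite differences, so nothing is re-uploaded per frame.
    // 'snoise' is any GLSL simplex-noise implementation linked into the shader.
    uniform mat4 mvp;
    uniform float time;
    uniform float amplitude;

    in vec3 position;
    in vec3 normal;       // undisplaced surface normal
    out vec3 vNormal;

    float height(vec3 p) {
        return snoise(vec2(p.x + time, p.z)) * amplitude;
    }

    void main() {
        const float eps = 0.01;
        vec3 displaced = position + normal * height(position);

        // Finite differences of the noise field tilt the rest normal;
        // this assumes the band behaves like a local height field.
        float hx = height(position + vec3(eps, 0.0, 0.0)) - height(position);
        float hz = height(position + vec3(0.0, 0.0, eps)) - height(position);
        vNormal = normalize(normal - vec3(hx / eps, 0.0, hz / eps));

        gl_Position = mvp * vec4(displaced, 1.0);
    }
    ```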
  9. Ah got it! I'll try it out when I'm done adding other features in this project! Thanks for the suggestion! :) 
    Sadly, taking the difference of the two rotations and rotating the uv coord didn't work either. But the method that IYP suggested worked:

    Although I had to subtract the value instead of adding it, and apply it to the uv coord instead, that may be because of the direction in which I'm rotating the sphere. It works for now. Also, you mentioned that this method won't work in the case of a directional light. Could you explain how I would implement that so that the directional light always takes into account the angle at which the shadow is cast?

    Thanks again for the help
  11. I am trying to do an earth simulation in OpenGL with GLSL shaders, and so far it's been going decently, although I am stuck on a small problem. Right now I have 3 spheres: one for ground level (earth), one for clouds, and a third for the atmosphere (scattering effects). The earth sphere handles most of the textures.

    The cloud sphere is slightly bigger than the earth sphere and is mapped with a cloud texture, normal mapped using a map created with the Photoshop plugin. One more thing to point out: the rotation speed of the cloud sphere is slightly greater than the rotation speed of the earth sphere.

    This is where things get confusing for me. I am trying to cast the shadow of the clouds onto the ground (earth) sphere by passing the cloud texture into the earth sphere's shader and subtracting the cloud's color from the earth's color. But since the rotation speeds of the two spheres are different, I figured that if I multiplied the rotation matrix of the cloud sphere with the uv coordinates for the cloud texture, that should solve the problem. Sadly, the shadows and the clouds do not rotate in sync. I was hoping someone could help me figure out the math to make the shadows and the clouds rotate in sync, no matter how different the rotation speeds of the two spheres are.
Here is my fragment shader for the earth, where I'm calculating the cloud's shadow:

#version 400 core

uniform sampler2D day;
uniform sampler2D bumpMap;
uniform sampler2D night;
uniform sampler2D specMap;
uniform sampler2D clouds;
uniform mat4 cloudRotation;

in vec3 vPos;
in vec3 lightVec;
in vec3 eyeVec;
in vec3 halfVec;
in vec2 texCoord;

out vec4 frag_color;

void main()
{
    vec3 normal = 2.0 * texture(bumpMap, texCoord).rgb - 1.0;
    //normal.z = 1 - normal.x * normal.x - normal.y * normal.y;
    normal = normalize(normal);

    vec4 spec = vec4(1.0, 0.941, 0.898, 1.0);
    vec4 specMapColor = texture2D(specMap, texCoord);

    vec3 L = lightVec;
    vec3 N = normal;
    vec3 Emissive = normalize(-vPos);
    vec3 R = reflect(-L, N);
    float dotProd = max(dot(R, Emissive), 0.0);
    vec4 specColor = spec * pow(dotProd, 6.0) * 0.5;
    float diffuse = max(dot(N, L), 0.0);

    vec2 cloudTexCoord = vec2(cloudRotation * vec4(texCoord, 0.0, 1.0));
    vec3 cloud_color = texture2D(clouds, cloudTexCoord).rgb;

    vec3 day_color = texture2D(day, texCoord).rgb * diffuse + specColor.rgb * specMapColor.g - cloud_color * 0.25; // * (1 - cloud_color.r) + cloud_color.r * diffuse;
    vec3 night_color = texture2D(night, texCoord).rgb * 0.5; // * (1 - cloud_color.r) * 0.5;

    vec3 color = day_color;
    if (dot(N, L) < 0.1)
        color = mix(night_color, day_color, (diffuse + 0.1) * 5.0);

    frag_color = vec4(color, 1.0);
}

Here's a sample output of the above shader. Note that the shadows start out at the correct position, but due to the wrong rotation speed they move ahead of the cloud sphere's rotation.

Again, it would be really helpful if anyone could help me figure out the math behind keeping the shadow and the clouds in sync. Thanks in advance.
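    Since both spheres spin about the same axis and the spheres use equirectangular mapping, one sketch of a fix (my own suggestion, not from the thread; rotationDelta is a hypothetical uniform): offset only the u coordinate of the cloud lookup by the relative rotation between the two spheres, converted to texture space. The sign depends on the rotation direction, as noted elsewhere in this thread.

    ```glsl
    // Sketch: offset the cloud shadow lookup in longitude by the *relative*
    // rotation between the spheres. 'rotationDelta' is a hypothetical uniform:
    // (cloudAngle - earthAngle) in radians, updated on the CPU each frame.
    uniform float rotationDelta;

    const float TWO_PI = 6.28318530718;
    // One full revolution of relative rotation wraps the texture once in u.
    vec2 cloudTexCoord = vec2(fract(texCoord.x + rotationDelta / TWO_PI), texCoord.y);
    vec3 cloud_color = texture(clouds, cloudTexCoord).rgb;
    ```

    This replaces the 4x4 cloudRotation multiply, which rotates in 3D object space and so does not map cleanly onto 2D uv coordinates.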
  12. Probably a bit late on replying, but I was able to solve it by implementing texture management from scratch in my engine. Somewhere down the line, I was managing textures wrong on the C++ side and that was causing this issue. I wrote a dedicated TextureManager class to handle textures and it did the trick for me. 
  13. Yup I do. The first line in my render function is glUseProgram(program); I'll post the whole function when I get back from work...