BloodLust666

Member
  • Content Count: 1000
  • Joined
  • Last visited

Community Reputation: 104 Neutral

About BloodLust666

  • Rank: Contributor
  1. BloodLust666

    GLSL Diffuse lighting doesn't work

    The specular reflection isn't the problem. My diffuse lighting isn't working correctly, but I imagine once I fix the diffuse problem, the specular issue should be fixed too. Edit: to clarify, L is my light direction and H is the half vector between the eye and the light direction. Edit 2: yes, all my normals are normalized.
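
    For reference, a minimal Blinn-Phong sketch under the same conventions (L pointing toward the light, H the half vector between the eye and light directions, everything in eye space); the uniform and varying names below are placeholders, not the thread's actual code:

        varying vec3 normal;        // eye-space normal from the vertex shader
        varying vec3 position;      // eye-space position from the vertex shader

        uniform vec3 lightDir;      // eye-space direction toward the light
        uniform vec4 lightAmbient;
        uniform vec4 lightDiffuse;
        uniform vec4 lightSpecular;
        uniform float shininess;

        void main (void)
        {
            vec3 N = normalize(normal);
            vec3 L = normalize(lightDir);    // from the surface toward the light
            vec3 E = normalize(-position);   // from the surface toward the eye
            vec3 H = normalize(L + E);       // Blinn half vector

            float NdotL = max(dot(N, L), 0.0);
            float NdotH = max(dot(N, H), 0.0);

            vec4 Iamb  = lightAmbient;
            vec4 Idiff = lightDiffuse * NdotL;
            // only add specular where the surface actually faces the light
            vec4 Ispec = (NdotL > 0.0) ? lightSpecular * pow(NdotH, shininess) : vec4(0.0);

            gl_FragColor = Iamb + Idiff + Ispec;
        }

    Gating the specular term on NdotL keeps highlights off faces that point away from the light.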
  2. BloodLust666

    GLSL Diffuse lighting doesn't work

    lightPosition is a direction
  3. BloodLust666

    GLSL Diffuse lighting doesn't work

    I'm pretty sure it's in eye space, because before I pass lightPosition to the shader, I get the camera's ModelView matrix and multiply it by the light's world position. If that's eye space, then yes, it's in eye space as well.

        lightPosition = light->GetPosition();
        lightPosition.Rotate(m_camera.GetViewMatrix());
        m_deferredPass2Shader.UpdateGlobalUniformData("lightPosition", cWUniformData(sWVector3(lightPosition)));
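
    For reference, the same transform can also be done in the shader; a minimal sketch, assuming a hypothetical uniform mat4 viewMatrix carrying the camera's view transform and a world-space direction lightDirWorld. Since it is a direction, w is set to 0.0 so the translation part of the matrix is ignored:

        uniform mat4 viewMatrix;     // camera view transform (hypothetical uniform)
        uniform vec3 lightDirWorld;  // world-space light direction

        varying vec3 lightDirEye;    // eye-space direction handed to the fragment shader

        void main (void)
        {
            // w = 0.0: rotate the direction but ignore the view matrix's translation
            lightDirEye = normalize((viewMatrix * vec4(lightDirWorld, 0.0)).xyz);
            gl_Position = ftransform();
        }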
  4. BloodLust666

    GLSL Diffuse lighting doesn't work

    ok, the diffuse is working now; turns out I had a misspelling when I was updating that uniform... BUT now it seems as though the diffuse is very exaggerated. When I look up, nothing is lit (except ambiance), but when I look down everything is lit, including walls that weren't lit when I looked up. Am I doing this correctly? I'm "pretty" sure everything is in eye space... Backing up a bit, I'm doing deferred shading. On my first pass I have:

    Vertex shader

        varying vec2 texCoord;
        varying vec3 normal;
        varying vec4 position;

        void main (void)
        {
            texCoord = gl_MultiTexCoord0.xy;
            normal = gl_NormalMatrix * gl_Normal;
            position = gl_ModelViewMatrix * gl_Vertex;
            gl_Position = ftransform();
        }

    Fragment shader

        varying vec2 texCoord;
        varying vec3 normal;
        varying vec4 position;

        uniform sampler2D texture;

        void main (void)
        {
            gl_FragData[0] = vec4(texture2D(texture, texCoord).rgb, 1.0);
            gl_FragData[1] = vec4(normal, 1.0);
            gl_FragData[2] = vec4(position.xyz, gl_FrontMaterial.shininess);
        }

    As you can see, at this step the color and normals are in eye space. Then in my second pass, I pass in each light and render it as a quad.

    Vertex shader

        varying vec2 texCoord;

        void main (void)
        {
            texCoord = gl_MultiTexCoord0.xy;
            gl_Position = ftransform();
        }

    Fragment shader

        varying vec2 texCoord;

        // geometry
        uniform sampler2D textureDiffuse;
        uniform sampler2D textureNormal;
        uniform sampler2D texturePosition;

        // light
        uniform int lightType;
        uniform vec3 lightPosition;
        uniform vec4 lightAmbient;
        uniform vec4 lightDiffuse;
        uniform vec4 lightSpecular;

        void main (void)
        {
            vec2 flipped = vec2(texCoord.x, 1.0 - texCoord.y); // render texture renders these upside down. This is a fix

            vec4 Idiff;
            vec4 Iamb;
            vec4 Ispec = vec4(0.0, 0.0, 0.0, 0.0);

            vec4 color = texture2D(textureDiffuse, flipped);
            vec3 normal = normalize(texture2D(textureNormal, flipped).xyz);
            vec4 position = texture2D(texturePosition, flipped);
            float shiny = position.a;
            position.a = 1.0;

            vec3 E = normalize(-position.xyz);
            vec3 L;
            vec3 H;
            float NdotHV;

            Iamb = lightAmbient * color;

            if (lightType == 1) // directional
            {
                L = normalize(lightPosition);
                H = normalize(E - L);
                NdotHV = max(dot(normal, H), 0.0);
                Idiff = color * max(dot(normal, L), 0.0);
                //Ispec = lightSpecular * color * pow(NdotHV, shiny);
            }
            else if (lightType == 2) // point light (not done yet.. help?)
            {
            }

            gl_FragColor = Idiff + Iamb + Ispec;
        }
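
    A rough sketch of what the lightType == 2 branch could look like under the same eye-space conventions, written as a stand-alone fragment shader; the attenuation uniforms (lightConstant, lightLinear, lightQuadratic) are assumptions, not part of the shader above, and Idiff is computed unconditionally so it is never left undefined:

        varying vec2 texCoord;

        uniform sampler2D textureDiffuse;
        uniform sampler2D textureNormal;
        uniform sampler2D texturePosition;

        uniform vec3 lightPosition;   // eye-space position of the point light
        uniform vec4 lightAmbient;
        uniform vec4 lightDiffuse;
        uniform float lightConstant;  // assumed attenuation terms
        uniform float lightLinear;
        uniform float lightQuadratic;

        void main (void)
        {
            vec2 flipped = vec2(texCoord.x, 1.0 - texCoord.y);

            vec4 color    = texture2D(textureDiffuse, flipped);
            vec3 normal   = normalize(texture2D(textureNormal, flipped).xyz);
            vec3 position = texture2D(texturePosition, flipped).xyz;

            // per-fragment light vector: from the surface point toward the light
            vec3 toLight = lightPosition - position;
            float dist   = length(toLight);
            vec3 L       = toLight / dist;

            float att   = 1.0 / (lightConstant + lightLinear * dist + lightQuadratic * dist * dist);
            float NdotL = max(dot(normal, L), 0.0);

            vec4 Iamb  = lightAmbient * color;
            vec4 Idiff = lightDiffuse * color * NdotL * att;

            gl_FragColor = Iamb + Idiff;
        }

    The light vector is computed per fragment from the sampled position, so nothing is interpolated across the full-screen quad.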
  5. I'm working with GLSL, and for some reason my diffuse calculation keeps coming out as all 0s...

    frag shader, everything is in eye space:

        uniform vec3 lightPosition; // a directional light
        uniform vec4 lightAmbient;

        // I'm getting this data from a texture
        vec4 color;
        vec3 normal;
        vec3 vertex; // this is the interpolated vertex

        vec3 E = normalize(-vertex.xyz);
        vec3 L = normalize(lightPosition);

        vec4 Idiff = color * max(dot(normal, L), 0.0);
        vec4 Iamb = lightAmbient * color;

        gl_FragColor = Iamb + Idiff;

    But for some reason my scene always shows ONLY the Iamb color. I even tried substituting dot(normal, L) with dot(normal, vec3(0.0, 1.0, 0.0)) just to see what would happen, and that seems to work, but that only gives the light direction in relation to the eye orientation. Every frame I update lightPosition by multiplying the camera's view matrix by the light's position and then sending THAT vector as the position in eye space. I know this works because on the CPU side I have my light set at (0, 1, 0), and when I point the camera straight up the new value is (0, 0, -1), which is the light in eye space, so that's right. It has to be something in the shader.

    [Edited by - BloodLust666 on April 28, 2010 12:41:45 AM]
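
    One way to narrow this kind of problem down is to write the intermediate terms straight to the screen and see which one collapses to zero; a small debugging sketch, with textureNormal standing in for however the normal is actually fetched:

        varying vec2 texCoord;

        uniform sampler2D textureNormal;   // placeholder name for the normal buffer
        uniform vec3 lightPosition;        // the same eye-space light direction

        void main (void)
        {
            vec3 N = normalize(texture2D(textureNormal, texCoord).xyz);
            vec3 L = normalize(lightPosition);
            float NdotL = max(dot(N, L), 0.0);

            // pick one output at a time:
            // gl_FragColor = vec4(N * 0.5 + 0.5, 1.0);   // do the stored normals look right?
            // gl_FragColor = vec4(L * 0.5 + 0.5, 1.0);   // is the uniform actually arriving?
            gl_FragColor = vec4(vec3(NdotL), 1.0);        // is the diffuse term itself zero?
        }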
  6. BloodLust666

    GLSL light in world coordinates

    Maybe I should pass in the camera's orientation matrix and multiply that by the lightPosition?
  7. BloodLust666

    GLSL light in world coordinates

    Shouldn't I multiply the lightPosition by the NormalMatrix? I just noticed that when I render the normal buffer and move the camera around, all the normals that point to the right are red, the ones pointing up are green, and the ones pointing towards the camera are blue. When I rotate the camera, those colors change according to the new orientation of the camera. Doesn't that mean the light position needs to do the same? Maybe I'm confusing the two: the position buffer stores all the positions in world coordinates, but the normal buffer has all the normals in view-space coordinates... Am I getting that right? If that's correct, how do I fix my light to match?
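
    On the space question: gl_NormalMatrix and gl_ModelViewMatrix both include the camera's view transform, so both the normal buffer and a position buffer produced this way end up in eye (view) space rather than world space, and the light has to be brought into that same space. An annotated sketch of those two transforms:

        varying vec3 normal;
        varying vec4 position;

        void main (void)
        {
            // gl_NormalMatrix is the inverse-transpose of the upper 3x3 of the
            // ModelView matrix, so this normal is in EYE space, not world space.
            normal = gl_NormalMatrix * gl_Normal;

            // ModelView = View * Model, so this position is in EYE space as well.
            position = gl_ModelViewMatrix * gl_Vertex;

            gl_Position = ftransform();
        }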
  8. BloodLust666

    GLSL light in world coordinates

    Well, the thing about my lights is that when I'm rendering with them, I'm rendering a quad to a texture and also reading the geometry and normal data from a texture, so any interpolation happens across the quad, which does no good. I need to calculate the light direction on the fragment side, because that is where the true normals and positions for every pixel live.
  9. BloodLust666

    GLSL light in world coordinates

    I'm pretty sure they're in world coordinates. In the vertex shader for the geometry pass:

        normal = gl_NormalMatrix * gl_Normal;
        position = gl_ModelViewMatrix * gl_Vertex;

    Then, when that gets interpolated in the frag shader, I just store it in the buffer.
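
    For comparison, a sketch of what the geometry pass would look like if the buffers really were meant to hold world-space data; it needs the object's model matrix as an explicit uniform (the modelMatrix name here is hypothetical), because the built-in gl_ModelViewMatrix already has the camera folded in:

        uniform mat4 modelMatrix;   // object-to-world transform (hypothetical uniform)

        varying vec3 normal;
        varying vec4 position;

        void main (void)
        {
            // world-space normal; mat3() keeps only the rotation/scale part
            // (assumes uniform scaling, otherwise use the inverse-transpose)
            normal = mat3(modelMatrix) * gl_Normal;

            // world-space position
            position = modelMatrix * gl_Vertex;

            gl_Position = ftransform();
        }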
  10. BloodLust666

    GLSL light in world coordinates

    That didn't work either... I originally had the lightPos uniform in the frag part. Let me step back a bit: what I have is a deferred shader, and I'm reading from a texture what the position and normals are at each pixel. Is there a certain format I have to put the light into in order for it to interact correctly with these coordinates?
  11. How do I convert a light's position into world coordinates so that when I move the camera around, the light's direction doesn't move with it? So far I have:

        varying vec3 lightPosition;
        uniform vec3 lightPos;

        void main (void)
        {
            texCoord = gl_MultiTexCoord0.xy;
            lightPosition = gl_ModelViewMatrix * vec4(lightPos, 0.0);
            gl_Position = ftransform();
        }

    and:

        varying vec3 lightPosition;

        void main (void)
        {
            //...
            lightDir = normalize(lightPosition);
            Idiff = max(dot(normal, lightDir), lightAmb);
            gl_FragColor = color * Idiff;
        }
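
    One thing the snippet above can trip over in a deferred setup: gl_ModelViewMatrix in the light pass belongs to the full-screen quad being drawn, not to the scene camera that produced the G-buffer. A minimal sketch that passes the scene camera's view matrix explicitly (cameraView is a hypothetical uniform) and keeps w at 0.0 so the value is treated as a direction:

        uniform mat4 cameraView;   // the scene camera's view matrix (hypothetical uniform)
        uniform vec3 lightPos;     // world-space light direction

        varying vec2 texCoord;
        varying vec3 lightPosition;

        void main (void)
        {
            texCoord = gl_MultiTexCoord0.xy;

            // use the scene camera's view matrix, not the quad's gl_ModelViewMatrix,
            // and w = 0.0 so only the rotation is applied to the direction
            lightPosition = (cameraView * vec4(lightPos, 0.0)).xyz;

            gl_Position = ftransform();
        }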
  12. BloodLust666

    multiple FBOs bug

    Ummmm, so I definitely have this working now... NO idea why it didn't work before. All I did was create a VERY simple shader that basically just passed the same data along, and now it works perfectly. Any idea why this is the case? Or should I just conclude that the fixed pipeline sucks? heh :P
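
    For reference, a pass-through pair along the lines described, as a minimal sketch rather than the engine's actual shader; it just forwards the position and first texture coordinate and samples a single texture:

        // vertex shader
        varying vec2 texCoord;

        void main (void)
        {
            texCoord = gl_MultiTexCoord0.xy;
            gl_Position = ftransform();
        }

        // fragment shader
        varying vec2 texCoord;

        uniform sampler2D texture0;   // whatever texture the material binds

        void main (void)
        {
            gl_FragColor = texture2D(texture0, texCoord);
        }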
  13. BloodLust666

    Deferred Lighting 2nd Pass opinion

    I've gotten everything to work up to the light accumulation point. It seems that when I set the blend mode to (one, one), some of my geometry looks as if it is transparent, and I can see geometry that is supposed to be behind other geometry. Am I right in using (one, one)? Or is it something else? Here's my light accumulation shader:

        varying vec2 texCoord;

        // geometry
        uniform sampler2D textureDiffuse;
        uniform sampler2D textureNormal;
        uniform sampler2D texturePosition;

        // light
        uniform int lightType;
        uniform vec3 lightPos;
        uniform vec4 lightAmbient;
        uniform vec4 lightDiffuse;
        uniform float lightConstant;
        uniform float lightLinear;
        uniform float lightQuadradic;
        uniform float lightRadius;

        // camera
        uniform vec3 cameraPosition;

        vec3 lightDir;
        vec3 lightDist;
        float Idiff = 0.0;
        vec4 Iamb;

        void main (void)
        {
            // flip
            vec2 flipped = texCoord;
            flipped.y = 1.0 - flipped.y;

            vec4 color = texture2D(textureDiffuse, flipped);
            vec3 normal = normalize(texture2D(textureNormal, flipped).xyz);
            vec3 position = texture2D(texturePosition, flipped).xyz;

            vec3 eyeDir = normalize(-position);

            Iamb = lightAmbient * color;

            if (lightType == 1) // directional
            {
                lightDir = normalize(lightPos);
                Idiff = max(dot(normal, lightDir), 0.0);
            }
            else if (lightType == 2) // positional
            {
                lightDist = lightPos - position;
                if (length(lightDist) < lightRadius)
                    Idiff = lightDiffuse / (lightConstant + lightLinear*lightDist + lightQuadradic*lightDist*lightDist);
            }

            gl_FragColor = Idiff*color + Iamb;
        }

    Maybe my shader is wrong?
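
    On the shader itself (leaving the blend-mode question aside), the positional branch mixes types: lightDist is a vec3 fed into scalar attenuation math, and Idiff is declared float but assigned a vec4. A sketch of just that branch using the scalar distance, assuming Idiff is declared as vec4 and that lightPos is in the same space as the position buffer:

        else if (lightType == 2)   // positional
        {
            vec3 toLight = lightPos - position;
            float dist   = length(toLight);    // scalar distance, not the vec3 offset

            if (dist < lightRadius)
            {
                vec3 L = toLight / dist;
                float att = 1.0 / (lightConstant + lightLinear * dist
                                   + lightQuadradic * dist * dist);
                Idiff = lightDiffuse * max(dot(normal, L), 0.0) * att;   // needs Idiff as vec4
            }
        }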
  14. BloodLust666

    multiple FBOs bug

        foreach RenderTarget
            glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, m_bufferID); // m_bufferID is 0 if window
            foreach Camera rendering to this target
                glViewport(m_viewport.left, m_viewport.top, m_viewport.width, m_viewport.height);
                glScissor(m_viewport.left, m_viewport.top, m_viewport.width, m_viewport.height);
                glClearColor(m_clearColor.x, m_clearColor.y, m_clearColor.z, m_clearColor.w);
                glClearDepth(1.0f);
                glClearStencil(0);
                glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
                RenderScene
                    foreach Object
                        foreach Material attached to object
                            glBindTexture(GL_TEXTURE_2D, m_id); // some of these could be render targets
                            render VBO
            glDrawBuffers(m_bufferList.size(), &m_bufferList[0]);

    I'm about 99% certain my FBOs are set up correctly, because when I use only two render targets (the window and a render texture) everything looks fine: the scene is rendered to the texture, then the texture is rendered to the screen just fine. As soon as I redirect that texture to render to another texture, and then have THAT second texture render to the screen, nothing shows up except for the first frame.
  15. BloodLust666

    multiple FBOs bug

    Oh oops, sorry, can this be moved to OpenGL please :) thanks. There's a lot of actual code though, and a lot of OO-ness... Maybe pseudocode could work?

        foreach RenderTarget
            bindTarget
            foreach Camera rendering to this target
                SetViewport
                glClear
                RenderScene
                    foreach Object
                        foreach Material attached to object
                            bindTexture
                            render VBO
            glDrawBuffers

    That's essentially how my engine is structured.