GLSL - Converting Normals from View to Worldspace

4 comments, last by OandO 12 years, 6 months ago
I have the normals for my scene stored in viewspace in a texture. I need to convert them to worldspace in a fragment shader in order to do lighting calculations. Here's an extract from near the beginning of my fragment shader: http://pastebin.com/K95WwU3S

"unorm" is the normal in viewspace; 3 floats in the range -1.0 to 1.0. I've tried outputting unorm to the screen and it appears to be correct, however "transnorm", which should be the normals in worldspace just looks like the inverse of these 3 values, which is not correct. I thought that it might be a problem with the inverse modelview matrix, but both "cameraPos" and "worldSpacePos" output the correct values, so i really have no idea what's going on here.
If you want the normal in worldspace, don't you just multiply the viewspace normal by the inverse of the view matrix?
The modelview matrix is the world and view matrices combined, right? If you take the inverse of the modelview matrix you put the normal back into object space, not world space.
Why not just do the lighting calculations in view space? Transform the light position and/or direction into view space on the CPU before sending them to the GPU, and perform the calculations as normal. It even has the nice property that the camera is located at [0,0,0] in view space, so you don't need to send the camera position to the GPU.
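A sketch of the view-space approach described above (the uniform names here are illustrative, not from the original code):

```glsl
// Diffuse lighting done entirely in view space. "lightPosView" is the
// light position already transformed into view space on the CPU
// (i.e. viewMatrix * lightPosWorld) before being uploaded.
uniform vec3 lightPosView;

float diffuse(vec3 normalView, vec3 fragPosView)
{
    vec3 L = normalize(lightPosView - fragPosView);
    return max(dot(normalize(normalView), L), 0.0);
    // The camera sits at the origin in view space, so the view vector
    // for a specular term is simply normalize(-fragPosView).
}
```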
I suppose. I find it easier to get my head around things being in worldspace, so I wanted to start out doing it that way just for simplicity's sake.

I forgot to mention earlier, I got this working a while ago, but part of my project got corrupted and I ended up having to rewrite a sizeable portion of the renderer. I made a very slight alteration to how I was setting matrices, but I don't think that would cause that single line to malfunction while the rest of the shader works as it should.
Right, I'm going to need a bit of a hand with this. I've got the light position in viewspace (lightPos1), the pixel position in viewspace (viewSpacePos), and the normal of that pixel also in viewspace (unorm).

dot(unorm, normalize(lightPos1.xyz-viewSpacePos))

This is essentially how I'd been calculating the diffuse light on a surface due to its normal, except with worldspace coordinates instead. I'm guessing something about this method doesn't translate well to viewspace, because the lighting falling on a surface changes as I move the camera. I'm using both lightPos1 and viewSpacePos to calculate distance fall-off, which appears to be correct, and unorm also seems to be correct.
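The dot product itself is space-agnostic; it gives the same result in world or view space provided every vector is in the same space. One common cause of lighting that shifts with the camera is mixing spaces, e.g. a light position still in world space combined with a view-space fragment position. A hedged sketch of keeping everything consistent (uniform names are assumptions for illustration):

```glsl
// All three vectors must be in the SAME space before the dot product.
uniform mat4 viewMatrix;   // world -> view (illustrative name)
uniform vec4 lightPosWorld;

float diffuseViewSpace(vec3 unorm, vec3 viewSpacePos)
{
    // Bring the light into view space before subtracting positions:
    vec3 lightPosView = (viewMatrix * vec4(lightPosWorld.xyz, 1.0)).xyz;
    vec3 L = normalize(lightPosView - viewSpacePos);
    return max(dot(normalize(unorm), L), 0.0); // clamp back-facing light
}
```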
In the hope of getting an answer to this, because I'm thoroughly confused now, I've made some comments on the relevant bits of code and removed the irrelevant parts.

http://pastebin.com/WqLnAn8N

lightOrigin1.w and lightColour1.w are uniform intensity values which do pretty much the same thing at the moment and aren't particularly important. At the moment the diffuse lighting appears brighter when I'm looking away from the light source, and darker when I look towards it.

This topic is closed to new replies.
