Hey there, I've got a little problem with my shaders. I have two shaders: the first one populates an FBO with three textures holding the color, depth and normal data. The second shader should receive these textures as uniforms, but I cannot get the uniform location; GL.GetUniformLocation just returns -1. The first shader works nicely and I can use its textures after the draw calls are done, but I can't get the second shader to work. It compiles cleanly and no errors are raised.
Here's my "geometry pass":
// Load the shader and set the camera matrices
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "view"), false, ref Engine.cameras.modelview);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "projection"), false, ref Engine.cameras.projection);
// Bind the frame buffer
// Enable depth stuff
GL.ClearColor(0f, 0f, 0f, 1f);
GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);
// Render the geometry
// Clean up the geometry pass
And here's the "lighting" pass, or rather the part that's actually buggy:
// Set the textures for the fragment shader
GL.Uniform1(GL.GetUniformLocation(pointLightShaderId, "colorMap"), 0); // <-- Location is -1
uniform sampler2D colorMap;

in vec3 worldpos;
in vec4 screenpos;

out vec4 out_light;

void main()
{
    // Just try to paint with magenta, though it doesn't happen :(
    out_light = vec4(1.0, 0.0, 1.0, 1.0);
}
Basically, I'm trying to create the needed vertices from a set of base vertices, indices and UV coordinates, and I just can't get it right. Take a simple cube as an example: I have 8 base vertex coordinates (read from an FBX file), 36 indices for the triangles and 36 UV coordinates, one for each index. Now I need a new set of vertices with the UV coordinates mapped onto them.
List<Vertex> tempVerts = new List<Vertex>();
for (int i = 0; i < indices.Count; ++i)
{
    Vertex v = vertices[indices[i]]; // 'vertices' is the list of the eight original vertices; Vertex is a struct, so this makes a copy
    v.uv = uvs[i];                   // attach the UV that belongs to this index ('uvs' is the 36-entry UV list)
    tempVerts.Add(v);
}
vertices = tempVerts; // override the original list with the newly made one
I'm losing my head here, because I KNOW it's a simple thing, yet I can't wrap my mind around it. Please help me before I go bald from pulling my hair! Also, someone told me last time that there should be 24 unique vertices in the final version, but I get 36. Huh?
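The unwelding step and the 36-vs-24 discrepancy can be sketched like this (Python for brevity; the cube data below is invented for illustration, and the atlas-style UVs just stand in for whatever the FBX file actually provides):

```python
# One vertex per index, then merge vertices that are fully identical
# (same position AND same UV). 36 expanded vertices collapse to 24
# unique ones for a cube, because each face's two triangles share
# two corners with identical UVs.

corners = [(x, y, z) for z in (0, 1) for y in (0, 1) for x in (0, 1)]

# Six cube faces as quads (indices into 'corners'), each split into
# two triangles below.
quads = [(0, 1, 3, 2), (4, 5, 7, 6), (0, 1, 5, 4),
         (2, 3, 7, 6), (0, 2, 6, 4), (1, 3, 7, 5)]

indices, uvs = [], []
for f, (a, b, c, d) in enumerate(quads):
    indices += [a, b, c, a, c, d]
    # Atlas-style UVs: each face gets its own horizontal strip, so a
    # corner shared by three faces still ends up with face-specific UVs.
    uvs += [(f, 0), (f + 1, 0), (f + 1, 1),
            (f, 0), (f + 1, 1), (f, 1)]

# Unweld: one vertex per index, carrying the matching per-index UV.
expanded = [(corners[i], uv) for i, uv in zip(indices, uvs)]
print(len(expanded))   # 36 vertices, one per index

# Merge duplicates: vertices sharing position AND UV collapse into one.
unique = set(expanded)
print(len(unique))     # 24 unique vertices for a cube
```

The 36-entry version already renders correctly; deduplicating to 24 (and rebuilding the index list to point at the merged vertices) is purely a memory optimization.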
I have some problems with texturing. Let me set up the scenario: I have a custom-written .FBX importer which reads vertices, indices and UV coordinates from a file. If I have a cube, it'll have 8 vertices, 36 indices and 36 UV coordinates, as each vertex is used for three faces.
Now, the geometry itself renders fine, and when I trace the UV coordinates they appear to be correct, but when I draw the cube with a texture, the texture is mapped in a nonsensical fashion. I can't find the cause of this problem, and after a week of frustration I decided to ask you guys.
I use a struct for the vertex data, which has three fields (position: float x 3, normal: float x 3, uv: float x 2). I end up with eight of these, which is correct. Then I have an integer list that keeps track of the indices, going 0, 1, 2, 1, 2, 3 and so on, you know the drill. Now, the hardest thing for me to understand is this: how does OpenGL know which UV coordinate should be used when there are only eight vertices but 36 UV coordinates? I have set up a VBO which points to all the vertex attributes, and finally I draw the cube with a DrawElements call. I use a GLSL shader to texture the cube, and I pass the texture and the texture coordinates to the shader correctly (at least I assume so, because the texture shows up on the cube).
Where could the problem be? This is how it should look: cube_correct.jpg
... and this is what it actually looks like: cube_problem.jpg
As you can see, some coordinates (on the top) are correct, while the sides are somewhat "skewed". I haven't changed my importer at all, and the coordinates worked just fine before I switched my rendering code over to shaders. The shader code is like this:
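Roughly, a minimal version of the shaders I use looks like this (a simplified sketch with placeholder names, not my exact code):

```glsl
// --- vertex shader ---
#version 330
uniform mat4 modelviewProjection;
in vec3 in_position;
in vec2 in_uv;
out vec2 uv;
void main()
{
    uv = in_uv;   // pass the per-vertex UV through to the fragment stage
    gl_Position = modelviewProjection * vec4(in_position, 1.0);
}

// --- fragment shader ---
#version 330
uniform sampler2D colorMap;
in vec2 uv;       // interpolated across the triangle
out vec4 out_color;
void main()
{
    out_color = texture(colorMap, uv);
}
```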
I've been crawling through GLSL and MRT tutorials for a few days, and everything's going nicely. Now I have an FBO with two textures bound to it, at ColorAttachment0 and ColorAttachment1. I want to render the diffuse (color) data into the first texture and the surface normals into the second one. How is this achieved?
I've done this in HLSL, where I could declare output variables with the COLOR0, COLOR1 etc. semantics, but I haven't found anything similar in GLSL.
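The GLSL equivalent of those HLSL semantics, as far as I understand it, is to declare one out variable per attachment with an explicit location (the variable names here are just placeholders):

```glsl
#version 330
// Fragment shader writing to two color attachments at once.
// 'location' selects the draw buffer: 0 -> ColorAttachment0,
// 1 -> ColorAttachment1 (assuming GL.DrawBuffers was called with
// both attachments listed).
layout(location = 0) out vec4 out_diffuse;
layout(location = 1) out vec4 out_normal;

in vec3 normal;             // hypothetical inputs
in vec2 uv;
uniform sampler2D diffuseMap;

void main()
{
    out_diffuse = texture(diffuseMap, uv);
    out_normal  = vec4(normalize(normal) * 0.5 + 0.5, 1.0);
}
```

In older GLSL (before 1.30) the built-in gl_FragData[0] / gl_FragData[1] array serves the same purpose. Either way, the FBO side also needs a DrawBuffers call listing both color attachments, or only the first one receives output.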