Manabreak

Member Since 16 Nov 2011
Offline Last Active Jan 12 2014 07:11 AM

Topics I've Started

Gamedev-related tutorials

25 December 2013 - 07:47 AM

Hi! Just wanted to let you know that I'm writing some game-development-related tutorials. I have a couple of them up already, and I'm writing new ones whenever I have the time. :)

 

Here's my tutorial blog: http://mana-break.blogspot.com/


Cannot get uniform location from GLSL shader

01 September 2012 - 08:02 AM

Hey there, I've got a little problem with my shaders. I have two shaders: the first one populates an FBO with three textures (color, depth and normal data). The second shader should receive these textures as parameters, but I cannot get the uniform location - it just returns -1 when I try to fetch it. The first shader works nicely and I can use the textures after the draw calls are done, but I can't get the second shader to work. It compiles cleanly, and no errors are raised.

Here's my "geometry pass":
// Load the shader and set the camera matrices
GL.UseProgram(geometryShaderId);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "view"), false, ref Engine.cameras[0].modelview);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "projection"), false, ref Engine.cameras[0].projection);

// Bind the frame buffer
GL.BindFramebuffer(FramebufferTarget.DrawFramebuffer, fboId);

// Enable depth stuff
GL.Enable(EnableCap.DepthTest);
GL.DepthMask(true);
GL.Disable(EnableCap.Blend);

// Clear
GL.ClearColor(0f, 0f, 0f, 1f);
GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

// Render the geometry
RenderScene(e);

// Clean up the geometry pass
GL.UseProgram(0);
GL.DepthMask(false);
GL.Disable(EnableCap.DepthTest);
GL.BindFramebuffer(FramebufferTarget.FramebufferExt, 0);

And here's the "lighting" pass, or rather, the part that actually misbehaves:
GL.UseProgram(pointLightShaderId);
GL.BindFramebuffer(FramebufferTarget.FramebufferExt, fboId);
	
// Set the textures for the fragment shader
GL.Uniform1(GL.GetUniformLocation(pointLightShaderId, "colorMap"), 0); // <-- Location is -1
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, gbuffer.DiffuseTexture);
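To figure out what's going on, I've also been dumping the uniforms that actually survive linking. This is just a diagnostic sketch I bolted on (pointLightShaderId as above), not part of the real render loop:

// List every uniform the linker kept; if "colorMap" doesn't show up here,
// GetUniformLocation will keep returning -1 for it.
int uniformCount;
GL.GetProgram(pointLightShaderId, ProgramParameter.ActiveUniforms, out uniformCount);
for(int i = 0; i < uniformCount; ++i)
{
    int size;
    ActiveUniformType type;
    string name = GL.GetActiveUniform(pointLightShaderId, i, out size, out type);
    Console.WriteLine("Active uniform " + i + ": " + name + " (" + type + ")");
}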

The lighting shader is initialized like this:
// Load shader files
string vText = File.ReadAllText("Resources/Shaders/pointlight_vertex.glsl");
string fText = File.ReadAllText("Resources/Shaders/pointlight_fragment.glsl");
  
// Create shader IDs
vertexId = GL.CreateShader(ShaderType.VertexShader);
fragmentId = GL.CreateShader(ShaderType.FragmentShader);
  
// Set shader sources
GL.ShaderSource(vertexId, vText);
GL.ShaderSource(fragmentId, fText);
  
// Compile vertex shader
GL.CompileShader(vertexId);
string vertexLog = GL.GetShaderInfoLog(vertexId);
int vertexStatusCode;
GL.GetShader (vertexId, ShaderParameter.CompileStatus, out vertexStatusCode);
if(vertexStatusCode != 1) Console.WriteLine("Error creating vertex shader: " + vertexLog);
  
// Compile fragment shader
GL.CompileShader(fragmentId);
string fragmentLog = GL.GetShaderInfoLog(fragmentId);
int fragmentStatusCode;
GL.GetShader(fragmentId, ShaderParameter.CompileStatus, out fragmentStatusCode);
if(fragmentStatusCode != 1) Console.WriteLine("Error creating fragment shader: " + fragmentLog);
  
// Create the shader program
shaderId = GL.CreateProgram();
GL.AttachShader(shaderId, vertexId);
GL.AttachShader(shaderId, fragmentId);
  
GL.BindAttribLocation(shaderId, 0, "in_position");
GL.BindFragDataLocation(shaderId, 0, "out_light");
  
GL.LinkProgram(shaderId);
  
int programStatusCode;
GL.GetProgram(shaderId, ProgramParameter.ValidateStatus, out programStatusCode);
if(programStatusCode == (int)All.False)
{
	  Console.WriteLine("Error creating shader program!");
	  Console.WriteLine(GL.GetProgramInfoLog(shaderId));
}
  
GL.ValidateProgram(shaderId);
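As a side note, the status check above reads ValidateStatus even though GL.ValidateProgram is only called afterwards; the variant I've seen in other samples checks LinkStatus right after linking, roughly like this sketch (shaderId as above):

// Check the link result itself; ValidateStatus is only meaningful
// once GL.ValidateProgram has been called.
int linkStatus;
GL.GetProgram(shaderId, ProgramParameter.LinkStatus, out linkStatus);
if(linkStatus == (int)All.False)
{
    Console.WriteLine("Error linking shader program!");
    Console.WriteLine(GL.GetProgramInfoLog(shaderId));
}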


... And finally, here are the lighting shaders (stubs for now). Here's the vertex shader:
#version 330


uniform mat4 projection;
uniform mat4 view;
uniform mat4 world;

layout(location = 0) in vec3 in_position;


out vec3 worldpos;
out vec4 screenpos;

void main(void)		
{  
   gl_Position = projection * view * world * vec4(0.0, 0.0, 0.0, 1.0);
   worldpos = gl_Position.xyz;
   screenpos = vec4(worldpos, 1.0);
}

... and here's the fragment shader:
#version 330


uniform sampler2D colorMap;

in vec3 worldpos;
in vec4 screenpos;
out vec4 out_light;

void main(void)		
{
   // Just try to paint with magenta, though it doesn't happen :(
   out_light = vec4(1.0, 0.0, 1.0, 1.0);
}

Duplicating vertices via indices causes weirdness [SOLVED]

29 August 2012 - 09:19 AM

Okay, this is kind of a sequel to my last post: ( http://www.gamedev.n...-using-vboglsl/ )

Basically, I'm trying to create the needed vertices from a set of base vertices, indices and UV coordinates. I just can't get it right. Let's take a simple cube for an example: I have 8 base vertex coordinates (read from a FBX file), 36 indices for triangles and 36 UV coordinates, one for each index. Now, I need a new set of vertices with the UV coordinates mapped to them.

Let's assume I have vertex coordinates like this:
(1, 1, -1)
(1, -1, -1)
(-1, -1, -1)
(-1, 1, -1)
(1, 1, 1)
(1, -1, 1)
(-1, -1, 1)
(-1, 1, 1)

And indices list like this:
4,0,3
4,3,7
2,6,7
2,7,3
1,5,2
5,6,2
0,4,1
4,5,1
4,7,5
7,6,5
0,1,2
0,2,3

I tried a simple code like this:
List<Vertex> tempVerts = new List<Vertex>();
for(int i = 0; i < indices.Count; ++i)
{
    tempVerts.Add(vertices[indices[i]]);  // 'vertices' is the list of the eight original vertices
}
vertices = tempVerts; // override the original list with the newly made one

... but it causes a hell of a mess:
[Attached image: cube_prob_2.jpg]

... while this is how it SHOULD be:
[Attached image: cube_correct.jpg]

I'm losing my mind here, because I KNOW it's a simple thing, yet I can't wrap my head around it. Please help me before I go bald from pulling my hair out! Also, someone told me last time that there should be 24 unique vertices in the final version, but I get 36. Huh?
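Just so it's clear what I'm after, here's roughly the expansion I have in mind (only a sketch with made-up names: 'uvs' would be the list of 36 per-index UV coordinates and 'UV' the struct field for them; this isn't my actual code):

// Build one vertex per index, pairing each position with the UV that
// belongs to that index; the index list then simply becomes 0..35.
List<Vertex> expanded = new List<Vertex>();
List<int> newIndices = new List<int>();
for(int i = 0; i < indices.Count; ++i)
{
    Vertex v = vertices[indices[i]]; // copy of the original position (Vertex is a struct)
    v.UV = uvs[i];                   // the UV coordinate stored for this index
    expanded.Add(v);
    newIndices.Add(i);
}
vertices = expanded;
indices = newIndices;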

Confused about vertices, indices and tex coords when using VBO/GLSL

28 August 2012 - 07:14 AM

Hey there,

I have some problems with texturing. Let me set up the scenario: I have a custom-written .FBX importer which reads vertices, indices and UV coordinates from a file. If I have a cube, it'll have 8 vertices, 36 indices and 36 UV coordinates, as each vertex is used for three faces.

Now, the geometry is represented nice and dandy, and when I trace the UV coordinates they appear to be correct, but when I try to draw the cube with the texture, the texture is mapped in a nonsensical fashion. I cannot find the cause of this problem, and after a week of frustration, I decided to ask you guys.

I use a struct for the vertex data, which has three properties (position: float x 3, normal: float x 3, uv: float x 2). I end up having eight of these, which is correct. Then I have an integer list which keeps track of the indices, going like 0, 1, 2, 1, 2, 3 etc., you know the drill. Now, the hardest thing for me to understand is: how does OpenGL know which UV coordinate should be used, when there are only eight vertices but 36 UV coordinates? I have set up a VBO which points to all the vertex properties, and finally I draw the cube using a DrawElements call. I use a GLSL shader to texture the cube, and I pass the texture and the texture coordinates to the shader correctly (at least I assume so, because the texture shows up on the cube).
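For context, the VBO attribute setup is roughly along these lines (a sketch, assuming the interleaved position/normal/uv struct described above; 'vboId' and the offsets are illustrative, my real code may differ slightly):

// Interleaved layout: 3 floats position + 3 floats normal + 2 floats UV = 32 bytes per vertex.
// Note that each vertex entry carries exactly one UV - there is no separate per-index UV stream.
int stride = 8 * sizeof(float);
GL.BindBuffer(BufferTarget.ArrayBuffer, vboId);
GL.EnableVertexAttribArray(0);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, stride, 0);
GL.EnableVertexAttribArray(1);
GL.VertexAttribPointer(1, 3, VertexAttribPointerType.Float, false, stride, 3 * sizeof(float));
GL.EnableVertexAttribArray(2);
GL.VertexAttribPointer(2, 2, VertexAttribPointerType.Float, false, stride, 6 * sizeof(float));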

Where could the problem be? This is what it should look like:
[Attached image: cube_correct.jpg]

... and this is what it actually looks like:
[Attached image: cube_problem.jpg]

As you can see, some coordinates (the top) are correct, while the sides are somewhat "skewed". I haven't made any changes to my importer, and the coordinates worked just fine before I changed my rendering code to use shaders. The shader code looks like this:

[source lang="plain"]VERTEX SHADER:#version 330uniform mat4 projection;uniform mat4 view;uniform mat4 world;layout(location = 0) in vec3 in_position;layout(location = 1) in vec3 in_normal;layout(location = 2) in vec2 in_texcoord;out vec4 worldpos;out vec3 normal;out vec2 texcoord;void main(void){ texcoord = in_texcoord; normal = in_normal; worldpos = projection * view * world * vec4(in_position, 1.0);}FRAGMENT SHADER:#version 330in vec4 worldpos;in vec3 normal;in vec2 texcoord;out vec3 out_diffuse;uniform sampler2D colorMap;void main(void){ out_diffuse = texture(colorMap, texcoord).xyz;}[/source]

Fragment output to specific color attachment? (SOLVED)

18 August 2012 - 09:23 AM

Heya,

I've been crawling through GLSL and MRT tutorials for a few days, and everything's going nicely. Now I have an FBO with two textures bound to it, one to ColorAttachment0 and one to ColorAttachment1. I want the fragment shader to write the diffuse (color) data to the first texture and the surface normals to the second one. How is this achieved?

I've done this in HLSL, where I was able to specify an output variable with COLOR0, COLOR1 etc., but I haven't found anything similar in GLSL.
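The closest thing I've found in GLSL so far is giving each fragment output an explicit location, which should map it to the matching entry of the draw-buffers list set up with GL.DrawBuffers (a sketch, assuming #version 330; I'm not yet sure this is the whole story):

#version 330

uniform sampler2D colorMap;

in vec2 texcoord;
in vec3 normal;

// location 0 should end up in the first draw buffer (ColorAttachment0),
// location 1 in the second one (ColorAttachment1).
layout(location = 0) out vec4 out_diffuse;
layout(location = 1) out vec4 out_normal;

void main(void)
{
    out_diffuse = texture(colorMap, texcoord);
    out_normal  = vec4(normalize(normal) * 0.5 + 0.5, 1.0);
}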
