About HarriPellikka

  1. HarriPellikka

    Gamedev-related tutorials

    Hi! Just wanted to let you know that I'm writing some game development-related tutorials. I have a couple of them up already, and I'm writing new ones whenever I have the time. :)   Here's my tutorial blog: http://mana-break.blogspot.com/
  2. HarriPellikka

    Cannot get uniform location from GLSL shader

    Ah, that was it. Thanks!
  3. Hey there, I've got a little problem with my shaders. I have two shaders: the first one populates an FBO with three textures holding the color, depth and normal data. The second shader should receive these textures as parameters, but I cannot get the uniform location; it just returns -1 when I try to get it. The first shader works nicely and I can use the textures after the draw calls are done, but I can't get the second shader to work. It does compile nicely, and no errors are raised. Here's my "geometry pass":

[source lang="plain"]
// Load the shader and set the camera matrices
GL.UseProgram(geometryShaderId);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "view"), false, ref Engine.cameras[0].modelview);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "projection"), false, ref Engine.cameras[0].projection);

// Bind the frame buffer
GL.BindFramebuffer(FramebufferTarget.DrawFramebuffer, fboId);

// Enable depth stuff
GL.Enable(EnableCap.DepthTest);
GL.DepthMask(true);
GL.Disable(EnableCap.Blend);

// Clear
GL.ClearColor(0f, 0f, 0f, 1f);
GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

// Render the geometry
RenderScene(e);

// Clean up the geometry pass
GL.UseProgram(0);
GL.DepthMask(false);
GL.Disable(EnableCap.DepthTest);
GL.BindFramebuffer(FramebufferTarget.FramebufferExt, 0);
[/source]

And here's the "lighting" pass, or rather the part that actually misbehaves:

[source lang="plain"]
GL.UseProgram(pointLightShaderId);
GL.BindFramebuffer(FramebufferTarget.FramebufferExt, fboId);

// Set the textures for the fragment shader
GL.Uniform1(GL.GetUniformLocation(pointLightShaderId, "colorMap"), 0); // <-- Location is -1
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, gbuffer.DiffuseTexture);
[/source]

The lighting shader is initialized like this:

[source lang="plain"]
// Load shader files
string vText = File.ReadAllText("Resources/Shaders/pointlight_vertex.glsl");
string fText = File.ReadAllText("Resources/Shaders/pointlight_fragment.glsl");

// Create shader IDs
vertexId = GL.CreateShader(ShaderType.VertexShader);
fragmentId = GL.CreateShader(ShaderType.FragmentShader);

// Set shader sources
GL.ShaderSource(vertexId, vText);
GL.ShaderSource(fragmentId, fText);

// Compile vertex shader
GL.CompileShader(vertexId);
string vertexLog = GL.GetShaderInfoLog(vertexId);
int vertexStatusCode;
GL.GetShader(vertexId, ShaderParameter.CompileStatus, out vertexStatusCode);
if(vertexStatusCode != 1)
    Console.WriteLine("Error creating vertex shader: " + vertexLog);

// Compile fragment shader
GL.CompileShader(fragmentId);
string fragmentLog = GL.GetShaderInfoLog(fragmentId);
int fragmentStatusCode;
GL.GetShader(fragmentId, ShaderParameter.CompileStatus, out fragmentStatusCode);
if(fragmentStatusCode != 1)
    Console.WriteLine("Error creating fragment shader: " + fragmentLog);

// Create the shader program
shaderId = GL.CreateProgram();
GL.AttachShader(shaderId, vertexId);
GL.AttachShader(shaderId, fragmentId);
GL.BindAttribLocation(shaderId, 0, "in_position");
GL.BindFragDataLocation(shaderId, 0, "out_light");
GL.LinkProgram(shaderId);
int programStatusCode;
GL.GetProgram(shaderId, ProgramParameter.ValidateStatus, out programStatusCode);
if(programStatusCode == (int)All.False)
{
    Console.WriteLine("Error creating shader program!");
    Console.WriteLine(GL.GetProgramInfoLog(shaderId));
}
GL.ValidateProgram(shaderId);
...
[/source]

And finally, here are the lighting shaders (stubs for now). The vertex shader:

[source lang="plain"]
#version 330

uniform mat4 projection;
uniform mat4 view;
uniform mat4 world;

layout(location = 0) in vec3 in_position;

out vec3 worldpos;
out vec4 screenpos;

void main(void)
{
    gl_Position = projection * view * world * vec4(0.0, 0.0, 0.0, 1.0);
    worldpos = gl_Position.xyz;
    screenpos = vec4(worldpos, 1.0);
}
[/source]

... and the fragment shader:

[source lang="plain"]
#version 330

uniform sampler2D colorMap;

in vec3 worldpos;
in vec4 screenpos;

out vec4 out_light;

void main(void)
{
    // Just try to paint with magenta, though it doesn't happen
    out_light = vec4(1.0, 0.0, 1.0, 1.0);
}
[/source]
  4. HarriPellikka

    Mipmapping Issues

    Kinda goes against the logic and basic principles of programming if the said line of code crashes your program even though it's not executed. Are you absolutely, positively sure it's not executed? At all? Under any circumstances? Maybe you should post the related code, so we have a better chance of getting it solved.
  5. Umm, the problem is quite clear when you look at the images: the geometry is f'd up. It's the same object from the exact same perspective. The order stays the same, I've checked it a million times. I've even tried disabling texturing and texture coordinates altogether, and it didn't help, so I really doubt it's the texture coordinates messing up the geometry. EDIT: Ugh, I finally solved it. I didn't notice that after I duplicated the vertices they were already in the correct order, and I was still using the old index array. Duh. Well, it works now.
  6. [quote]Where is this written? I don't think it's true. In either case, if you're intent on using triangles, you have to make sure that your texture coordinate buffer stores the proper values for that geometry type.[/quote] There's a lot of discussion about triangles over quads: hardware being optimized towards triangles, triangles being guaranteed to lie on a single plane, etc. Anyway, you're not really answering the original question here. The problem is not about texture coordinates but about the geometry, and not about using triangles or quads. Help is appreciated, if given.
  7. This felt like kind of a different problem, so I decided to start a new thread. In fact, I'm using triangles, as is the standard; quads are quite "dead" anyway.
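As an aside on quads vs. triangles: if source data arrives as quads, converting it to triangles is mechanical. A minimal sketch in Python (hypothetical helper, not from the thread; the thread's own code is C#):

```python
def quads_to_triangles(quad_indices):
    """Split each quad (a, b, c, d) into triangles (a, b, c) and (a, c, d).
    Assumes quads are given as flat groups of four indices with consistent
    winding; the fan split preserves that winding."""
    tris = []
    for i in range(0, len(quad_indices), 4):
        a, b, c, d = quad_indices[i:i + 4]
        tris += [a, b, c, a, c, d]
    return tris

print(quads_to_triangles([4, 0, 3, 7]))  # [4, 0, 3, 4, 3, 7]
```

Note that the first quad of the cube discussed below (corners 4, 0, 3, 7) splits into exactly the first two triangles of its index list.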
  8. Okay, this is kind of a sequel to my last post: ( http://www.gamedev.n...-using-vboglsl/ ) Basically, I'm trying to create the needed vertices from a set of base vertices, indices and UV coordinates, and I just can't get it right. Let's take a simple cube as an example: I have 8 base vertex coordinates (read from an FBX file), 36 indices for triangles and 36 UV coordinates, one for each index. Now I need a new set of vertices with the UV coordinates mapped to them. Assume I have vertex coordinates like this:

(1, 1, -1) (1, -1, -1) (-1, -1, -1) (-1, 1, -1) (1, 1, 1) (1, -1, 1) (-1, -1, 1) (-1, 1, 1)

... and an index list like this:

4,0,3 4,3,7 2,6,7 2,7,3 1,5,2 5,6,2 0,4,1 4,5,1 4,7,5 7,6,5 0,1,2 0,2,3

I tried simple code like this:

[source lang="plain"]
List<Vertex> tempVerts = new List<Vertex>();
for(int i = 0; i < indices.Count; ++i)
{
    tempVerts.Add(vertices[indices[i]]); // 'vertices' is the list of the eight original vertices
}
vertices = tempVerts; // override the original list with the newly made one
[/source]

... but it causes a hell of a mess: [attachment=10985:cube_prob_2.jpg] ... while this is how it SHOULD be: [attachment=10986:cube_correct.jpg] I'm losing my head here, because I KNOW it's a simple thing, yet I cannot get my mind around it. Please help me before I go bald from pulling my hair! Also, someone told me last time that there should be 24 unique vertices in the final version, but I get 36. Huh?
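The duplication step described above, with the fix the thread eventually reveals (the old index array must not be reused after duplication), can be sketched like this. Python for brevity (the original code is C#), and the UVs here are placeholders standing in for the per-index values read from the FBX file:

```python
# Cube data from the post: 8 corner positions, 36 triangle indices.
cube_positions = [
    (1, 1, -1), (1, -1, -1), (-1, -1, -1), (-1, 1, -1),
    (1, 1, 1), (1, -1, 1), (-1, -1, 1), (-1, 1, 1),
]
indices = [4, 0, 3, 4, 3, 7, 2, 6, 7, 2, 7, 3, 1, 5, 2, 5, 6, 2,
           0, 4, 1, 4, 5, 1, 4, 7, 5, 7, 6, 5, 0, 1, 2, 0, 2, 3]
# Placeholder UVs -- in the real importer these come from the file, one per index.
uvs = [(0.0, 0.0)] * len(indices)

# One vertex per index: the position is looked up through the index, while
# the UV is taken directly by position in the index list.
expanded = [(cube_positions[idx], uvs[i]) for i, idx in enumerate(indices)]

# The detail that bit the original poster: after duplication the vertices
# are already in draw order, so the OLD index array no longer applies.
# The new index buffer is simply sequential.
new_indices = list(range(len(expanded)))

print(len(expanded))    # 36 vertices, one per index
print(new_indices[:6])  # [0, 1, 2, 3, 4, 5]
```

As for the 24 vs. 36 question: 36 duplicated vertices are correct for drawing; merging vertices that share both position and UV would give the 24 unique ones (each of the 8 corners borders 3 faces with different UVs), which only matters if you rebuild a shared index buffer afterwards.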
  9. I'm not quite sure of the gist of what you're looking for, but maintaining the aspect ratio shouldn't be that difficult. Supposing you have an onResize() method (or similar) that handles the window resize events, you can take the window width and divide it by the window height (supposing the height != 0). That gives you the correct aspect ratio in all circumstances. If you have, let's say, a square 1:1 ratio when windowed, and want to keep it that way and add black borders to the left and right edges, I suggest rendering the scene to a texture in any case. Then just render a square plane with that texture at the center of the world. Whenever the aspect ratio changes, the "camera" will automatically add the black background for you.
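A common alternative to the render-to-texture approach above is to compute a letterboxed viewport directly from the window size. A sketch with a hypothetical helper (Python for brevity; the thread's context is C#/OpenTK, where the result would feed a GL.Viewport call):

```python
def letterbox_viewport(win_w, win_h, target_aspect=1.0):
    """Return (x, y, width, height) of the largest viewport with the target
    aspect ratio centered in a win_w x win_h window; the uncovered margins
    become the black borders."""
    if win_h == 0:  # guard against the degenerate resize mentioned above
        win_h = 1
    window_aspect = win_w / win_h
    if window_aspect > target_aspect:
        # Window too wide: use full height, borders appear left and right.
        vp_h = win_h
        vp_w = round(win_h * target_aspect)
    else:
        # Window too tall (or exact fit): full width, borders top and bottom.
        vp_w = win_w
        vp_h = round(win_w / target_aspect)
    return ((win_w - vp_w) // 2, (win_h - vp_h) // 2, vp_w, vp_h)

print(letterbox_viewport(1920, 1080))  # (420, 0, 1080, 1080)
```

With a 1:1 target in a 1920x1080 window this yields a centered 1080x1080 viewport with 420-pixel borders on each side, matching the scenario in the post.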
  10. Ah, I suspected something along those lines, but wasn't certain. Thanks!
  11. Hey there, I have some problems with texturing. Let me set up the scenario: I have a custom-written .FBX importer which reads vertices, indices and UV coordinates from a file. If I have a cube, it'll have 8 vertices, 36 indices and 36 UV coordinates, as each vertex is used by three faces. The geometry is represented nice and dandy, and if I trace the UV coordinates they appear to be correct, but when I try to draw the cube with the texture, the texture is mapped in a nonsensical fashion. I cannot find the cause of this problem, and after a week of frustration I decided to ask you guys.

    I use a struct for the vertex data, which has three properties (position: float x 3, normal: float x 3, uv: float x 2). I end up having eight of these, which is correct. Then I have an integer list which keeps track of the indices, going like 0, 1, 2, 1, 2, 3 etc., you know the drill. Now, the hardest thing for me to understand is: how does OpenGL know which UV coordinate should be used, when there are only eight vertices but 36 UV coordinates? I have set up a VBO which points to all the vertex properties, and finally I draw the cube using a DrawElements call. I use a GLSL shader to texture the cube, and I pass the texture and the texture coordinates to the shader correctly (at least I assume so, because the texture shows up on the cube). Where could the problem be?

    This is how it should look: [attachment=10979:cube_correct.jpg] ... and this is what it actually looks like: [attachment=10980:cube_problem.jpg] As you can see, some coordinates (the top) are correct, while the sides are somewhat "skewed". I have made no changes to my importer, and the coordinates worked just fine before I changed my rendering code to use shaders.
The shader code is like this:

[source lang="plain"]
VERTEX SHADER:

#version 330

uniform mat4 projection;
uniform mat4 view;
uniform mat4 world;

layout(location = 0) in vec3 in_position;
layout(location = 1) in vec3 in_normal;
layout(location = 2) in vec2 in_texcoord;

out vec4 worldpos;
out vec3 normal;
out vec2 texcoord;

void main(void)
{
    texcoord = in_texcoord;
    normal = in_normal;
    worldpos = projection * view * world * vec4(in_position, 1.0);
}

FRAGMENT SHADER:

#version 330

in vec4 worldpos;
in vec3 normal;
in vec2 texcoord;

out vec3 out_diffuse;

uniform sampler2D colorMap;

void main(void)
{
    out_diffuse = texture(colorMap, texcoord).xyz;
}
[/source]
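The "how does OpenGL know which UV" question above has a blunt answer: it doesn't. An indexed draw fetches every attribute from the same slot per index, so a toy model of the fetch (Python, names made up for illustration) shows why 8 vertex slots cannot carry 36 per-index UVs:

```python
# Toy model of an indexed attribute fetch: for each index, ALL attributes
# (position, normal, UV) are read from the SAME slot in the vertex arrays.
positions = [f"p{i}" for i in range(8)]
uvs_per_vertex = [f"uv{i}" for i in range(8)]  # only 8 UV slots exist
indices = [4, 0, 3, 4, 3, 7]                   # first two triangles of the cube

fetched = [(positions[i], uvs_per_vertex[i]) for i in indices]
print(fetched[0])  # ('p4', 'uv4'): the UV is tied to the vertex, not the index
```

Since a vertex shared by three faces would need three different UVs, the usual remedy is the vertex duplication worked through in post 8 above.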
  12. I think the artifacts are caused by texture wrapping. Clamping the texture coordinates to the edges (e.g. GL_CLAMP_TO_EDGE) should prevent this.
  13. Thanks! I actually found your web site after my last post and crawled through the examples, and finally got the thing working! It's quite sad there's so little info and so few tutorials about modern OpenGL / GLSL. The majority of material covers obsolete / deprecated techniques, and the ones that actually are somewhat "modern" are done in painstakingly complex ways. Thankfully, your tutorials are simple enough, but some other tutorials use abstraction layers so extensively that a newbie like myself must spend hours trying to find the actual point in the code, not to mention that I don't do C++, which makes it even more frustrating to read the sources. Anyway, thanks, Lazy Foo!
  14. Thanks, it works! What about the new input system for vertex shaders? I know how to set the uniform variables, but how do I supply the vertex position and normal data? I use VBOs for my meshes, which include the position, normal and UV coordinate data, and I set up the buffer data pointers before drawing the elements, but I don't know how to reach the data from the vertex shader...
  15. Heya, I've been crawling through GLSL and MRT tutorials for a few days, and everything's going nicely. Now I have an FBO with two textures bound to it, at ColorAttachment0 and ColorAttachment1, and I want to write the diffuse (color) data to the first one and the surface normals to the second one. How is this achieved? I've done this in HLSL, where I could declare output variables with the COLOR0, COLOR1 etc. semantics, but I haven't found anything similar in GLSL.