

About HarriPellikka

  1. Hi! Just wanted to let you know that I'm writing some game development-related tutorials. I have a couple of them up, and I'm writing new ones whenever I have the time. :) Here's my tutorial blog: http://mana-break.blogspot.com/
  2. Ah, that was it. Thanks!
  3. Hey there, I've got a little problem with my shaders. I have two shaders: the first one populates a FBO with three textures (the color, the depth and the normal data). The second shader should get these textures as parameters, but I cannot get the uniform location - it just returns -1 when I try to get it. The first shader works nicely and I can use the textures after the draw calls are done, but I can't get the second shader to work. It does compile nicely, and no errors are raised. Here's my "geometry pass":
[CODE]
// Load the shader and set the camera matrices
GL.UseProgram(geometryShaderId);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "view"), false, ref Engine.cameras[0].modelview);
GL.UniformMatrix4(GL.GetUniformLocation(geometryShaderId, "projection"), false, ref Engine.cameras[0].projection);

// Bind the frame buffer
GL.BindFramebuffer(FramebufferTarget.DrawFramebuffer, fboId);

// Enable depth stuff
GL.Enable(EnableCap.DepthTest);
GL.DepthMask(true);
GL.Disable(EnableCap.Blend);

// Clear
GL.ClearColor(0f, 0f, 0f, 1f);
GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

// Render the geometry
RenderScene(e);

// Clean up the geometry pass
GL.UseProgram(0);
GL.DepthMask(false);
GL.Disable(EnableCap.DepthTest);
GL.BindFramebuffer(FramebufferTarget.FramebufferExt, 0);
[/CODE]
And here's the "lighting" pass, or, the part that actually bugs:
[CODE]
GL.UseProgram(pointLightShaderId);
GL.BindFramebuffer(FramebufferTarget.FramebufferExt, fboId);

// Set the textures for the fragment shader
GL.Uniform1(GL.GetUniformLocation(pointLightShaderId, "colorMap"), 0); // <-- Location is -1
GL.ActiveTexture(TextureUnit.Texture0);
GL.BindTexture(TextureTarget.Texture2D, gbuffer.DiffuseTexture);
[/CODE]
The lighting shader is initialized like this:
[CODE]
// Load shader files
string vText = File.ReadAllText("Resources/Shaders/pointlight_vertex.glsl");
string fText = File.ReadAllText("Resources/Shaders/pointlight_fragment.glsl");

// Create shader IDs
vertexId = GL.CreateShader(ShaderType.VertexShader);
fragmentId = GL.CreateShader(ShaderType.FragmentShader);

// Set shader sources
GL.ShaderSource(vertexId, vText);
GL.ShaderSource(fragmentId, fText);

// Compile vertex shader
GL.CompileShader(vertexId);
string vertexLog = GL.GetShaderInfoLog(vertexId);
int vertexStatusCode;
GL.GetShader(vertexId, ShaderParameter.CompileStatus, out vertexStatusCode);
if(vertexStatusCode != 1)
    Console.WriteLine("Error creating vertex shader: " + vertexLog);

// Compile fragment shader
GL.CompileShader(fragmentId);
string fragmentLog = GL.GetShaderInfoLog(fragmentId);
int fragmentStatusCode;
GL.GetShader(fragmentId, ShaderParameter.CompileStatus, out fragmentStatusCode);
if(fragmentStatusCode != 1)
    Console.WriteLine("Error creating fragment shader: " + fragmentLog);

// Create the shader program
shaderId = GL.CreateProgram();
GL.AttachShader(shaderId, vertexId);
GL.AttachShader(shaderId, fragmentId);
GL.BindAttribLocation(shaderId, 0, "in_position");
GL.BindFragDataLocation(shaderId, 0, "out_light");
GL.LinkProgram(shaderId);

int programStatusCode;
GL.GetProgram(shaderId, ProgramParameter.ValidateStatus, out programStatusCode);
if(programStatusCode == (int)All.False)
{
    Console.WriteLine("Error creating shader program!");
    Console.WriteLine(GL.GetProgramInfoLog(shaderId));
}
GL.ValidateProgram(shaderId);
[/CODE]
... And finally, here are the lighting shaders (stubs for now). Here's the vertex shader:
[CODE]
#version 330

uniform mat4 projection;
uniform mat4 view;
uniform mat4 world;

layout(location = 0) in vec3 in_position;

out vec3 worldpos;
out vec4 screenpos;

void main(void)
{
    gl_Position = projection * view * world * vec4(0.0, 0.0, 0.0, 1.0);
    worldpos = gl_Position.xyz;
    screenpos = vec4(worldpos, 1.0);
}
[/CODE]
... and here's the fragment shader:
[CODE]
#version 330

uniform sampler2D colorMap;

in vec3 worldpos;
in vec4 screenpos;

out vec4 out_light;

void main(void)
{
    // Just try to paint with magenta, though it doesn't happen
    out_light = vec4(1.0, 0.0, 1.0, 1.0);
}
[/CODE]
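One thing worth noting about the initialization above: the status check queries ValidateStatus before GL.ValidateProgram has ever run, so a failed link can slip through silently, and GetUniformLocation returns -1 on an unlinked program (or when the uniform is optimized out as unused). A minimal sketch of checking the link result instead, using the same OpenTK-style calls as above (the helper name is illustrative):
[CODE]
// Sketch: link a program and verify it actually linked before
// querying uniform locations. Assumes the same OpenTK bindings
// used in the snippets above.
int LinkAndGetColorMap(int programId)
{
    GL.LinkProgram(programId);

    int linkStatus;
    GL.GetProgram(programId, ProgramParameter.LinkStatus, out linkStatus);
    if (linkStatus == (int)All.False)
    {
        Console.WriteLine("Link failed: " + GL.GetProgramInfoLog(programId));
        return -1;
    }

    // Only meaningful after a successful link; still -1 if the
    // driver eliminated the uniform as unused.
    return GL.GetUniformLocation(programId, "colorMap");
}
[/CODE]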
  4. Kinda goes against the logic and basic principles of programming if the said line of code crashes your program even though it's not executed. Are you absolutely, positively sure it's not executed? At all? In any circumstances? Maybe you should post the related code, so we have a better chance of solving it.
  5. Umm, the problem is quite clear when you look at the images - the geometry is f'd up. It's the same object from the exact same perspective. The order stays the same, I've checked it a million times. I've even tried disabling texturing and texture coordinates altogether; it didn't help, so I really doubt it's the texture coordinates messing up the geometry. EDIT: Ugh, I finally solved it. I didn't notice that after I duplicated the vertices they were already in the correct order, and I was still using the old index array. Duh. Well, it works now.
  6. [quote name='Goran Milovanovic' timestamp='1346266343' post='4974525'] [quote name='Manabreak' timestamp='1346259201' post='4974486'] In fact, I'm using triangles as is the standard, quads are quite "dead" anyway. [/quote] Where is this written? I don't think it's true. In either case, if you're intent on using triangles, you have to make sure that your texture coordinate buffer stores the proper values for that geometry type. [/quote] There's a lot of discussion about triangles over quads: hardware optimization towards triangles, triangles being guaranteed to lie on a single plane, etc. Anyway, you're not really answering the original question here. The problem is not about texture coordinates, and not about using triangles or quads - it's about the geometry. Help is appreciated, if given.
  7. [quote name='Goran Milovanovic' timestamp='1346258708' post='4974483'] Why didn't you continue this discussion in that original thread? It seems fairly relevant. Anyway: The advice you were given, relating to 24 indices for 8 base vertices, assumes that you're drawing quads. So, for 8 vertices that specify the corners of some cube, you have a total of 6 faces, meaning that you need 24 indices to specify the 4 verts for each individual face (6 * 4 = 24). [/quote] This felt like a different problem, so I decided to start a new thread. In fact, I'm using triangles as is the standard; quads are quite "dead" anyway.
  8. Okay, this is kind of a sequel to my last post: ( [url="http://www.gamedev.net/topic/630360-confused-about-vertices-indices-and-tex-coords-when-using-vboglsl/"]http://www.gamedev.n...-using-vboglsl/[/url] ) Basically, I'm trying to create the needed vertices from a set of base vertices, indices and UV coordinates. I just can't get it right. Let's take a simple cube as an example: I have 8 base vertex coordinates (read from a FBX file), 36 indices for triangles and 36 UV coordinates, one for each index. Now, I need a new set of vertices with the UV coordinates mapped to them. Let's assume I have vertex coordinates like this:
[CODE]
(1, 1, -1)
(1, -1, -1)
(-1, -1, -1)
(-1, 1, -1)
(1, 1, 1)
(1, -1, 1)
(-1, -1, 1)
(-1, 1, 1)
[/CODE]
And an index list like this:
[CODE]
4,0,3  4,3,7
2,6,7  2,7,3
1,5,2  5,6,2
0,4,1  4,5,1
4,7,5  7,6,5
0,1,2  0,2,3
[/CODE]
I tried simple code like this:
[CODE]
List<Vertex> tempVerts = new List<Vertex>();
for(int i = 0; i < indices.Count; ++i)
{
    tempVerts.Add(vertices[indices[i]]); // 'vertices' is the list of the eight original vertices
}
vertices = tempVerts; // override the original list with the newly made one
[/CODE]
... but it caused a hell of a mess: [attachment=10985:cube_prob_2.jpg] ... while this is how it SHOULD be: [attachment=10986:cube_correct.jpg] I'm losing my head here, because I KNOW it's a simple thing, yet I cannot get my mind around it. Please help me before I go bald from pulling my hair! Also, someone told me last time that there should be 24 unique vertices in the final version, but I get 36. Huh?
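For reference, the usual way to arrive at the 24 unique vertices mentioned above is to key each (position index, UV) pair and only emit a new vertex the first time that combination appears, rebuilding the index list as you go. A sketch, assuming a Vertex struct and input lists like the ones described in the post:
[CODE]
// Sketch: deduplicate (position, uv) pairs into a new vertex/index set.
// 'vertices', 'indices' and 'uvs' are assumed to match the post:
// 8 base vertices, 36 indices, 36 per-index UV coordinates.
var seen = new Dictionary<Tuple<int, Vector2>, int>();
var outVertices = new List<Vertex>();
var outIndices = new List<int>();

for (int i = 0; i < indices.Count; ++i)
{
    var key = Tuple.Create(indices[i], uvs[i]);
    int newIndex;
    if (!seen.TryGetValue(key, out newIndex))
    {
        // First time this (position, uv) combination appears: emit a vertex.
        newIndex = outVertices.Count;
        Vertex v = vertices[indices[i]];
        v.UV = uvs[i]; // hypothetical UV field on the Vertex struct
        outVertices.Add(v);
        seen.Add(key, newIndex);
    }
    outIndices.Add(newIndex);
}
// For a cube where each corner touches 3 faces with distinct UVs,
// this yields 24 vertices while keeping 36 indices for DrawElements.
[/CODE]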
  9. I'm not quite sure of the gist of what you're looking for, but maintaining the aspect ratio shouldn't be that difficult. Supposing you have an onResize() method (or similar) that handles the window resize events, you can take the window width and divide it by the window height (supposing the height != 0). That should give you the correct aspect ratio in all circumstances. If you have, let's say, a square 1:1 ratio when windowed, and want to keep it that way and add black borders to the left and right edges, I suggest rendering the scene to a texture in any case. Then, just render a square plane with that texture at the center of the world. Whenever the aspect ratio changes, the "camera" will automatically add the black background for you.
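The same letterboxing effect can also be had without a render-to-texture pass, by restricting the viewport to the largest rectangle with the target ratio. A sketch, assuming OpenTK-style bindings (the method and parameter names are illustrative):
[CODE]
// Sketch: letterbox/pillarbox a fixed-ratio scene inside an arbitrary window.
void OnResize(int windowWidth, int windowHeight)
{
    if (windowHeight == 0) return;

    float target = 1.0f; // desired aspect ratio (width / height), 1:1 here
    float actual = (float)windowWidth / windowHeight;

    int vpWidth = windowWidth, vpHeight = windowHeight, x = 0, y = 0;
    if (actual > target)
    {
        // Window is wider than the target: black bars left and right.
        vpWidth = (int)(windowHeight * target);
        x = (windowWidth - vpWidth) / 2;
    }
    else
    {
        // Window is taller than the target: black bars top and bottom.
        vpHeight = (int)(windowWidth / target);
        y = (windowHeight - vpHeight) / 2;
    }
    GL.Viewport(x, y, vpWidth, vpHeight);
}
[/CODE]
The bars come for free from the glClear color, as long as the clear happens with the full-window viewport or scissor disabled.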
  10. Ah, I suspected something along those lines, but wasn't certain. Thanks!
  11. Hey there, I have some problems with texturing. Let me set up the scenario: I have a custom-written .FBX importer which reads vertices, indices and UV coordinates from a file. If I have a cube, it'll have 8 vertices, 36 indices and 36 UV coordinates, as each vertex is used by three faces. Now, the geometry is represented nice and dandy, and if I trace the UV coordinates they appear to be correct, but when I try to draw the cube with the texture, the texture is mapped in a nonsensical fashion. I cannot find the cause of this problem, and after a week of frustration, I decided to ask you guys. I use a struct for the vertex data, which has three properties (position : float x 3, normal : float x 3, uv : float x 2). I end up having eight of these, which is correct. Then I have an integer list which keeps track of the indices, going like 0, 1, 2, 1, 2, 3 etc., you know the drill. Now, the hardest thing for me to understand is: how does OpenGL know which UV coordinate should be used, when there are only eight vertices but 36 UV coordinates? I have set up a VBO which points to all the vertex properties, and finally I draw the cube using a DrawElements call. I use a GLSL shader to texture the cube, and I pass the texture and the texture coordinates to the shader correctly (at least I assume so, because the texture shows up on the cube). Where could the problem be? This is how it should look: [attachment=10979:cube_correct.jpg] ... and this is what it actually looks like: [attachment=10980:cube_problem.jpg] As you can see, some coordinates (the top) are correct, while the sides are somewhat "skewed". I have made no changes to my importer, and the coordinates worked just fine before I changed my rendering code to use shaders.
The shader code is like this:
[source lang="plain"]
VERTEX SHADER:

#version 330

uniform mat4 projection;
uniform mat4 view;
uniform mat4 world;

layout(location = 0) in vec3 in_position;
layout(location = 1) in vec3 in_normal;
layout(location = 2) in vec2 in_texcoord;

out vec4 worldpos;
out vec3 normal;
out vec2 texcoord;

void main(void)
{
    texcoord = in_texcoord;
    normal = in_normal;
    worldpos = projection * view * world * vec4(in_position, 1.0);
}

FRAGMENT SHADER:

#version 330

in vec4 worldpos;
in vec3 normal;
in vec2 texcoord;

out vec3 out_diffuse;

uniform sampler2D colorMap;

void main(void)
{
    out_diffuse = texture(colorMap, texcoord).xyz;
}
[/source]
  12. I think the artifacts are caused by wrapping. Clamping the texture to edges should prevent this.
  13. Thanks, I actually found your web site after my last post and crawled through the examples, and finally got the thing working! It's quite sad there's so little info and so few tutorials about modern OpenGL / GLSL. The majority of the material covers obsolete / deprecated techniques. Also, the ones that actually are somewhat "modern" are done in painstakingly complex ways. Thankfully, your tutorials are simple enough, but some other tutorials use abstraction layers so extensively that a newbie like myself must spend hours trying to find the actual thing in the code - not to mention the fact that I don't do C++, which makes it even more frustrating to read the sources. Anyways, thanks, Lazy Foo!
  14. Thanks, it works! What about the new input system for vertex shaders? I know how to set the uniform variables, but how do I supply the vertex position and normal data? I use VBOs for my meshes, which include the position, normal and UV coordinate data, and I set up the buffer data pointers before drawing the elements, but I don't know how to reach the data from the vertex shader...
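In the modern pipeline, VBO data reaches the vertex shader through generic attributes: each layout(location = N) input in the shader is fed by the attribute pointer set up for index N. A sketch of the setup for an interleaved position/normal/UV buffer, assuming OpenTK-style bindings (the stride and offsets are illustrative for a 3+3+2 float layout):
[CODE]
// Sketch: wire an interleaved VBO to the inputs of a #version 330 shader.
int stride = 8 * sizeof(float); // 3 position + 3 normal + 2 uv floats
GL.BindBuffer(BufferTarget.ArrayBuffer, vboId);

// matches: layout(location = 0) in vec3 in_position;
GL.EnableVertexAttribArray(0);
GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, stride, 0);

// matches: layout(location = 1) in vec3 in_normal;
GL.EnableVertexAttribArray(1);
GL.VertexAttribPointer(1, 3, VertexAttribPointerType.Float, false, stride, 3 * sizeof(float));

// matches: layout(location = 2) in vec2 in_texcoord;
GL.EnableVertexAttribArray(2);
GL.VertexAttribPointer(2, 2, VertexAttribPointerType.Float, false, stride, 6 * sizeof(float));
[/CODE]
Without layout qualifiers in the shader, GL.BindAttribLocation before linking (as in the snippets elsewhere on this page) achieves the same index-to-name mapping.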
  15. Heya, I've been crawling through GLSL and MRT tutorials for a few days, and everything's going nicely. Now, I have an FBO with two textures bound to it, to ColorAttachment0 and ColorAttachment1. I want to draw the diffuse (color) data to the first one and the surface normals to the second. How is this achieved? I've done this in HLSL, where I was able to specify an output variable with COLOR0, COLOR1 etc., but I haven't found anything similar in GLSL.
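The GLSL 3.30 counterpart of HLSL's COLOR0/COLOR1 semantics is one out variable per color attachment, bound by location. A minimal fragment shader sketch (the variable names are illustrative):
[CODE]
#version 330

in vec3 normal;
in vec2 texcoord;

uniform sampler2D colorMap;

// written to the texture bound to ColorAttachment0
layout(location = 0) out vec4 out_diffuse;
// written to the texture bound to ColorAttachment1
layout(location = 1) out vec4 out_normal;

void main(void)
{
    out_diffuse = texture(colorMap, texcoord);
    // pack a [-1,1] normal into the [0,1] range of the texture
    out_normal = vec4(normalize(normal) * 0.5 + 0.5, 1.0);
}
[/CODE]
Two things to remember on the application side: glDrawBuffers must list both attachments so they are active draw targets, and glBindFragDataLocation before linking is the alternative to the layout qualifier for assigning the output locations.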