
Member Since 18 Mar 2013
Offline Last Active Aug 19 2014 11:31 AM

Posts I've Made

In Topic: I have this collision detection concept I want to discuss

18 August 2014 - 11:50 AM

Yeah, your landpad, floorpad, etc. are very confusing to follow.


Are your levels largely static? A lot of iso games will create another texture map of the level defining the walkable paths; whether this is a feasible solution depends on whether you are rendering a 3D world from a certain view (and whether that view is changeable), using static textures for your levels, or doing a tile-based game.


Isometric tile-based games (those that use 2D sprites, etc.) benefit from a grid system - you can then predefine different height levels for the grid. For example, allow 16 different elevations, which would create a grid that is the level's width by its height by 16 tall. You then put each of your game objects in one of the grid spaces and draw them in an order that gives the isometric appearance (usually starting at the bottom layer in the upper-right corner and finishing on the top layer at the lower-left corner). In this type of grid system you can give each of your objects a height, so that a tree that is 10 layers tall, for example, will not be overdrawn unless some object is at level 11 or higher.
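That draw order (bottom layer first, upper-right corner first within a layer, finishing at the top layer's lower-left corner) can be sketched like this - a minimal Python sketch with a hypothetical `grid[z][y][x]` layout, not anyone's actual engine code:

```python
# Back-to-front isometric draw order over a width x height x layers grid.
# grid[z][y][x] holds a drawable object or None; all names are illustrative.
def draw_iso_grid(grid, width, height, layers=16):
    order = []
    for z in range(layers):                     # bottom layer first
        for y in range(height):                 # upper rows first
            for x in range(width - 1, -1, -1):  # right-to-left within a row
                obj = grid[z][y][x]
                if obj is not None:
                    order.append(obj)           # later draws overlap earlier ones
    return order
```

Anything drawn later in this order correctly covers what was drawn before it, which is what gives the isometric overlap for free.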


For collision detection, you check each of your game's dynamic objects against the tiles it would occupy on completing the move - if something else occupies those tiles, the check fails. Usually you don't have too many dynamic objects (characters, enemies), so this check is not very expensive.
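A sketch of that tile-occupancy check in Python (the data layout and names are hypothetical - an occupancy map keyed by tile coordinate):

```python
def can_move(occupied, dest_tiles, mover):
    """Return True if the mover may complete the move.

    occupied:   dict mapping (x, y, z) -> object currently in that tile.
    dest_tiles: the tiles the mover would occupy after the move.
    """
    for tile in dest_tiles:
        blocker = occupied.get(tile)
        if blocker is not None and blocker is not mover:
            return False  # something else already occupies this tile
    return True
```

Note the `blocker is not mover` check: an object stepping within its own current footprint shouldn't block itself.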


In any case, isometric with elevation has definitely been done before - just google and you can find lots of different implementations.

In Topic: 3dTBShexagonalTroveClone

18 August 2014 - 11:15 AM

A Minecraft clone with truncated octahedrons (3D hexagons) instead of cubes


I'm already in the process of making one, my friend - I have been working on it for a year and am just finishing the map editor. And I have lots of experience with coding. These things all take a really long time.


Good luck, but you need to start much, much smaller if you honestly want to pursue making games.

In Topic: Normals question

15 August 2014 - 11:44 AM

Yeah, there really are a lot of normals on the supports - but this model is unrefined and still has the original number of verts. I asked the artist to leave it high-vert for now for testing purposes - I want to make sure all the vertex buffers and all my shaders can handle high-vert objects. This bridge has 71,000 indices into 60,000 verts/normals/tangents, most of which reside in the chain supports.


We are making a hex tile map builder/editor which can create plugins to go into our game, Build and Battle, which we are also working on. But anyway - the top of the bridge is fit to hexagons, as the picture below shows.


Attached File: scrn.png (1.14 MB)


I hope that answers all your questions!


By the way - the bridge is obviously scaled incorrectly at the moment, but that's no big deal - the bridge will likely be largely remade.

In Topic: Normals question

14 August 2014 - 03:01 PM

So I solved the problem - thanks to a closer look at the shader, which was inspired by Buckeye's post above.


As mentioned at the end of the last post, the problem was within the normal map reading... but it wasn't exactly what I thought. In the shader I have a couple of bools (hasDiffuseMap, hasNormalMap, hasOpacityMap, etc.) that are set on a per-material basis.


Well, in the rendering code I was setting them to true for materials that had a given map, but not to false for materials that didn't. Since the same GBuffer shader is used for pretty much all materials, these bools remained true even when there wasn't a map available; this resulted in invalid texture reads for the objects that didn't have normal maps.


I fixed it by explicitly setting the hasMap booleans to true or false for every single material.
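The fix described here - writing every flag unconditionally per material, so a previous material's true can't leak through - might look roughly like this. This is a hypothetical sketch: `set_bool` stands in for whatever uniform-setting call the engine uses (e.g. a glUniform1i wrapper), and the material fields are made up:

```python
# Hypothetical per-material uniform setup; names are illustrative only.
def bind_material_flags(shader, material, set_bool):
    # Always write BOTH states - true when the map exists, false when it
    # doesn't - so stale state from the previous material never survives.
    set_bool(shader, "hasDiffuseMap", material.get("diffuse") is not None)
    set_bool(shader, "hasNormalMap",  material.get("normal")  is not None)
    set_bool(shader, "hasOpacityMap", material.get("opacity") is not None)
```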


This also fixed a lot of other strange lighting artifacts I was having with specular.

In Topic: Normals question

14 August 2014 - 02:00 PM

What format are you using for Blender export? Do you create and export all the objects in the same manner? Are you flipping the normals in Blender, or in your program?



I use .dae from Blender, and I was flipping the normals within the engine - that is, I just went through all the normals and multiplied them by negative 1. There could be some difference in the export formatting from Blender, but I don't think so, since .dae doesn't provide a huge range of exporting options. The artist said they made the bridge in the same manner they made the tiles.
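For clarity, "multiplied by negative 1" just means negating every component of every normal - a one-liner in any language; in Python terms:

```python
def flip_normals(normals):
    # Negate every component of every (nx, ny, nz) normal.
    return [(-nx, -ny, -nz) for (nx, ny, nz) in normals]
```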





Do you set the correct light direction in your shader? If you use a dot product between the normal and the light direction, do you remember to "flip" the light direction? That is, an object should be lit when the normal and light direction are opposing (i.e., the normal points in the direction of the light source).



Yes - I did flip the direction in the shader when dotting the normal with the direction; the shader code is below.

vec4 getLightInfluence(vec4 lWorldPos, vec3 norm, Material mat)
{
    vec3 worldPos = lWorldPos.xyz;

    vec4 ambientColor = vec4(light.color, 1.0f) * light.ambientIntensity;
    float diffuseFactor = dot(norm, -light.direction);

    vec4 diffuseColor  = vec4(0.0, 0.0, 0.0, 0.0);
    vec4 specularColor = vec4(0.0, 0.0, 0.0, 0.0);

    float shadowFactor = getShadowFactor(lWorldPos);
    if (diffuseFactor > 0)
    {
        diffuseColor = vec4(light.color, 1.0f) * light.diffuseIntensity * diffuseFactor;

        vec3 vertexToEye = normalize(camWorldPos - worldPos);
        vec3 lightReflect = normalize(reflect(light.direction, norm));
        float specularFactor = dot(vertexToEye, lightReflect);
        specularFactor = pow(specularFactor, mat.specPower);
        if (specularFactor > 0)
        {
            specularColor = vec4(mat.specColor, 1.0f) * mat.specIntensity * specularFactor;
        }
    }

    return ambientColor + shadowFactor * (specularColor + diffuseColor);
}

This is a deferred renderer, and this is the directional light shader, which gets all positions, material properties, normals, etc. from textures. The direction is passed through a uniform, and I have made sure that this direction is what it should be.



Have you tried to debug examining actual values which are imported, stuffed into the vertex buffer, sent to the shader and used in the shader? If not, why not?



Well, I did in the sense that the normals drawn in the picture are the imported normals: I just created another VBO with the positions and the positions + normals and drew it in GL_LINES mode. The normals look okay, so I'm not sure.
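Generating that debug line list (one GL_LINES segment per vertex, running from the position to position + normal) can be sketched as follows - a minimal Python version with hypothetical names, the actual engine presumably does this in C++ before uploading the VBO:

```python
def normal_debug_lines(positions, normals, length=1.0):
    # Build vertex pairs for GL_LINES: each segment runs from the vertex
    # position to position + (normal scaled by length).
    lines = []
    for (px, py, pz), (nx, ny, nz) in zip(positions, normals):
        lines.append((px, py, pz))
        lines.append((px + nx * length, py + ny * length, pz + nz * length))
    return lines
```

Scaling `length` relative to the model size keeps the debug lines readable on very dense meshes like the 60,000-vert bridge.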



I think I may have found the cause of the problem. It looks like the artist created a normal map texture for the grass tile but not for any of the other objects. Let me investigate a bit further, but I think the problem is that I put the check for the normal texture in the wrong place in the shader, so that no matter what, it multiplies the normal retrieved from the texture by the mat3(tangent, bitangent, normal) matrix.


What I mean is that the vec3 retrieved from the normal texture is initialized to (1, 1, 1), and if there was no normal texture I just left it as (1, 1, 1). However, I think it should just pass the vertex normal along with no alterations when there is no normal texture.
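The branch being described would, I think, look conceptually like this - decode and TBN-transform the sampled texel only when a normal map exists, otherwise use the interpolated vertex normal directly. A pure-Python sketch (all names hypothetical; in the real shader this would be GLSL):

```python
def shaded_normal(has_normal_map, tbn, vertex_normal, sampled=None):
    """Pick the normal the lighting pass should use.

    tbn:     3x3 row-major matrix whose columns are tangent/bitangent/normal.
    sampled: normal-map texel with components in [0, 1], or None.
    """
    if has_normal_map and sampled is not None:
        decoded = [2.0 * c - 1.0 for c in sampled]  # remap [0,1] -> [-1,1]
        # Multiply TBN matrix by the decoded tangent-space normal.
        n = [sum(tbn[r][c] * decoded[c] for c in range(3)) for r in range(3)]
    else:
        n = list(vertex_normal)                     # pass it along unaltered
    length = sum(c * c for c in n) ** 0.5
    return [c / length for c in n]                  # renormalize either way
```

The key point is that the no-map path never touches the (1, 1, 1) placeholder at all, so a missing texture can't skew the lighting.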