timothyjlaird

Member Since 16 Jun 2011
Offline Last Active Jun 15 2014 06:23 AM

Topics I've Started

advice for a web service based game w/ mobile clients

15 May 2014 - 05:02 PM

Here is my problem...I want to make a card game that can run on a web server, serve 'x' number of clients, and be mostly platform agnostic on the client end (i.e. Android, iPhone, PC, Linux). I was thinking about writing a web service to act as the game server itself and then building 'apps' around it for the individual platforms so that users can play the game on different devices. What are some ways I can structure this? What languages and APIs should I consider? I want to make the clients as thin as possible.

 

I've got some experience with C++, C#, JavaScript and PHP, and I have limited experience with WCF and ASP.NET. I think JSON would be the best format for the data exchange, but I'm open to ideas. Running it on a Microsoft web server is an option, but I would also like to explore Linux since it is a cheaper solution.
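To make the idea a little more concrete, here is a minimal sketch of the kind of stateless JSON endpoint I'm picturing, using Node.js and its built-in http module (Node is just one of the possible stacks; the '/draw-card' route and the payload fields are made up for illustration and nothing here is tested):

//rough sketch: a stateless JSON endpoint that any client (Android, iPhone, PC, Linux) could call
//the '/draw-card' route and the response fields are placeholders, not a real design
var http = require('http');

http.createServer(function(req, res){
	if(req.method === 'POST' && req.url === '/draw-card'){
		var body = '';
		req.on('data', function(chunk){ body += chunk; });
		req.on('end', function(){
			var request = JSON.parse(body);            //e.g. { "gameId": 42, "playerId": 7 }
			var card = { suit: 'hearts', rank: 'K' };  //placeholder: the real game logic would decide this
			res.writeHead(200, {'Content-Type': 'application/json'});
			res.end(JSON.stringify({ gameId: request.gameId, card: card }));
		});
	} else {
		res.writeHead(404);
		res.end();
	}
}).listen(8080);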

 

I know this is vague but any input on how to get started would be appreciated.


confused about tangent/bi-tangent in 'normal' mapping...

08 March 2014 - 10:06 PM

I'm trying to get a normal map working in WebGL. I'm still struggling with linear algebra and graphics theory, so I've been experimenting and reading through tutorials to try to get it right. Basically I've got a sphere and a directional light. Unfortunately it does not quite work: if you go here and click the button, it switches on the normal map, and the result isn't quite right...

 

https://googledrive.com/host/0B-5oLVOzxNXQS0JqZjkzWHBRMVU/planet_earth_wdeploy.html

 

Shortly after putting this together I realized that I need to apply a matrix made up of the normal, tangent and bi-tangent vectors in my calculations in order to fix it. I found these articles, and they seem to contradict one another...

 

http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/#Normal_textures

http://voxelent.com/html/beginners-guide/1727_10/ch10_NormalMap.html

 

The first says I need to calculate both tangent and bi-tangent attributes BEFORE I send them to the vertex shader. The second says it's OK to compute the bi-tangent vector based on a cross product of the normal and tangent vectors.

 

The answer in this Stack Overflow question also says that I need to calculate both and pass them to the vertex shader...

http://stackoverflow.com/questions/5255806/how-to-calculate-tangent-and-binormal

 

So who is right?
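For reference, this is roughly what the second (cross product) approach would look like in my vertex shader, written as a string the way my other shaders are. It's only a sketch: it assumes the tangent is passed in as a vec4 attribute with a handedness sign in w, which my code doesn't have yet.

//sketch of the 'compute the bi-tangent in the shader' approach (a_Tangent is assumed, not something I have yet)
var NORMAL_MAP_VS_SNIPPET =
	'attribute vec3 a_Normal;\n' +
	'attribute vec4 a_Tangent;\n' +                  // xyz = tangent, w = handedness (+1 or -1)
	'uniform mat4 u_NormalMatrix;\n' +
	'varying mat3 v_TBN;\n' +
	'void main() {\n' +
	'	vec3 N = normalize(vec3(u_NormalMatrix * vec4(a_Normal, 0.0)));\n' +
	'	vec3 T = normalize(vec3(u_NormalMatrix * vec4(a_Tangent.xyz, 0.0)));\n' +
	'	vec3 B = cross(N, T) * a_Tangent.w;\n' +     // bi-tangent from a single cross product
	'	v_TBN = mat3(T, B, N);\n' +
	'	// ...gl_Position and everything else as usual\n' +
	'}\n';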


best way to do branching logic in shaders?

22 February 2014 - 02:06 PM

I am using WebGL (OpenGL ES 2.0). Below is my vertex shader. My question...

 

When I have branching in my shader, like turning a 'directional light' or 'ambient light' on/off (in my case) by passing an argument to the shader (1.0 to turn it on, 0.0 to turn it off), am I better off using IF/THEN or math? What is the performance hit (if any) of using IF statements in a shader versus handling it with math? Like this...

var VSHADER_SOURCE =
	'attribute vec4 a_Position;\n' +
	'attribute vec4 a_Color;\n' +
	'attribute vec4 a_Normal;\n' +
	'uniform mat4 u_ProjMatrix;\n' +
	'uniform mat4 u_ViewMatrix;\n' +
	'uniform mat4 u_ModelMatrix;\n' +
	'uniform mat4 u_NormalMatrix;\n' +
	'uniform vec3 u_DirectionalLightColor;\n' +
	'uniform vec3 u_AmbientLightColor;\n' +
	'uniform vec3 u_DirectionalLightDirection;\n' +
	'uniform vec3 u_GlobalColor;\n' +
	'uniform float u_UseGlobalColor;\n' +
	'uniform float u_UseDirectionalLight;\n' +
	'uniform float u_UseAmbientLight;\n' +
	'varying vec4 v_Color;\n' +
	'void main() {\n' +
	'	gl_Position = u_ProjMatrix * u_ViewMatrix * u_ModelMatrix * a_Position;\n' +
	'	vec3 normal = normalize(vec3(u_NormalMatrix * a_Normal));\n' +
	'	float nDotL = max(dot(u_DirectionalLightDirection, normal), 0.0);\n' +
	'	vec3 colorToUse = u_GlobalColor + (vec3(a_Color) * u_UseGlobalColor) + (-u_GlobalColor * u_UseGlobalColor);\n' +
	'	vec3 diffuse = colorToUse + (u_DirectionalLightColor * nDotL * colorToUse * u_UseDirectionalLight) + (-colorToUse * u_UseDirectionalLight);\n' +
	'	vec3 ambient = u_UseAmbientLight * u_AmbientLightColor * colorToUse;\n' +
	'	v_Color = vec4(diffuse + ambient, a_Color.a);\n' +
	'}\n';

The end result is that setting u_UseAmbientLight or u_UseDirectionalLight to 1.0 turns them on...setting them to 0.0 turns them off.

Am I needlessly making my code harder to read, or is there an actual performance gain from doing it this way (as opposed to an IF/THEN)?
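For comparison, this is the same directional-light switch written the two other ways I'm weighing it against, in the same string style as above (just the relevant lines, untested). mix(a, b, t) returns a when t is 0.0 and b when t is 1.0, so it should match the add/subtract trick without the extra terms.

//the directional-light on/off switch from above, rewritten with a branch and with mix()
var DIFFUSE_SNIPPET =
	//branch version
	'	vec3 diffuse;\n' +
	'	if (u_UseDirectionalLight > 0.5) {\n' +
	'		diffuse = u_DirectionalLightColor * nDotL * colorToUse;\n' +
	'	} else {\n' +
	'		diffuse = colorToUse;\n' +
	'	}\n' +
	//mix() version: picks colorToUse when the flag is 0.0 and the lit color when it is 1.0
	'	vec3 diffuse2 = mix(colorToUse, u_DirectionalLightColor * nDotL * colorToUse, u_UseDirectionalLight);\n';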


need help reducing number of calls to bind/draw textures

02 February 2014 - 02:59 PM

As sort of a self-teaching exercise I'm trying to draw three quads using WebGL...each quad has its own texture and its own model matrix (rotating on its own axis). The code works, but I'm trying to reduce the number of calls to the GL context and better understand how texture binding/drawing works. A picture if it helps...

 

http://i295.photobucket.com/albums/mm151/coffeeaddict21/webglexper2_ss_zps897ba55e.jpg

 

I'm re-using the gl.TEXTURE0 unit for every texture...could I load each texture into its own unit (TEXTURE0, TEXTURE1, TEXTURE2) instead and avoid doing all the binding over and over again?

 

This is called before each call to gl.drawArrays (the draw command) to load the texture I want that draw to use, and I'm not sure if all of it is necessary or if there is another way around it...can I reduce the number of GL calls or otherwise optimize this?

//NAME: loadTexture
//PURPOSE: tell GL how to load the specified texture into a GL sampler
//IN: WebGL context, GL texture obj, sampler handle, image object, texture unit to use
//RETURN: void
function loadTexture(gl, texture, u_Sampler, image, texUnit){
	
	//make the specified texture unit active
	gl.activeTexture(texUnit);
	//bind the texture to the active unit
	gl.bindTexture(gl.TEXTURE_2D, texture);
	//tell GL what filtering to use when mapping the texture to the geometry
	gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
	//upload the specified image to the bound texture
	gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, image);
	//point the sampler at the specified texture unit (0 for gl.TEXTURE0, 1 for gl.TEXTURE1, ...)
	gl.uniform1i(u_Sampler, texUnit - gl.TEXTURE0);
}
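For comparison, here is the init-once alternative I'm wondering about: upload each image to its own texture object on its own unit a single time at startup, so the per-draw work shrinks to setting the sampler index. This is only a sketch (the initTextures name and structure are my own guess):

//NAME: initTextures (sketch)
//PURPOSE: upload each image into its own texture object, bound to its own texture unit, once at startup
//IN: WebGL context, array of pre-loaded images
//RETURN: array of texture objects, in the same order as the images
function initTextures(gl, images){
	var textures = [];
	for(var i = 0; i < images.length; i++){
		var texture = gl.createTexture();
		//gl.TEXTURE0, TEXTURE1, TEXTURE2... are consecutive constants, so TEXTURE0 + i selects unit i
		gl.activeTexture(gl.TEXTURE0 + i);
		gl.bindTexture(gl.TEXTURE_2D, texture);
		gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
		gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, gl.RGB, gl.UNSIGNED_BYTE, images[i]);
		textures.push(texture);
	}
	return textures;
}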

The code that calls loadTexture (see the bottom for the important stuff)...

//NAME: renderImages
//PURPOSE: load images into textures, apply transformations and then draw
//IN: array of pre-loaded images
//RETURN: none
function renderImages(images){
	console.log('render...');
	//get a pointer to the canvas
	var canvas = document.getElementById('webgl');
	
	//get the graphics context from the canvas
	var gl = getWebGLContext(canvas);
	if (!gl) {
		console.log('Failed to get the rendering context for WebGL');
		return;
	}
	
	//initialize shaders specified in this document
	if (!initShaders(gl, VSHADER_SOURCE, FSHADER_SOURCE)) {
		console.log('Failed to initialize shaders.');
		return;
	}
	
	//initialize vertex buffers
	var n = initVertexBuffers(gl);
	if (n < 0){
		console.log('Failed to set the position of the vertices.');
		return;
	}
	
	if(!images){
		console.log('Failed to load images.');
		return;
	}
	
	if(images.length != 3){
		console.log('Failed to load all 3 images we need.');
		return;
	}
	
	var texture = gl.createTexture();
	if(!texture){
		console.log('CreateTexture GL call failed on texture initialization.');
		return false;
	}
	
	var u_Sampler = gl.getUniformLocation(gl.program, 'u_Sampler');
	if(!u_Sampler){
		console.log('Unable to get storage location of u_Sampler during texture initialization.');
		return false;
	}
		
	var u_ModelMatrix = gl.getUniformLocation(gl.program, 'u_ModelMatrix');
	if (u_ModelMatrix == null){
		console.log('Failed to get the storage location of u_ModelMatrix');
		return;
	}
	
	var modelMatrixes = [];
	var modelMatrix1 = new Matrix4(); 
	modelMatrix1.setIdentity();
	modelMatrix1.translate(0.0, 0.5, 0.0);
	modelMatrixes.push(modelMatrix1);
	
	var modelMatrix2 = new Matrix4();
	modelMatrix2.setIdentity();
	modelMatrix2.translate(0.5, 0.0, 0.0);
	modelMatrixes.push(modelMatrix2);
	
	var modelMatrix3 = new Matrix4();
	modelMatrix3.setIdentity();
	modelMatrix3.translate(-0.5, 0.0, 0.0);
	modelMatrixes.push(modelMatrix3);
	
	//set the color to use on gl.clear
	gl.clearColor(0.0, 0.0, 0.0, 1.0);
	
	//flip the image on axis on load
	gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, 1);
	
	var lastTick1 = 0;
	var lastTick2 = 0;
	var lastTick3 = 0;
	var frameTick = 0;
	var tick = function(){
		var timeElapsed = getElapsed();
		lastTick1 = lastTick1 + timeElapsed;
		lastTick2 = lastTick2 + timeElapsed;
		lastTick3 = lastTick3 + timeElapsed;
		frameTick = frameTick + timeElapsed;
		
		//run this code when counter exceeds 30ms
		if(lastTick1 > 30){
			lastTick1 = 0;
			modelMatrix1.translate(-0.05, 0.0, 0.0);
			modelMatrix1.rotate(5, 0, 0, 1);
		}
		//run this code when counter exceeds 50ms
		if(lastTick2 > 50){
			lastTick2 = 0;
			modelMatrix2.rotate(3 * DIRECTION, 0, 0, 1);
		}
		//run this code when counter exceeds 100ms
		if(lastTick3 > 100){
			lastTick3 = 0;
			modelMatrix3.rotate(-1 * DIRECTION, 0, 0, 1);
		}
		//run this code when counter exceeds 20ms
		if(frameTick > 20){
			frameTick = 0;
			console.log('frame');
			
			//clear the color buffer
			gl.clear(gl.COLOR_BUFFER_BIT);
			
			//draw everything
			for(var matrixIndex = 0; matrixIndex < modelMatrixes.length; matrixIndex++){
				gl.uniformMatrix4fv(u_ModelMatrix, false, modelMatrixes[matrixIndex].elements);
				loadTexture(gl, texture, u_Sampler, images[matrixIndex], gl.TEXTURE0);
				gl.drawArrays(gl.TRIANGLE_STRIP, 0, n);
			}
		}
		requestAnimationFrame(tick);
	};
	tick();
}
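With the textures set up that way, the draw loop above would only need to switch the sampler index per quad instead of re-binding and re-uploading anything (again just a sketch, assuming the initTextures function from the earlier snippet ran during setup):

//draw everything, changing only the model matrix and the sampler index per quad
for(var matrixIndex = 0; matrixIndex < modelMatrixes.length; matrixIndex++){
	gl.uniformMatrix4fv(u_ModelMatrix, false, modelMatrixes[matrixIndex].elements);
	//the texture for quad i already lives on unit i, so just point the sampler at it
	gl.uniform1i(u_Sampler, matrixIndex);
	gl.drawArrays(gl.TRIANGLE_STRIP, 0, n);
}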

question on using parts of a single texture with multiple instances of the same geometry

26 January 2014 - 02:53 PM

What is the best way to use parts of a single texture with multiple instances of the same geometry?

 

I am using WebGL (opengl es).

 

I have this texture here...

 

smileys_128_64_zps3f7326d7.png

 

 

This is the vertex array buffer (simple quad) that I want to re-use...

//position, point size, colors, st interleaved
var vertData = new Float32Array([
	-0.25,  0.25,    4.0,    1.0, 0.0, 0.0,    0.0, 0.5,
	-0.25, -0.25,    4.0,    0.0, 1.0, 0.0,    0.0, 0.0,
	 0.25,  0.25,    4.0,    0.0, 0.0, 1.0,    0.5, 0.5,
	 0.25, -0.25,    4.0,    1.0, 1.0, 0.0,    0.5, 0.0,
]);

Those (s,t) coordinates pick out the bottom-left part of the image when it renders in WebGL. What I want to do is use this as sort of a 'texture atlas' (I think that's the buzzword) so I can use a single texture for 'n' (say >100) quads without having to repeat all of the vertex data each time I want to use a different part of the single texture. So what's the best way to do that? I can think of some solutions, but I'm kind of a novice so I don't know which is best to use. I could...

 

a) Change the shaders to pass in what part of the texture I want to use

b) Repeat the vertex buffer data four times (in this case) for each piece of the texture I want to use with different st coordinates.

c) Get rid of the 'texture atlas' and just use four (in this case) different textures.

 

How do most people solve this problem? I know it's sort of trivial with such simple geometry, but it seems like it could get out of hand fast...
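To make option (a) concrete, this is roughly what I have in mind: keep the one quad's vertex data and add a uniform offset that shifts the (s,t) coordinates to a different cell of the atlas on each draw. The u_TexOffset name is made up and none of this is tested:

//option (a) sketch: one quad, one atlas texture, a per-draw uniform picks the cell
var ATLAS_VS_SNIPPET =
	'attribute vec2 a_TexCoord;\n' +
	'uniform vec2 u_TexOffset;\n' +       // (0.0, 0.0) = bottom-left cell, (0.5, 0.5) = top-right cell
	'varying vec2 v_TexCoord;\n' +
	'void main() {\n' +
	'	v_TexCoord = a_TexCoord + u_TexOffset;\n' +
	'	// ...gl_Position and everything else as usual\n' +
	'}\n';

//per quad: pick the cell, then draw the same vertex buffer again
gl.uniform2f(u_TexOffset, 0.5, 0.0);      //bottom-right cell of the 2x2 atlas
gl.drawArrays(gl.TRIANGLE_STRIP, 0, n);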

