The Dev Journal

Dev Journal: Detecting Tapping and Pressing inputs using Java.

Posted 22 March 2014 · 1,777 views
I'll be showing you a simple way of detecting different kinds of keystrokes in Java. There are two kinds of keystroke on a computer keyboard: tapping and pressing. Tapping a key means the user presses the key and lets go immediately after. Pressing a key means the user presses and holds the key down until they wish to let go.

In Java, the fastest way of detecting key input is to write a class implementing the KeyListener interface and then add that listener to a Swing component. Fastest to set up, though not necessarily the best fit for every project.

The key to determining whether an input is a tap or a press is to use threads. I use a thread pool for easier thread handling. The code is shown below, and I'll explain how it works along the way.
public class NewInputHandler implements KeyListener {
	public Map<Key, Integer> mappings = new HashMap<Key, Integer>();
	private ExecutorService threadPool = Executors.newCachedThreadPool();
	private Keys keys;
First, we need to create a thread pool in order to manage threads easily. You can see that I used a hashmap; this holds the key code for each key. If the Key is, for example, the "A" key, the Integer value will contain the key code of A, which you can obtain via the KeyEvent's getKeyCode() method. I also created a new class, Keys, which holds the key states, "tapped" or "pressed". More on that later on.
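The Keys class itself isn't shown in this excerpt, so here is a minimal sketch of what Key and Keys might look like, inferred from the fields and names used in the listener code below (isTappedDown, isPressedDown, keyStateDown, keys.up, and so on); the exact structure is an assumption:

```java
//A sketch of the Key/Keys classes, inferred from how the listener uses them.
class Key {
	//Volatile, since both the event thread and the worker thread touch these flags.
	public volatile boolean isTappedDown = false;
	public volatile boolean isPressedDown = false;
	public volatile boolean keyStateDown = false;
}

class Keys {
	public final Key up = new Key();
	public final Key down = new Key();
	public final Key left = new Key();
	public final Key right = new Key();
	public final Key W = new Key();
	public final Key S = new Key();
	public final Key A = new Key();
	public final Key D = new Key();
}
```

Each Key is a small bundle of state flags; the game loop can poll them every frame without caring how the listener updates them.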
	public NewInputHandler(Keys keys) {
		this.keys = keys;
		mappings.put(keys.up, KeyEvent.VK_UP);
		mappings.put(keys.down, KeyEvent.VK_DOWN);
		mappings.put(keys.left, KeyEvent.VK_LEFT);
		mappings.put(keys.right, KeyEvent.VK_RIGHT);
		mappings.put(keys.W, KeyEvent.VK_W);
		mappings.put(keys.S, KeyEvent.VK_S);
		mappings.put(keys.A, KeyEvent.VK_A);
		mappings.put(keys.D, KeyEvent.VK_D);
	}
In the code above, I pass in the Keys object and put all the available key controls into the hashmap. This part is fairly self-explanatory: the hashmap now contains the key code for each corresponding key.
	public void keyPressed(KeyEvent event) {
		for (Key v : mappings.keySet()) {
			if (mappings.get(v) == event.getKeyCode()) {
				if (!v.keyStateDown) {
					final Key key = v;
					key.isTappedDown = true;
					key.isPressedDown = false;
					key.keyStateDown = true;
					this.threadPool.execute(new Runnable() {
						public void run() {
							try {
								Thread.sleep(100);
							} catch (InterruptedException e) {
								e.printStackTrace();
							}
							if (key.keyStateDown) {
								key.isPressedDown = true;
								key.isTappedDown = false;
							}
						}
					});
				}
			}
		}
	}
Now, this is one half of the core of detecting taps and presses. By walking through the hashmap and finding the key that was pressed, we can be sure we edit the right key's state. After that, we hand the thread pool a worker that determines whether the user is actually tapping the key or pressing it. I let it sleep for 100 milliseconds, as that is enough time to tell tapping and pressing apart: if the key is still held down when the worker wakes up, it is a press rather than a tap. Finally, we edit the properties of the key that was selected and active.
	public void keyReleased(KeyEvent event) {
		for (Key k : mappings.keySet()) {
			if (mappings.get(k) == event.getKeyCode()) {
				k.isPressedDown = false;
				k.isTappedDown = false;
				k.keyStateDown = false;
			}
		}
	}

	public void keyTyped(KeyEvent arg0) {
		//Ignored. Used for sending a Unicode character mapped as a system input.
	}
}
This is the other half of the core. When a key has been released, we have to mark all of its state flags false, in order to prevent input overlapping issues.
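To see the 100-millisecond timing logic in isolation, here is a self-contained sketch of the same idea without Swing. The class and method names are illustrative, not from the original: press() mirrors keyPressed() and release() mirrors keyReleased(), with a simulated key instead of real keyboard events.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class TapPressDemo {
	static class Key {
		volatile boolean isTappedDown, isPressedDown, keyStateDown;
	}

	static final ExecutorService pool = Executors.newCachedThreadPool();

	//Same idea as keyPressed(): mark the key as a tap immediately, then let a
	//worker reclassify it as a press if it is still held down after 100 ms.
	static void press(final Key key) {
		key.isTappedDown = true;
		key.isPressedDown = false;
		key.keyStateDown = true;
		pool.execute(new Runnable() {
			public void run() {
				try {
					Thread.sleep(100);
				} catch (InterruptedException e) {
					return;
				}
				if (key.keyStateDown) {
					key.isPressedDown = true;
					key.isTappedDown = false;
				}
			}
		});
	}

	//Same idea as keyReleased(): clear every flag.
	static void release(Key key) {
		key.isTappedDown = false;
		key.isPressedDown = false;
		key.keyStateDown = false;
	}
}
```

Pressing and releasing within the 100 ms window leaves only the tap flag set while the key is down; holding past 100 ms flips the key over to the pressed state.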

So now, when you start the game, quickly tapping a key or holding it down will be reported as a tap or a press respectively. That is it for today.

Dev Journal: 2D sprite animation using OpenGL ES 2.0 for Android

Posted 21 March 2014 · 12,767 views

Chapter 7 - Making heavy use of Android

You might notice that this journal entry reads a bit differently from my other entries. I'm writing it in Chrome for Android, which means I lack the WYSIWYG editor most people use. So there won't be anything to show, not even images or screenshots.

I'm currently using AIDE, also called Android IDE. It's an integrated development environment suited to Android programming on the go. I've been using it since the Christmas holidays (when it was 50% off), so I have a pretty solid grasp of it and have become an Android power user. What you probably won't believe is that I'm programming Android apps on an HTC Desire S, especially at times when I have to lie down in bed and program away at night.

I'll hurry this along and move on to the most important part of this entry; there isn't much else I want to share at the moment.


Chapter 8 - 2D sprite animation using OpenGL ES 2.0 on Android

In this entry, I will be showing you how to do 2D sprite animation using OpenGL ES 2.0 for Android. I use a bit of GLSL to manipulate the textures and display the effect of sprites animating on the screen.

Note that in this entry, I jump around often. It's done so I can explain each step when it is needed (a "demand-first" approach). If you get confused, please leave a comment explaining what you couldn't understand, and I'll try to modify the post for you at a later time.

The first thing you want to do is create a spritesheet. How you lay out your sprites is up to you, but you need to stay consistent across the many sprites you'll probably create later in your project.

For me, I will be using a 4x4 spritesheet layout, where a sprite character has 4 directions (NORTH, SOUTH, EAST, and WEST), with 4 frames of animation per direction. Each sprite frame is 16x16 pixels, so the entire spritesheet is 64x64.

[Attached image: the 4x4 spritesheet.]
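Given this layout (four 16x16 frames per row, one row per direction), the pixel rectangle of any frame can be computed from the direction (row) and frame index (column). A small sketch; the row order NORTH, SOUTH, EAST, WEST is an assumption, and the names are illustrative:

```java
class SpriteSheetLayout {
	static final int FRAME_SIZE = 16;	//Each sprite frame is 16x16 pixels.
	static final int FRAMES_PER_ROW = 4;

	//Returns {x, y, width, height} of the given frame in pixels.
	//direction is the row (0 = NORTH, 1 = SOUTH, 2 = EAST, 3 = WEST, by assumption),
	//frame is the column (0..3).
	static int[] frameRect(int direction, int frame) {
		int x = (frame % FRAMES_PER_ROW) * FRAME_SIZE;
		int y = direction * FRAME_SIZE;
		return new int[]{x, y, FRAME_SIZE, FRAME_SIZE};
	}
}
```

Keeping the layout arithmetic in one place like this makes it painless to swap in larger frames or more directions later.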

For demo purposes, I will use a simple sprite character named "Joe", from the Pokémon series (2nd generation, to be exact. Joe actually exists!). You can use any sprite character you like, regardless of size and shape.

This spritesheet should be placed in the assets folder of your Android project. You can place it in your res folder instead, but for the most part I'm going to stick with the assets convention. You can separate your sprite characters into subfolders within your assets folder, or in your res folder (it has to go in /res/drawable-nodpi if you really do go the res route).

The next thing you want to do is to load the bitmap.
public class Art {
	public static Bitmap joe;

	public static void loadAllBitmaps(Activity activity){
		final AssetManager mgr = activity.getAssets();
		joe = load(mgr, "player/player.png");
	}

	private static Bitmap load(final AssetManager manager, String filename){
		Bitmap result = null;
		try {
			result = BitmapFactory.decodeStream(manager.open(filename));
		} catch (IOException e) {
			Log.e("Art", "Error loading art resource.", e);
		}
		return result;
	}
}
To initiate the loading, call the loadAllBitmaps() method in the constructor of a GLSurfaceView subclass (my preferred approach). You load most of your assets this way by passing the main activity to your subclass: it not only controls OpenGL context creation, but also lets you control how your assets are loaded from one method.
//In the main activity's onCreate(Bundle b) method.
ActivityManager mgr = (ActivityManager) this.getSystemService(Activity.ACTIVITY_SERVICE);
ConfigurationInfo info = mgr.getDeviceConfigurationInfo();
if (info.reqGlEsVersion >= 0x20000){
	renderView = new RenderView(this);
}

//In the RenderView class. It's a GLSurfaceView subclass.
public RenderView(MainActivity activity) {
	super(activity);				//GLSurfaceView requires the Context.
	this.activity = activity;
	Art.loadAllBitmaps(activity);			//<-----   This is where you load your sprites.
	//... Of course there is more code than this.
}
Once the sprite has been loaded, the next step is to load it in as a texture from a Bitmap.
	public static int loadTexture(Bitmap bitmap, int copy) {
		//This is to check for possible reinitialization of this object.
		//We want to prevent this from happening, otherwise there will be
		//unwanted black textures in your app.
		if (bitmap.isRecycled()) {
			Log.e("Entity", "Bitmap is already recycled. Detected too soon.");
			bitmap = Art.copy(Art.joe);
		}

		//Texture loading.
		//This is the part where most of the time, new OpenGL programmers would want
		//to search for it on Google. I'm putting it here for future reference.
		//** Of course this is for Android. **
		final int[] textureID = new int[1];
		GLES20.glGenTextures(1, textureID, 0);
		GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureID[0]);
		//Without filter parameters, the texture is mipmap-incomplete and renders
		//black, since no mipmaps are generated here.
		GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
		GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
		GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
		GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);

		//This is to make sure the bitmap is recycled, and then pass the texture ID handle
		//to somewhere else where it is safe from being garbage collected.
		//For the time being, I placed it in to let readers know of this.
		//The exact code itself is useless at the moment.
		copy = textureID[0];
		return textureID[0];
	}
After converting the bitmap to a texture object that OpenGL ES can understand, we need to set up the shader code as well as the shader initialization code.
public void loadShaderLocations() {
	this.uMatrixLocation = GLES20.glGetUniformLocation(Shader.BaseProgram, Shader.U_MATRIX);
	this.uTextureUnitLocation = GLES20.glGetUniformLocation(Shader.BaseProgram, Shader.U_TEXTURE_UNIT);
	this.aPositionLocation = GLES20.glGetAttribLocation(Shader.BaseProgram, Shader.A_POSITION);
	this.aTexturePositionLocation = GLES20.glGetAttribLocation(Shader.BaseProgram, Shader.A_TEXTURE_POSITION);
	this.uTextureMatrixLocation = GLES20.glGetUniformLocation(Shader.BaseProgram, Shader.U_TEXTURE_MATRIX);
}

public void setVertexAttributePointers() {
	//The attribute arrays must be enabled, or the pointers are ignored.
	GLES20.glEnableVertexAttribArray(aPositionLocation);
	GLES20.glVertexAttribPointer(aPositionLocation, 3, GLES20.GL_FLOAT, false, 3 * 4, vertexFloatBuffer);
	GLES20.glEnableVertexAttribArray(aTexturePositionLocation);
	GLES20.glVertexAttribPointer(aTexturePositionLocation, 2, GLES20.GL_FLOAT, false, 2 * 4, textureCoordinatesFloatBuffer);
}

public void setUniforms(float[] modelMatrix, float[] textureMatrix) {
	GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, modelMatrix, 0);
	GLES20.glUniformMatrix4fv(uTextureMatrixLocation, 1, false, textureMatrix, 0);
	GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
	GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureID);
	GLES20.glUniform1i(uTextureUnitLocation, 0);
}

public void loadFloatBuffers() {
	this.vertexFloatBuffer = ByteBuffer.allocateDirect(4 * 3 * 6).order(ByteOrder.nativeOrder()).asFloatBuffer();
	this.vertexFloatBuffer.put(new float[]{
		0f, 0f, 1f,
		0f, 0.5f, 1f,
		0.25f, 0f, 1f,
		0.25f, 0f, 1f,
		0f, 0.5f, 1f,
		0.25f, 0.5f, 1f
	});
	//Graphics Ratio:
	//Width:Height = 1:2
	this.vertexFloatBuffer.position(0);	//Rewind, so drawing reads from the start.

	this.textureCoordinatesFloatBuffer = ByteBuffer.allocateDirect(4 * 2 * 6).order(ByteOrder.nativeOrder()).asFloatBuffer();
	this.textureCoordinatesFloatBuffer.put(new float[]{
		0f, 0.25f,
		0f, 0f,
		0.25f, 0.25f,
		0.25f, 0.25f,
		0f, 0f,
		0.25f, 0f
	});
	this.textureCoordinatesFloatBuffer.position(0);
}
For the shaders, we want to pass in a model matrix and a texture matrix. The model matrix determines where the entity is drawn. The texture matrix determines what portion of the spritesheet texture is drawn onto the mesh defined in the vertex buffer object. We pass these matrices in as uniforms, since they stay constant across the per-vertex and per-fragment processing of a draw call.
//Vertex shader code:
attribute vec4 a_position;
attribute vec2 a_texture_position;
varying vec2 v_texture_position;
uniform mat4 u_matrix;
uniform mat4 u_texture_matrix;

void main(){
	v_texture_position = (u_texture_matrix * vec4(a_texture_position, 0.0, 1.0)).xy;
	gl_Position = u_matrix * a_position;
}

//Fragment shader code:
precision mediump float;
varying vec2 v_texture_position;
uniform sampler2D u_texture_unit;
void main(){
	gl_FragColor = texture2D(u_texture_unit, v_texture_position);
}
To be honest, the texture matrix calculation could be placed in either shader. I placed it in the vertex shader because the fragment shader executes far more often (once per fragment rather than once per vertex), so doing the multiplication per-vertex is cheaper. We need the translated UV coordinates for the fragment shader to sample from; I obtain the translated U and V coordinates by the matrix multiplication in the code above.
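What that vertex shader line does can be reproduced on the CPU with a plain column-major 4x4 multiply (the layout android.opengl.Matrix also uses). This standalone sketch, with illustrative names, translates a base UV by a frame offset exactly as the shader does:

```java
class TextureMatrixDemo {
	//Multiplies a column-major 4x4 matrix by a (u, v, 0, 1) vector and returns
	//the translated (u, v) pair -- the same math as
	//(u_texture_matrix * vec4(a_texture_position, 0.0, 1.0)).xy in the shader.
	static float[] transformUV(float[] m, float u, float v) {
		float outU = m[0] * u + m[4] * v + m[12];
		float outV = m[1] * u + m[5] * v + m[13];
		return new float[]{outU, outV};
	}

	//Builds a column-major identity matrix with a (tu, tv) translation, like
	//Matrix.setIdentityM() followed by Matrix.translateM().
	static float[] translationMatrix(float tu, float tv) {
		float[] m = new float[16];
		m[0] = m[5] = m[10] = m[15] = 1f;
		m[12] = tu;
		m[13] = tv;
		return m;
	}
}
```

Translating by one frame (0.25 in V) shifts every sampled coordinate down one row of the spritesheet, which is all the sprite animation amounts to.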

To determine the texture UV coordinates within a spritesheet, we start from the UV origin of the texture, located at its top-left corner. Each side is worth 1.0, or 100%, of the texture's width or height. Half the width is 50%, or 0.5. A quarter of the height is worth 25%, or 0.25, of the original height.
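In these normalized units, each frame of the 4x4 sheet occupies a 0.25 x 0.25 UV footprint. The conversion from pixels to UVs is just a division; a quick sketch with an illustrative helper name:

```java
class UVDemo {
	//Converts a pixel offset within a square spritesheet to a normalized UV
	//coordinate: 16 pixels in a 64-pixel texture is 0.25, half the width is 0.5.
	static float toUV(int pixels, int textureSize) {
		return pixels / (float) textureSize;
	}
}
```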

To keep track of our texture UVs, I use 2 float variables (or 1 float array of size 2) to mark the position.
//In the sprite class object, I used a float array to keep its texture position.
protected float[] texture_uv_coordinates;
//I also used a float array to keep its vertex position.
protected float[] xy_coordinates;
I pass the texture matrix and the model matrix through to the sprite class object, and then pass both of the matrices to the shader, as mentioned earlier.
public void setUniforms(float[] modelMatrix, float[] textureMatrix) {
	GLES20.glUniformMatrix4fv(uMatrixLocation, 1, false, modelMatrix, 0);
	GLES20.glUniformMatrix4fv(uTextureMatrixLocation, 1, false, textureMatrix, 0);
	//...
}
Then I update both matrices.
public void tick(float[] modelMatrix, float[] textureMatrix) {
	//Input update code
	Matrix.setIdentityM(modelMatrix, 0);
	Matrix.translateM(modelMatrix, 0, this.xy_coordinates[0], this.xy_coordinates[1], 0f);
	if (RenderView.input.pressed){
		this.xy_coordinates[1] -= 0.001f;
	}

	//Frame animation update code
	tickCounter--;		//Count down once per tick.
	if (tickCounter < 0) {
		texture_uv_coordinates[1] += 0.25f;
		if (texture_uv_coordinates[1] >= 1f) {
			texture_uv_coordinates[1] = 0f;
		}
		tickCounter = 60;
	}
	Matrix.setIdentityM(textureMatrix, 0);
	Matrix.translateM(textureMatrix, 0, 0f, texture_uv_coordinates[1], 0f);
}
Setting aside the code I haven't mentioned, there is a tickCounter used to slow the animation down. It counts down every tick; once it expires, we advance the texture UV coordinates by the frame offset we decided on when creating the sprites, reset the tickCounter, and repeat.
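The counter logic can be isolated into a tiny self-contained sketch: count down once per tick, and on expiry advance the V coordinate by one frame (0.25) with wrap-around. The class and field names here are illustrative:

```java
class FrameAnimator {
	int tickCounter = 60;
	float v = 0f;	//Vertical texture offset of the current animation frame.

	//Called once per game tick; advances the animation one frame every 60 ticks.
	void tick() {
		tickCounter--;
		if (tickCounter < 0) {
			v += 0.25f;		//Move down one 0.25-high frame row.
			if (v >= 1f) {
				v = 0f;		//Wrap back to the first frame.
			}
			tickCounter = 60;
		}
	}
}
```

At roughly 60 ticks per second this advances one frame per second; lowering the reset value speeds the animation up.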

After updating the texture UV coordinates, we pass it to our shader to let it manipulate the texture drawing itself. This is the core of the sprite animation.

The input code mentioned above handles touch events. When the user performs a touch event, in this case a single tap on the screen, the sprite character moves down by 0.001, or 0.1% of the entire height of the viewport.

Once you've cleaned up your project and wired in the vertex and fragment shader code above, you can start creating your own sprite animations, or go a step further and improve how the animations work.

Again, if you have any questions, please leave them in the comments below, and I'll check back to try to help you out.