Is coding games for touch screens difficult?

8 comments, last by Crichton333 9 years, 11 months ago

I'm new to programming for touch screens, but I was wondering: how hard is it? I'm guessing it depends on the language and the OS that provides the touch screen and so on, but in general, is programming for touch screens something that takes long to master? What do you have to learn? What does the average programmer already know that carries over to touch screen programming? Thank you :D


In general, programming for touchscreens isn't hard. It works much like mouse input. So if you can program for keyboard/mouse, you can probably handle touchscreen.

I am familiar with Unity2D, and from that perspective, it's not very different at all. There are some nuances you have to be aware of, like counting touches for example. However, there are plenty of great examples and tutorials on how to do it and it's really not that difficult.

Please check out my website for more information on my available apps: http://www.cb-indev.com/apps.html

PopQuiz!, free trial on Windows Store: http://apps.microsoft.com/windows/app/popquiz/c3e50a2f-bafb-4067-9ad0-cbaaaaa99757

PopQuiz!, on Google Play: https://play.google.com/store/apps/details?id=com.CBInDev.PopQuiz

Debouncer Demo on Google Play: https://play.google.com/store/apps/details?id=com.CB_InDev.Debouncer_Demo

Gestures can get a little weird, but most touchscreen interfaces expose stuff pretty similar to mouse events, so it tends to be straightforward.

The main difference is when you have to handle multi-touch gestures (especially multiple simultaneous multi-touch gestures).

This can be handled by treating each gesture as a kind of state machine. When you detect a touch, you don't yet know what the gesture will be. If the touch is immediately released, it was a tap. If it moves, it might be a scroll gesture. If another touch appears, it might be a zoom or swipe gesture. It also depends on context: if you are tracking multiple gestures, you might want to add a new touch to the nearest one, depending on its state, but if it is far away you might want to treat it as a separate gesture. Figuring out all those state machines can be a non-trivial exercise, and it is highly dependent on the exact gestures you need for your game.
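A tiny TypeScript sketch of that state-machine idea. The states and the movement threshold are made up for illustration; the real set of states depends on the gestures your game needs:

```typescript
type GestureState = "possible" | "tap" | "scroll" | "pinch";

class GestureRecognizer {
  private state: GestureState = "possible";
  private startX = 0;
  private startY = 0;
  private touchCount = 0;
  private readonly moveThreshold = 10; // pixels of travel before a tap becomes a scroll

  touchDown(x: number, y: number): void {
    this.touchCount++;
    if (this.touchCount === 1) {
      this.startX = x;
      this.startY = y;
      this.state = "possible"; // we don't yet know what this gesture is
    } else if (this.touchCount === 2) {
      this.state = "pinch"; // a second finger turns it into a zoom gesture
    }
  }

  touchMove(x: number, y: number): void {
    if (this.state === "possible") {
      const dist = Math.hypot(x - this.startX, y - this.startY);
      if (dist > this.moveThreshold) this.state = "scroll";
    }
  }

  // Returns what the gesture turned out to be when a finger lifts.
  touchUp(): GestureState {
    this.touchCount = Math.max(0, this.touchCount - 1);
    // Released without moving or adding a finger: it was a tap.
    if (this.state === "possible") this.state = "tap";
    return this.state;
  }
}
```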

openwar - the real-time tactical war-game platform

I think the really difficult part is designing for the touch screen. You have to put your hand on the screen, covering part of it. You have to think about usability. A mobile game has to do more with less input, imo, compared to a PC one.


This. Coding games for touch screens is easy: Apple, Google, and Microsoft all provide very simple, easy-to-use APIs, and third-party engines usually provide even simpler ones. However, designing a good touch screen gaming experience is not a simple task, and few games have managed it.

Often games designed around touch will have gestures. Slice through this, slide across this path, drag through the middle of that. Games designed around a mouse will have clicks and drags, where the endpoints are the critical factor.

In most respects, it is just like dealing with a pattern of points. A mouse is more about the position at the moment of click, or the moment of release: "down" and "up" positions are important, but the middle is not. With touch devices, the location of "down" and "up" are less important, usually more important are the points in between.

Since matching a gesture and deciding whether it came close enough to a target involves slightly more processing than just testing the coordinates of a down/up/click event, it is consequently slightly more complex.
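For example, deciding whether a stroke came "close enough" usually means checking the whole path of points against a target, not just the endpoints. A hedged sketch using standard point-to-segment distance; the function names are my own:

```typescript
type Pt = { x: number; y: number };

// Distance from point p to the segment a-b.
function distToSegment(p: Pt, a: Pt, b: Pt): number {
  const dx = b.x - a.x, dy = b.y - a.y;
  const len2 = dx * dx + dy * dy;
  // Degenerate segment: fall back to plain point distance.
  if (len2 === 0) return Math.hypot(p.x - a.x, p.y - a.y);
  // Project p onto the segment, clamped to its endpoints.
  let t = ((p.x - a.x) * dx + (p.y - a.y) * dy) / len2;
  t = Math.max(0, Math.min(1, t));
  return Math.hypot(p.x - (a.x + t * dx), p.y - (a.y + t * dy));
}

// Did the stroke (all the points between "down" and "up") pass
// within `radius` of the target at any point along the way?
function strokeHitsTarget(stroke: Pt[], target: Pt, radius: number): boolean {
  if (stroke.length === 1) {
    return Math.hypot(stroke[0].x - target.x, stroke[0].y - target.y) <= radius;
  }
  for (let i = 0; i + 1 < stroke.length; i++) {
    if (distToSegment(target, stroke[i], stroke[i + 1]) <= radius) return true;
  }
  return false;
}
```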

I spent a couple years integrating touchscreen and multi-touch gestures into software. There are a few aspects to consider.

(1) Most people expect single touches to act much like mouse events. Touch down/up are like button down/up and in between are like drag events. If that's all you want, your toolkit should take care of it. The main difference is the target: people expect to be able to drag window pane content using a touch in a way they don't with a mouse.

(2) Single-touch complex ("stroke") gestures. You can handle those yourself (using the above), or use a third-party gesture-recognition library. Note that this sort of thing generally lacks discoverability, but I guess it depends on your game design.

(3) Multi-touch gestures. If you're smart you will use your toolkit to handle these and simply receive gesture begin/update/end events like you do with any other input. If you're ambitious, you can write your own gesture recognizer to integrate individual touch events over the time and space domains and worry about the combinatorics yourself.
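A small sketch of what the begin/update shape can look like for a pinch gesture; the class and its API are illustrative, not any toolkit's actual interface:

```typescript
type Vec2 = { x: number; y: number };

class PinchGesture {
  private startDist = 0;

  // Gesture begin: remember the initial spacing between the two fingers.
  begin(a: Vec2, b: Vec2): void {
    this.startDist = Math.hypot(b.x - a.x, b.y - a.y);
  }

  // Gesture update: current zoom factor relative to where the pinch began.
  update(a: Vec2, b: Vec2): number {
    const d = Math.hypot(b.x - a.x, b.y - a.y);
    return this.startDist === 0 ? 1 : d / this.startDist;
  }
}
```

A toolkit-provided recognizer typically hands you this zoom factor directly, which is exactly why it is the easier route.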

There are some cool non-gesture multi-touch things you can do with touchscreens. Imagine a version of Pong where the paddle appears where you put your fingers, and three fingers can make a curved paddle for trick shots.
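For the curved-paddle idea, one way to turn three finger positions into an arc is the circumcircle through the three points; this is a geometric sketch, not from any actual Pong implementation:

```typescript
type Finger = { x: number; y: number };

// Circle through three non-collinear points (the circumcircle), using
// the standard determinant formula; returns null for collinear fingers,
// where a straight paddle would be the sensible fallback.
function circleThroughFingers(a: Finger, b: Finger, c: Finger):
    { cx: number; cy: number; r: number } | null {
  const d = 2 * (a.x * (b.y - c.y) + b.x * (c.y - a.y) + c.x * (a.y - b.y));
  if (Math.abs(d) < 1e-9) return null; // collinear: no unique arc
  const a2 = a.x * a.x + a.y * a.y;
  const b2 = b.x * b.x + b.y * b.y;
  const c2 = c.x * c.x + c.y * c.y;
  const cx = (a2 * (b.y - c.y) + b2 * (c.y - a.y) + c2 * (a.y - b.y)) / d;
  const cy = (a2 * (c.x - b.x) + b2 * (a.x - c.x) + c2 * (b.x - a.x)) / d;
  return { cx, cy, r: Math.hypot(a.x - cx, a.y - cy) };
}
```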

Stephen M. Webb
Professional Free Software Developer

Also take into account that tapping and gesturing can cramp your hand if you have to combine different actions in-game.

"Smoke me a kipper i'll be back for breakfast." -- iOS: Science Fiction Quiz

This topic is closed to new replies.
