Sik_the_hedgehog

Game controls


So, I was thinking about something. Until not long ago, the only kinds of devices we would expect (most) games to be played with were a keyboard, a mouse or a controller. Of course, this meant that coding and design were relatively easy (especially since the mouse was unlikely to be used without a keyboard, so we could safely assume all controls could be button based).

That isn't true anymore, though. Now we have many new types of input devices: touch screens (both single touch and multitouch), accelerometers, motion sensing (both with a controller, like the Wii, and without one, like Kinect), and even some more exotic devices like Emotiv's headset (essentially a brainwave reader). I'm probably missing something else as well. And all of those devices are used in completely different ways. I suppose that from the coding side this won't be much of an issue (programmers will probably just isolate the input code into a separate module that can be reused in all games), but what about the design side?
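As a rough illustration of that isolation idea, here is a minimal sketch (in Python, with entirely made-up class and action names) of an input layer where each device backend translates raw input into a shared set of abstract actions, and game code only ever queries those actions:

```python
# Minimal input-abstraction sketch: game code polls abstract actions,
# never concrete devices. All names here are hypothetical.

class InputBackend:
    """Base class for device backends; each maps raw input to action names."""
    def poll(self):
        """Return the set of abstract actions currently active."""
        raise NotImplementedError

class KeyboardBackend(InputBackend):
    def __init__(self, bindings, pressed_keys):
        self.bindings = bindings          # e.g. {"space": "jump", "down": "crouch"}
        self.pressed_keys = pressed_keys  # injected here; a real game polls the OS
    def poll(self):
        return {self.bindings[k] for k in self.pressed_keys if k in self.bindings}

class HeadsetBackend(InputBackend):
    def __init__(self, bindings, detected_signals):
        self.bindings = bindings          # e.g. {"move_up": "jump", "shrink": "crouch"}
        self.detected_signals = detected_signals
    def poll(self):
        return {self.bindings[s] for s in self.detected_signals if s in self.bindings}

class InputManager:
    """Game code talks only to this; backends are interchangeable."""
    def __init__(self, backends):
        self.backends = backends
    def active_actions(self):
        actions = set()
        for backend in self.backends:
            actions |= backend.poll()
        return actions

kb = KeyboardBackend({"space": "jump", "down": "crouch"}, pressed_keys={"space"})
manager = InputManager([kb])
print(manager.active_actions())  # {'jump'}
```

The game loop then reasons only in terms of "jump" and "crouch", so swapping the keyboard for a headset is just a matter of constructing a different backend.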

I'm not even talking about trying to exploit the capabilities of the new devices, but rather about how to adapt games to work with all of them (sooner or later we'll need to do this). Take a 2D platformer, for example. Let's suppose that for jumping, we go with the spacebar for a keyboard and the "move up" action for a headset. For crouching, we go with the down arrow on the keyboard and "shrink" for a headset. So far it looks intuitive, right?

Now let's say we want to add a slide move by pressing crouch+jump. On a keyboard, that'd mean hold Down and press Space. Doesn't look wrong, does it? Now, on the headset, that'd mean shrink, then try to move up... Yeah, that doesn't look right.

How would you handle all this? Looks like the idea of customizable controls will have to go beyond the basic controls and we'll also need to provide customization of the advanced moves as well.
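For what it's worth, one way to make advanced moves customizable is to treat them as first-class actions: chord-capable devices bind a *combination* of basic actions to the advanced one, while single-signal devices (like the headset) bind it directly. A rough sketch in Python (all names are made up for illustration, and simultaneous presses stand in for the real hold-then-press timing):

```python
# Sketch: advanced moves as first-class actions. A chord binding maps a
# combination of basic actions to one advanced action; single-signal
# devices can instead bind the advanced action directly.

def resolve_actions(raw_actions, chord_bindings):
    """Expand chords: if every basic action in a chord is active, emit the
    advanced action and suppress the basic ones it was built from."""
    actions = set(raw_actions)
    for chord, advanced in chord_bindings.items():
        if set(chord) <= actions:
            actions -= set(chord)
            actions.add(advanced)
    return actions

# Keyboard-style: slide = crouch + jump held together.
keyboard_chords = {("crouch", "jump"): "slide"}
print(resolve_actions({"crouch", "jump"}, keyboard_chords))  # {'slide'}

# Headset-style: "slide" gets its own signal, so no chords are needed.
print(resolve_actions({"slide"}, {}))  # {'slide'}
```

Since the chord table is just data, each device (or the player) can supply its own: buttons keep the familiar down+jump, while the headset maps a single dedicated thought to the same "slide" action.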

The headset isn't appropriate for a 2D platformer. That's pretty much all there is to it.

The question isn't really "How can X be done on device Y?" but rather "Is doing X on device Y even a good idea?"

[quote name='Sik_the_hedgehog' timestamp='1302443987' post='4796669']
How would you handle all this? Looks like the idea of customizable controls will have to go beyond the basic controls and we'll also need to provide customization of the advanced moves as well.
[/quote]

To answer the question... I would not.

The new controllers that have come out recently need software specifically designed for them, and that's the way it should be. There's not much point in trying to give the player the same experience with the new devices; the new devices are really about giving the player NEW experiences. For example, when the PS1 light gun came out, the designers didn't try to recreate existing experiences for the controller; they used the controller to create new experiences.

[quote name='Sik_the_hedgehog' timestamp='1302443987' post='4796669']
I'm ... talking about ... how to adapt games to work with all of them (sooner or later we'll need to do this).[/quote]
Why? Why do we need to adapt old games, that aren't suited for new hardware, to that new hardware? Why not just create new games?

[quote name='MeshGearFox' timestamp='1302448137' post='4796698']The headset isn't appropriate for a 2D platformer. That's pretty much all there is to it.[/quote]
While I agree that a normal person would never think of playing a platformer with anything that isn't a button-based device, somebody with severe mobility impairments (e.g. somebody with quadriplegia) may...

[quote name='Tom Sloper' timestamp='1302449106' post='4796704'][quote name='Sik_the_hedgehog' timestamp='1302443987' post='4796669']
I'm ... talking about ... how to adapt games to work with all of them (sooner or later we'll need to do this).[/quote]
Why? Why do we need to adapt old games, that aren't suited for new hardware, to that new hardware? Why not just create new games?[/quote]
I never said we'd adapt old games, just new ones we make =P But as you can see in my above reply, accessibility plays a big role in all this (and yes, this is something we should care about), so we may want to provide the ability to play with as many devices as possible (maybe even through an external API if needed).

I think you would have to approach this on a per-controller and per-game basis.

In the case of the Emotiv headset, from this [url="http://www.youtube.com/watch?v=40L3SGmcPDQ"]video link[/url] it is apparent that you don't control things by moving your head or anything like that, but by literally associating thoughts with actions (which is so friggin' cool I want one :D ). So having to look up and down at the same time wouldn't be an issue.

Focussing on the platformer scenario, there is no reason why you couldn't control it using an Emotiv headset. And there is no reason why we shouldn't associate a separate thought with the sliding action. After all, up + down = slide doesn't actually make much sense.

I think a more interesting scenario for the Emotiv headset would be combinatorial gameplay like in Magicka, where you have to combine up to 8 separate elements together to make spells, with a massive number of possible combinations. With so many combinations you do not want to bind a separate thought to each combination. With keyboard and mouse you can conjure more than one element at once by pressing two keys at once, whereas the Emotiv headset only registers predefined thought associations, and only one at a time, obviously... So at first glance it seems that keyboard and mouse would be more effective. (Actually this is the case when comparing the keyboard to the controller too.) However, thinking about it a bit more, if the speed at which it can register thought associations is fast enough, then the difference in speed between the Emotiv headset and the controller would not be significant. In fact it might even provide an advantage by replacing the time required to execute an instinctive motor skill, such as pressing the right keys in the right order, with the time required to think a thought; i.e. in this case it might actually be faster and more reliable than keyboard and mouse.

EDIT: continuing the Magicka and Emotiv analysis, the tricky part comes when you try to take into account moving and conjuring elements at the same time... and turning at the same time as walking... I would have to see how the designers of the device would implement such control.

[quote name='forsandifs' timestamp='1302452723' post='4796722']In the case of the Emotiv headset, from this [url="http://www.youtube.com/watch?v=40L3SGmcPDQ"]video link[/url] it is apparent that you don't control things by moving your head or anything like that, but by literally associating thoughts with actions (which is so friggin' cool I want one :D ). So having to look up and down at the same time wouldn't be an issue.[/quote]
Except it doesn't work that way. It doesn't read your thoughts, but rather it tries to read the same kind of signals the brain would send to the muscles to move them. This works because the brain can treat the entity on screen as an extension of the body. This would mean that in the example I gave, it'd be equivalent to trying to move a part of your body up and down at the same time... you just can't (though no idea how it works if there isn't a physical impediment - maybe the brain would consider it like stretching the entity vertically?).

In any case, my point is that such an action doesn't make sense at all and would only bring confusion (and thereby unjustified frustration) to the player.

[quote name='forsandifs' timestamp='1302452723' post='4796722']Focussing on the platformer scenario there is no reason why you couldn't control it using an Emotiv headset. And there is no reason why we shouldn't associate a separate thought for the sliding action. After all up + down = slide doesn't actually make much sense.[/quote]
Which is what I was getting at in the first post. Actually, shrink + move left/right would make more sense. Come to think of it, for the specific case of the headset it's probably a better idea to just hardcode the moves, since there's only one combination that makes sense for a given move, really.

[quote name='forsandifs' timestamp='1302452723' post='4796722']In the case of the Emotive headset it only registers predefined thought associations and only one at a time obviously...[/quote]
I was under the impression the headset could read multiple signals at the same time (in fact this would be required to support diagonals in the direction-based actions, for example).

[quote name='Sik_the_hedgehog' timestamp='1302454638' post='4796727']Except it doesn't work that way. It doesn't read your thoughts, but rather it tries to read the same kind of signals the brain would send to the muscles to move them. This works because the brain can treat the entity on screen as an extension of the body. This would mean that in the example I gave, it'd be equivalent to trying to move a part of your body up and down at the same time... you just can't (though no idea how it works if there isn't a physical impediment - maybe the brain would consider it like stretching the entity vertically?).[/quote]

1. The way you calibrate the device is by thinking a thought, which means a certain brain pattern, so you emit certain signals which the device reads and then associates with an action. But if we skip the intermediate steps it essentially becomes thought = action. EDIT: So actually, in a way, it does read your thoughts/brain patterns, the ones it's been trained to recognise.

2. Like I said, there's no reason to associate "slide" with the thoughts/brain patterns associated with up and down.

[quote]In any case, my point is that such an action doesn't make sense at all and would only bring confusion (and thereby unjustified frustration) to the player.[/quote]

So why did you consider it? There are better ways to implement a platformer "slide" control with the Emotiv headset, such as assigning it its own thought.

[quote]I was under the impression the headset could read multiple signals at the same time (in fact this would be required to support diagonals in the direction-based actions, for example).[/quote]

EDIT: In the video I linked (at about 16 to 18 minutes in) Marvin seems to try to make the cube rotate and disappear at the same time but cannot, probably because combining "disappear" with "rotate" generated a different brain pattern (and thus different signals) which the device had not been trained to recognise yet. In other words, I don't think the device can dynamically interpret combinations of thoughts. On the other hand, he didn't have much time or freedom to experiment, and I would be interested to see an example of character movement control with the Emotiv headset, or rather character movement plus another action.

Maybe it will be a case of simply thinking the moves we want to make and associating them with character movements, and also thinking the moves plus actions we want to make and again associating those with the equivalent character actions. That's the way we learn to do multiple things at the same time in RL, at any rate. We learn to walk, then we learn to turn while we walk, then we learn to throw something while we walk, and all of these probably require different brain patterns.

[quote name='forsandifs' timestamp='1302456161' post='4796739'][quote]In any case, my point is that such an action doesn't make sense at all and would only bring confusion (and thereby unjustified frustration) to the player.[/quote]

So why did you consider it? There are better ways to implement a platformer "slide" control with the Emotiv headset, such as assigning it its own thought.[/quote]
Which was the whole point of my first post. A designer will probably be using a button-based device, will decide that controls should be mapped based on that, and will usually ignore how it'd work with other devices. Down+jump works fine if you're pressing buttons, but it doesn't work fine if you're using a different input system.

The idea of this topic is to find out a way to work around all this and be able to behave nicely with as many input devices as possible.
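One way to behave nicely across devices is to ship per-device default binding tables and layer user remapping on top, so both basic and advanced moves can be rebound for whatever hardware is present. A minimal sketch in Python (all device names, signals and actions are made up for illustration):

```python
# Sketch: per-device default bindings with user overrides layered on top.
# Device names, raw inputs and action names are illustrative only.

DEFAULTS = {
    "keyboard": {"space": "jump", "down": "crouch"},
    "headset":  {"move_up": "jump", "shrink": "crouch", "slide_thought": "slide"},
}

def effective_bindings(device, user_overrides=None):
    """Start from the device's defaults, then apply any user remapping."""
    bindings = dict(DEFAULTS.get(device, {}))
    bindings.update(user_overrides or {})
    return bindings

# The player rebinds jump to W; everything else keeps its default.
print(effective_bindings("keyboard", {"w": "jump"}))
# {'space': 'jump', 'down': 'crouch', 'w': 'jump'}
```

Note that the headset table binds "slide" directly to its own signal while the keyboard leaves it to a chord, which is exactly the per-device customization of advanced moves the first post asks about.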

[quote name='forsandifs' timestamp='1302456161' post='4796739'][quote]I was under the impression the headset could read multiple signals at the same time (in fact this would be required to support diagonals in the direction-based actions, for example).[/quote]

EDIT: In the video I linked (at about 16 to 18 minutes in) Marvin seems to try to make the cube rotate and disappear at the same time but cannot, probably because combining "disappear" with "rotate" generated a different brain pattern (and thus different signals) which the device had not been trained to recognise yet. In other words, I don't think the device can dynamically interpret combinations of thoughts. On the other hand, he didn't have much time or freedom to experiment, and I would be interested to see an example of character movement control with the Emotiv headset, or rather character movement plus another action.[/quote]
I think some of the actions actually aren't compatible with each other. I think you can combine move actions with one other type of action, but nothing else (in the case of sliding that would work, though). Then again, the last time I checked Emotiv was ages ago (while it was still under development), so things have probably changed since then.

[quote name='Sik_the_hedgehog' timestamp='1302454638' post='4796727']Except it doesn't work that way. It doesn't read your thoughts, but rather it tries to read the same kind of signals the brain would send to the muscles to move them. This works because the brain can treat the entity on screen as an extension of the body. This would mean that in the example I gave, it'd be equivalent to trying to move a part of your body up and down at the same time... you just can't (though no idea how it works if there isn't a physical impediment - maybe the brain would consider it like stretching the entity vertically?).[/quote]

Actually, technically speaking you CAN do just that.

Because muscles can only exert force when contracting, every motion has two opposing sets of muscles (as an example, the upper arm has the biceps, which bends the elbow, and the triceps, which straightens it). Opposing muscles are controlled independently - the brain can direct a muscle to contract with any available amount of force regardless of how hard it's directing the opposing muscle to contract. The body part won't move, but the muscles will nevertheless be tensed, and because of the muscle tension, the joint will be rigid instead of slack.

OK, who at Khronos was reading my mind?
http://www.gamasutra.com/view/news/34068/OpenGL_Overseer_Khronos_Creating_DeviceSensor_Input_Standard.php

Though I suppose that will only deal with the low-level details instead of the high-level ones (which is the real issue I was talking about). Still, I guess that's going to make things easier if we can program all devices through a single API, though I wonder how much support it will have.

