Eye tracking for cursor in rpg

Started by
41 comments, last by alh420 8 years, 1 month ago
Using eye tracking for the cursor can make games very dynamic and exciting. Players can react quickly without the hassle of pointing the cursor with a mouse. This little innovation could take next-gen games to a new level. It has tremendous potential to create very dynamic gameplay (e.g. RTS, RPG).

P.S.
I'm just new here, [deleted by moderator]. I am very good at brainstorming and concept design.

I don't know... I have the feeling that you need to be very careful what you use eye tracking for.

Fact 1: Eyes are seldom at rest. They are also hard to control 100%. If you concentrate on something but get distracted, your eyes often jump to something else for a second to check it out without you thinking about it. Hands don't do that (as much).

Your player now needs to keep his eyes concentrated on the action even more than before, which is very tiring.

Fact 2: The mouse evolved over many years... it is designed the way it currently is for a reason. We started off with a "mouse" you needed to move to point to where you wanted to click, and then press a button on the keyboard. Having both functions in a single device made it so much more intuitive.

Unless you click with a blink, using eye tracking to control the mouse cursor would be a step back to a less intuitive solution.

Fact 3: Accurate control of a cursor takes a lot of training. Additionally, with a good and accurate mouse, moving the cursor precisely is already pretty quick. How exactly would eye tracking improve selection speed in games?

Eye tracking used for anything other than an auxiliary tool to improve other systems (moving the view in cockpit games or shooters, moving the screen in RTS games, improving the VR experience) will most probably NOT improve gameplay. It will feel off and distracting to most people.

The same could be said about a direct brain-computer interface... as cool as it sounds to "think" your commands without having to move a finger, that is not how the real world works... thus it will feel quite off for most people.

And that is again before going into how intrusive it will be for the experience, or how tiring it will be to constantly focus your brain on only thinking about the commands you want to give, and nothing else.

VR might be the next gen of games... might. It depends on how it develops from the humble first-gen devices that should come out this year. Other than that, traditional games played in 4k with M+KB or gamepads sound like pretty much the next gen until VR really takes off.

Welcome to the Forum.

If you are looking to team up, use the classifieds section of the forum (in the title bar menu, second item from right).

Also, you would do well to bring more to the table than brainstorming and ideas... people love to team up with good programmers or artists. If you are really able to prove your worth in anything else, some might still consider you. If you are able to whip out good audio effects and music, or design levels in an engine, or anything else that helps the team move their project forward, that is also cool.

Just don't be the "idea guy" that eats up a lot of time from the other team members with his ideas, but never contributes anything tangible to the project... nobody wants somebody like that on his or her project.

Of course, if you meant by "design concept" that you not only have ideas about the game design, but also put these on paper, work with the programmer and artists to make sure everything fits the game design, work in the engine editor to test your design ideas and flesh them out, and balance everything in the end, now that IS a valuable skill.

It's just harder to sell than programming or art. Everyone thinks he or she is a game designer. Nobody thinks they are an artist if they cannot draw to save their life, or a programmer without having any idea what a programming language is.

Eh, I think eye tracking could be handy, but it would also need to be really hammered on to get the kinks out. I'd love to have windows give focus to whatever I'm looking at. I do hate it when I start typing only to realize that I don't have focus on the window I'm looking at. That said, there are moments where I do want to keep focus on a window while I look at something else, and glances elsewhere shouldn't toss focus. Some of that is just heuristics: don't change focus unless the eye comes to rest on a position/window/rectangle for longer than X amount of time. Other times we might need a method to lock focus on something... the double blink, perhaps =)

Either way, some training of users would have to happen.
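
That dwell heuristic is simple enough to sketch out. Here's a rough Python mock-up of the idea, just to make it concrete; the window rectangles, the focus_window() callback, and the 0.5 s threshold are all made up for illustration, not any real OS or tracker API:

```python
# Minimal sketch of a dwell-time heuristic for gaze-driven window focus.
# Gaze samples, window rectangles and focus_window() are hypothetical stand-ins
# for whatever the real tracker / OS would provide.

import time

DWELL_SECONDS = 0.5  # gaze must rest on a window this long before focus changes


class GazeFocusSwitcher:
    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell_seconds = dwell_seconds
        self.candidate = None       # window the gaze is currently resting on
        self.candidate_since = 0.0  # when the gaze first entered that window
        self.switched = False       # already switched focus to the candidate?
        self.locked = False         # "double blink" style focus lock

    def on_gaze_sample(self, x, y, windows, focus_window, now=None):
        """windows: iterable of (window_id, left, top, right, bottom) rectangles.
        focus_window: callback that actually switches focus."""
        if self.locked:
            return
        now = time.monotonic() if now is None else now
        hit = next((w_id for w_id, l, t, r, b in windows
                    if l <= x <= r and t <= y <= b), None)
        if hit != self.candidate:
            # Gaze moved somewhere new (keyboard, out the window, a popup...):
            # restart the dwell timer instead of switching immediately.
            self.candidate, self.candidate_since, self.switched = hit, now, False
            return
        if (hit is not None and not self.switched
                and now - self.candidate_since >= self.dwell_seconds):
            focus_window(hit)
            self.switched = True

    def toggle_lock(self):
        # Hook for a "lock focus" gesture -- the double blink, a hotkey, whatever.
        self.locked = not self.locked
```

The toggle_lock() hook is where the "keep focus here while I glance away" case would plug in.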


As one of those programmers whose typing skills fail them from time to time and who has to look down, I would certainly not like that too much.

Quick glance at the keyboard just to reassure yourself your fingers are resting on the right keys -> your window just lost focus and you're texting your super-secret code to a stranger who happened to pop up on your Skype because he/she wanted to be added...

I could see it working in special cases. But it would need a lot of smart software behind it to drop all the jerkiness of eye movement, and ignore lots of random stuff like looking out of the window, looking down at the keyboard, being distracted by an e-mail popup and whatnot. Else it would hamper productivity a lot.
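
One common way to "drop the jerkiness" is a dispersion-threshold fixation filter: only treat the gaze as resting somewhere once it has stayed inside a small box for a minimum time, and ignore everything else as saccades. A minimal Python sketch, with the pixel and time thresholds picked purely for illustration:

```python
# Rough sketch of a dispersion-threshold fixation filter (the classic "I-DT" idea):
# only report a gaze position once the eye has stayed within a small box for a
# minimum duration. Threshold values below are illustrative, not tuned.

def detect_fixation(samples, max_dispersion_px=40, min_duration_s=0.15):
    """samples: list of (timestamp_s, x, y) gaze points, oldest first.
    Returns (cx, cy) of the latest fixation, or None while the eye is still moving."""
    if not samples:
        return None
    window = []
    for t, x, y in reversed(samples):          # walk back from the newest sample
        window.append((t, x, y))
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        dispersion = (max(xs) - min(xs)) + (max(ys) - min(ys))
        if dispersion > max_dispersion_px:     # a saccade broke the fixation
            window.pop()
            break
    duration = window[0][0] - window[-1][0]    # newest minus oldest timestamp
    if duration < min_duration_s:
        return None                            # too short to count as "resting"
    cx = sum(p[1] for p in window) / len(window)
    cy = sum(p[2] for p in window) / len(window)
    return cx, cy
```

Anything that only reacts to the returned fixation point (and treats None as "ignore me") would already shrug off glances at the keyboard or out the window.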

I am sure one day there will be a new input method that DOES beat the mouse and keyboard for general computing. For playing games, a very good gamepad already does that for me in certain genres (now every PC game should have mandatory Xbox controller support!).

The brain mouse from some years back clearly wasn't it. The Leap Motion obviously wasn't either. The touchscreen, for all it is worth, is an incredibly niche, crap interface for low-accuracy use cases like typing a number into a smartphone.

The digitizer beats the mouse in everything related to art creation... still kinda niche, even though clearly here to stay.

Eye tracking on its own is to me still an input-device fad that will not stick. Combined with a VR goggle that has movable screens, or a 4k+ screen and movable lenses, eye tracking could lead to a much improved 2nd gen VR experience. Other than that, I fail to see a use case where eye tracking would really add immense benefit over other types of input.

On the subject of input device fads, I just bought an Xbox One Kinect and adapter on the cheap for my newly upgraded Windows 10 PC (I needed a reason to go to Windows 10, and wanted to test markerless moCap)... apart from being excited about trying the low-cost moCap solutions, I would love to test controlling windows with the Kinect.

It's kinda gimmicky, and not REALLY viable, but being able to control Spotify or YouTube from the other side of the room without a remote while working on my other computer does sound cool. In a weird, impractical, nerdy way :)



Eh, I think having windows change focus based upon where your eyes rest would be enough to eliminate much of those issues. I.e., don't change focus unless the user continues to look at the new window for 500 ms (or a second or two -- numbers all pulled out of the ether). Obviously if you're looking out the window, you're not looking at a new window, ergo focus wouldn't change. You could even have a graphical effect to show that if you keep looking at a new window, it's going to swap focus. I also think noise is a non-issue when we're talking about looking at something the size of a window. Buttons may require some fine-tuning, not sure; there is existing tech out there for eye tracking, but I haven't messed with it.
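
The "warn before switching" part could be a tiny state machine layered on the dwell timer. A rough Python sketch; show_highlight / hide_highlight / focus_window are placeholder callbacks, not any real windowing API:

```python
# Sketch of the "warn before switching" variant: show a highlight while the gaze
# is dwelling on a new window, and only commit focus after the full delay.

IDLE, PENDING = "idle", "pending"


class PreviewFocusSwitcher:
    def __init__(self, commit_after_s=0.5):
        self.commit_after_s = commit_after_s
        self.state = IDLE
        self.target = None
        self.since = 0.0

    def update(self, gazed_window, now, focused_window,
               show_highlight, hide_highlight, focus_window):
        # Looking at nothing, or at the already-focused window: cancel any pending switch.
        if gazed_window is None or gazed_window == focused_window:
            if self.state == PENDING:
                hide_highlight()
            self.state, self.target = IDLE, None
            return
        if self.state == IDLE or gazed_window != self.target:
            if self.state == PENDING:
                hide_highlight()                 # drop the old hint first
            # Gaze settled on a new window: start the countdown and hint at it.
            self.state, self.target, self.since = PENDING, gazed_window, now
            show_highlight(gazed_window)         # "keep looking and I'll switch"
        elif now - self.since >= self.commit_after_s:
            hide_highlight()
            focus_window(gazed_window)
            self.state, self.target = IDLE, None
```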



Well yeah, given you wrap your eye tracking in a software layer that can tell "intended eye movement" from noise, it might work.

Whether it is going to add to productivity is questionable though... depends on the user, really.

Some are extreme multitaskers with 1000s of open windows... yet what good is that eye tracking solution if windows are buried under a full-screen window? Unless you come up with some clever "eye tracking shortcuts", eye tracking will not solve that problem.

Some people like their desktop clean and tend to have only the windows open that they need at the time. There is not much eye tracking could do for these people.

Then there are the people working with multiple desktops, who like to alt+tab between windows, and stuff like that.

I mean, there is a whole bunch of "innovative new input methods" that came up in the last few years and died down again. I am sure at some point somebody will improve upon them and make them more usable. I doubt, though, that a more accurate brain mouse or a better Leap Motion controller will REALLY transform the way we work and play on the PC. What could they possibly add? Besides fine control when doing art creation, there is not much you could improve upon the mouse and keyboard for general work. And for art creation, the digitizer is hard to beat.

Let's be honest: if you want true VR, there is no way around motion controllers, eye tracking, and at some point maybe output devices for the other senses. But you don't do that in VR to improve accuracy. It's questionable whether moving your whole arm is ever as accurate as just moving your fingers with a digitizer.

You do it because of immersion. Because VR works best when it simulates the physical world.

Outside of VR, all these input and output devices will remain niche. Do you really want to smell your e-mails? Control Spotify with a dance move? Think about what you want to put in your essay and let a brain-computer interface put it on virtual paper?

Most probably not. There will always be devices that do serve a very good job for niche applications though:

3D Mouse / Space Navigator: hard to beat for navigating your 3D viewport. Really transforms the way you do 3D modelling.

Kinect for MoCap: if the software is up to scratch, that might bring MoCap to the masses.

Eye Tracking for usability testing: This is where eye tracking is hard to beat today. Don't ask the user where he was looking, measure it.

I am sure over time we will find out more niches where alternative input methods will be useful. I doubt any of these will replace M+KB in general computing though, and I doubt it will really become mainstream in non-VR gaming.

There is just too little productivity gain for the steep learning curve and the device cost in the first case. And there is too little use and gain in immersion outside of niche games in the second.

I am sure SOME gamers would love to tie in-game head movement to eye tracking (or head tracking)... but for most, that will be extremely disorienting and not add much, especially as long as the average gamer has small-as-hell 24" and 27" screens at home... not much screen real estate to move your eyes across, thus either the effect is minimal or the rotation speed is way too extreme.

One interesting use of headtracking + eye tracking, if combined with a 3D screen, is not for cursor movement, but for camera movement.

It could make your screen feel like an actual window into a game world and give a pretty awesome 3D experience that you can't get today with just 3D screens (with or without glasses)

Whether that is useful or not would depend on the style you're after, but I think it could be interesting, especially if you step outside of thinking it should be used for something first person...

Nothing that has a market yet, but maybe in a few years :)
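
For what it's worth, that "window into the world" effect is typically done with an off-axis (asymmetric) projection driven by the tracked head position. A minimal sketch of the math, assuming the physical screen lies in the z = 0 plane of tracking space, centered at the origin, with the viewer at positive z; the function name and the numbers in the usage example are made up for illustration:

```python
# Minimal sketch of head-coupled ("off-axis") perspective: the tracked head
# position skews the view frustum so the monitor behaves like a window into the
# scene. Units are arbitrary but must match between screen size and head position.

def off_axis_frustum(head_x, head_y, head_z, screen_w, screen_h, near, far):
    """Returns (left, right, bottom, top, near, far) frustum extents, plus the
    translation to apply to the view so the eye ends up at the origin."""
    # Scale the screen edges (relative to the eye) down to the near plane.
    scale = near / head_z
    left = (-screen_w / 2.0 - head_x) * scale
    right = (screen_w / 2.0 - head_x) * scale
    bottom = (-screen_h / 2.0 - head_y) * scale
    top = (screen_h / 2.0 - head_y) * scale
    view_translation = (-head_x, -head_y, -head_z)
    return (left, right, bottom, top, near, far), view_translation


# Example: a 0.5 m wide, 0.3 m tall monitor, head 0.6 m away and slightly to the left.
frustum, view_offset = off_axis_frustum(-0.05, 0.0, 0.6, 0.5, 0.3, 0.1, 100.0)
# Feed `frustum` into something like glFrustum and translate the camera by view_offset.
```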


For a VR goggle, yes. That is basically one of the components still missing from first-gen VR devices like the Oculus (eye tracking, not head tracking).

For a very large screen filling more than your field of view, yes (although there are still issues with resolution if 4k is blown up to such a screen, and with geometry unless the screen is curved).

For a 24" screen at a normal viewing distance, this doesn't add much. Either you have very quick in-game camera rotation linked to your head/eye movement (which can be very distracting), or you will not see much rotation at all (at which point the question is "why bother", as long as head/eye trackers are still expensive peripherals).

I still think these kinds of devices are a good enhancement for VR, but extremely niche for a more traditional gaming experience.


for a 24" screen at a normal viewing distance, this doesn't add much

I think it could add quite a bit of wow factor even in a game like a platformer.

It would make the scene have actual depth in a much more realistic way than you get with just a 3D screen.

Hard to explain properly; this old demo I was involved with shows the effect somewhat:

Not as good as it could be though :)

I think it would also work pretty nicely in a 3rd person game.

Not adding much gameplay, just more immersion, and something you could add if eye tracking and head tracking were common...

I'm just new here, [deleted by moderator]. I am very good at brainstorming and concept design.


Please use the Classifieds. Good luck.

-- Tom Sloper -- sloperama.com

This topic is closed to new replies.
