People were really interested in seeing my VR game demo. You can't go wrong with VR combined with hand-gesture input, wizards, and zombies. What I did realize (again) is that people really struggled to learn the controls and how to move around in my game world, and generally just completely missed what I thought was a prominently placed spell shrine on the right side of the pathway. How do you fix that? Do you place a great big floating red arrow pointing people to exactly where they need to go? I suppose the key concept would be "even MORE visual prominence!" But how much can you dumb things down before people start complaining that it's too easy? My temptation is to let some people struggle a bit (in the release game).
The highlight of the event was not actually the event itself, but the after party. I got to meet some new people in the tech business and to get reacquainted with people I'd seen before. I bumped into Ryan for about the fourth time (he designed the zombies we're using, which is a fun story for another time), so we chatted a lot more. He has an HTC Vive at his house which he's using for a secret project, but invited me to come over and check it out and offered me some time to try to integrate my wizard game with it. So, this was the second time I got to work with the Vive and it was interesting... Let's go into some detail on that.
The headset feels a lot lighter than the Oculus Rift. The Rift DK2 hurts the bridge of my nose and feels a bit uncomfortable. The Vive is lighter and a lot more comfortable, though slightly larger. There are also some fundamental differences in tracking techniques. The Rift uses a camera mounted on your monitor (or a stand) that watches a constellation of IR lights on the headset. The Vive uses one or more laser base stations which bathe the room in lasers; sensors on the headset and hand controllers pick up those sweeps and work out their position and orientation. Here's what threw me off: I'm used to quickly holding my Rift up to my face to check something. If you do this with the Vive, your hands cover the sensors and you completely lose tracking. Whoops!
The biggest immediate difference is that the Vive is a ROOM-scale experience, while the Rift is better suited to a seated experience. This introduces some new design considerations.
First, you have to realize that you're now working with three distinct coordinate spaces: world space (the game world), local object space, and now also room space within reality. This gets funky fast, as I learned. The wizard in our game has his camera set at a fixed position at his eye level. This works great for the Rift, but it's wrong for the Vive. The Vive's origin [0,0,0] is located on the floor in the center of your living room, so I had to make a hackish manual camera translation of (0,0,-75) to adjust for this. Since the Vive is also a standing experience, I realized I also had to compensate for the height of the player vs. the height of the character. I'm 6'1", but my character is about 5'5". So if I'm going to put my eye level at my character's eye level, I need to do another adjustment. What if a 5'0" player plays my character? To properly calibrate the headset for the character, I'd have to ask the player to stand up straight in their living room, measure their height, and then make a relative adjustment to the camera position so that the camera matches the character's eye position. What an interesting problem, right?
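The calibration idea above can be sketched in a few lines. This is a minimal illustration, not our actual game code: the function names, the 93% eye-height ratio, and the example heights are all assumptions I'm using to show the math.

```python
# Sketch of the eye-level calibration described above (hypothetical names/numbers).
# Room-space origin is on the floor, so the camera needs an extra vertical offset:
# the difference between the character's eye height and the player's eye height.

def eye_height(height_cm: float) -> float:
    """Approximate eye height as ~93% of standing height (rough assumption)."""
    return height_cm * 0.93

def camera_height_offset(player_height_cm: float, character_height_cm: float) -> float:
    """Offset (cm) to add to the tracked headset height so the in-game
    camera sits at the character's eye level instead of the player's."""
    return eye_height(character_height_cm) - eye_height(player_height_cm)

# A 6'1" (~185 cm) player driving a 5'5" (~165 cm) character:
offset = camera_height_offset(185.0, 165.0)
print(round(offset, 1))  # negative: the camera moves down by roughly 18.6 cm
```

A tall player's camera gets pushed down, a short player's gets pushed up, and a player the same height as the character gets no adjustment at all.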
Then there's the new problem of locomotion. A player can walk around in their living room, and the camera has to move with them. So the avatar representing the player must move as well! This means the character position needs to move around in local object space with a 1:1 ratio to room space, and then that translation needs to be carried into world space. This becomes interesting in that you, the player, can literally sidestep an incoming spell or dodge a zombie swing. If only we could track player foot positions...
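The room-to-local-to-world chain can be sketched as below. Everything here is illustrative and the names are invented; a real engine would also fold the character's rotation and scale into the final transform, which I'm skipping to keep the idea visible.

```python
# Minimal sketch (hypothetical names) of the room -> local -> world chain:
# the headset moves 1:1 in room space, that motion drives the avatar in the
# character's local space, and the character's transform places it in the world.

from dataclasses import dataclass

@dataclass
class Vec3:
    x: float
    y: float
    z: float
    def __add__(self, o: "Vec3") -> "Vec3":
        return Vec3(self.x + o.x, self.y + o.y, self.z + o.z)

def avatar_world_position(character_origin_world: Vec3,
                          headset_room_pos: Vec3,
                          room_anchor_local: Vec3) -> Vec3:
    # 1:1 mapping: room-space headset movement becomes local-space movement...
    local = room_anchor_local + headset_room_pos
    # ...then the character's world-space origin places it in the game world.
    # (Rotation/scale omitted for clarity.)
    return character_origin_world + local

pos = avatar_world_position(Vec3(100.0, 0.0, 50.0),   # character origin in world
                            Vec3(0.5, 1.75, -0.25),   # headset pose in room space
                            Vec3(0.0, 0.0, 0.0))      # room origin in local space
print(pos)  # Vec3(x=100.5, y=1.75, z=49.75)
```

The sidestep-a-spell behavior falls out of the 1:1 mapping: if the headset moves half a meter to the side in room space, the avatar moves exactly half a meter in the world.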
There is also a new element of player danger/safety. The Vive has a 'chaperone' system which helps you see your real-world boundaries from within VR. It's great for avoiding walls and other non-clippable real-life objects -- as long as you're facing the direction you're moving. But if a zombie is charging at you in VR, you'll probably be backing away, and since you can't see behind you, the chaperone system can't help as much. Conceivably, someone could walk backwards pretty hard into a wall and get hurt.
The wands which come with the HTC Vive are exactly what I need for my game. The Leap Motion device I've been using just isn't cutting it in terms of performance and reliability within VR. If a zombie is charging at you and you want to throw a fireball at him, you absolutely cannot be fighting against the hardware to make that happen. The only thing you should be fighting is the zombie that wants to eat you. The Vive controllers are butter-smooth and reliably do exactly what you want them to do. They're so reliable that you forget you're using a controller, and they become more of an extension of your arm. I've GOT to get my wizard arms to respond perfectly with the Vive controllers. It's an absolute requirement for a Vive VR experience. When you can move your arms in VR and the wizard's arms perfectly match what yours are doing, it is fucking magic.
Next week (Oct 28th) I'm going to be featuring my VR game at the "Sea VR 2015" event in Bellevue, WA. I'm really excited and a bit nervous about it. Everyone who's anybody in Seattle and Virtual Reality will be there, so I need to shine. I'm really, really hoping for good press coverage and some sort of grant or sponsorship deal from a VR hardware manufacturer. To get that, I need to aim to be one of the 'best in show' booths (in terms of content, not booth aesthetics). When you try my game, I want you to say, "This is what all VR needs to be".
I'm not going to be able to get the Vive working in time for SeaVR, so I'm not going to try to integrate it until after the event. I also need my own headset so that I can develop more rapidly and not rely on someone else's schedule. Ryan said that he could put me in touch with some people at Valve and maybe get me one. I'm hopeful.
I did get to try some of the other VR demos the Vive has made famous. The Tilt Brush app was alright; I'm not an artist, so its appeal was a bit lost on me. What really fascinated me was its UI: one hand holds a brush, the other holds a virtual easel, and you can rotate that hand to get different brushes and color palettes. Probably the most impressive demo was the archery demo: you pick up a bow in your left hand, reach behind your back with your right hand to pull out an arrow, nock it on the bow, pull your right hand back, and release to let the arrow loose. As you pull the bowstring back, the wands vibrate in proportion to your draw, and you hear a 'creak' sound. Add the physical force of pulling back a real bowstring and I'd be convinced it was real. This is how haptic response needs to work. After launching about 50 arrows, my bow arm got tired and I had to stop. That may be an interesting physical limitation to design game experiences around.
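The draw-proportional haptics could be modeled with a simple clamped mapping. This is my guess at the idea, not how the demo actually does it; the function name and the 0.6 m maximum draw length are made-up values for illustration.

```python
# Hedged sketch (invented names/values): map how far the bowstring is drawn
# to a normalized haptic amplitude, so vibration strength tracks the draw.

def haptic_strength(draw_distance_m: float, max_draw_m: float = 0.6) -> float:
    """Map draw distance (meters) to a 0..1 haptic amplitude, clamped."""
    if max_draw_m <= 0:
        raise ValueError("max_draw_m must be positive")
    return max(0.0, min(1.0, draw_distance_m / max_draw_m))

print(haptic_strength(0.3))  # half draw -> 0.5
print(haptic_strength(0.9))  # over-draw clamps to 1.0
```

In practice you'd feed this amplitude into the controller's haptic pulse every frame while the string is held, so the buzz ramps up smoothly as the player draws.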
One of the guys in my office said he'd be happy to shoot six-minute videos of our VR production and of players trying out our VR game. I really want to capture players' reactions when they play our game, so that would be great exposure and marketing for us. If you get a VR headset, my game should be one of the 'must-have' games in your library.
Anyways, I'm excited for the future. My sole worry is funding.