The format is pretty simple: I had to give a five-minute PowerPoint presentation on what we're doing. Five minutes is a really short time to talk about your work, and it's a restrictive limit to cram everything into. Obviously, a lot won't make the final cut, and when the panel gives feedback, a lot of it ends up being about the stuff you had to leave out. But that's okay. I learned a lot.
The most valuable lesson I learned is that I'm not great at projecting confidence 100% of the time. When people ask me questions about things I haven't spent a lot of time thinking about, or just something I don't know, I slip into the language of someone who is uncertain:
"I hope...", "I think...", "Maybe...", "I'm guessing..."
These are words and phrases I need to purge from my vocabulary. If people are going to believe in me enough to invest in me, they need to feel like I'm certain about what I'm doing and that I've got it all figured out. Honestly, I believe I have a pretty good grasp of the industry, where it's going, and how to be a player, but when it comes to the business and finance side, maybe not so much. A lot of it is brand new to me. My natural skill set is programming, but I do want to get a lot more well versed in startups and entrepreneurship. I think I'd be really good at it.
Anyways, on to the Oculus Rift DK2 and Leap Motion.
It took half a day to get it all set up. My thinking is that if we build a VR game, the actual VR device we use for development doesn't matter much. Unreal Engine is the platform we're building on, and the engine developers are the ones who need to worry about the technical details of hardware integration for the various VR platforms. So if our game works on the Oculus, it should also work great on the HTC Vive whenever that eventually comes out.
I was playing a VR version of a MechWarrior clone. It's very awesome. You drive a mech around on what looks like the surface of Mars. The graphics aren't stellar, but the developers have done a very good job of integrating VR to create a sense of immersion. You're a pilot sitting in the cockpit of a mech, and you can look around at the various screen displays inside the cockpit. These are your "user interfaces," which tell you about the status of your craft. You move around with the keyboard and fire with the mouse. What's really surreal is that when you put your hands in those positions and then look at the arms of the pilot, you almost feel like the pilot's body is your own. You can even see the pilot's legs and almost think they're yours. This is certainly something we're going to have to do with the magicians in our game (no invisible or partial bodies!).

The mechs also have jump jets. You can use them to lift off and then steer a bit with the WASD keys. To my surprise, I started getting a sense of motion sickness and nausea from this. I think it's because my eyes were telling me my body was supposed to be moving, but my body wasn't feeling the forces of that movement, so there was a deep lizard-brain disagreement about what was going on. I'll have to be very careful about designing game mechanics to avoid causing these feelings of motion sickness...
The Leap Motion controller has some problems of its own, plus problems integrating with a head-mounted display. The Leap Motion team has spent most of its time and effort building a tight integration between their device and Unity; Unreal Engine hadn't gotten much attention until recently. Leap Motion released a plugin a few weeks ago alongside the UE 4.7 release, but it has some serious latency and tracking issues. It kind of works, though. I'm hoping the latency and tracking issues will be resolved with the next update.
The motion tracking device also has a lot of trouble working alongside the Oculus Rift headset. I think there's infrared interference with the Rift's camera sensor, but I'll have to test it a bit more. As it stands right now, the Leap Motion device doesn't work well at all in UE4. That kind of puts a damper on my idea of launching fireballs with your hands. I'm really hoping Valve's Lighthouse hardware will give much better hand and body tracking than the Leap Motion device.
The other issue is that my arms get tired! I can hold them up for about five minutes before I need a rest. I'm in decent athletic shape, but maybe my end users won't be. Maybe this is just a matter of building up physical endurance and stamina? If I ignore the problem, it may go away on its own. But it does mean I'd have to make sure game battles don't last longer than fifteen minutes each. On the positive side, perhaps it will change how physically active people get while playing video games? Gaming could become a lot more physically involved and require a minimum level of athleticism to do well. Just imagine people saying, "I lost fifteen pounds last month by playing Master of Mages, and I didn't even notice!"
Anyways, the focus for the remaining week is to build a sweet game demo that takes advantage of VR. It's going to be super simple: you launch explosive fireballs at a pile of crates. I'm sure it'll be harder than it sounds, but it's basically the "hello world" of VR games across the various display and input devices.