I've been doing virtual reality game development for a year now. In terms of relative industry time, I'm old and experienced. I wouldn't call myself an expert though -- there are many far more knowledgeable people than I and I defer to their expertise.
This article covers many of the unique design challenges a VR developer will face within a roomscale play environment, how many VR developers are currently handling these design problems, and how I've handled them. There are a lot of general VR design choices which can impose some big unforeseen technical challenges and restrictions. I share my lessons learned and experiences so that anyone else considering working in VR can benefit from them. In the interests of being broadly applicable and useful, this article focuses more heavily on the design challenges and concepts of an implementation than the math and code implementation details.
In the interest of getting deeper into the technical details quickly, I'm going to assume that everyone here knows what virtual reality is and generally how it works. If you don't, there are plenty of articles available online to give you a basic understanding.
Commonly used terms
Room Scale: This is the physical play area a person walks around in. Often this will be somebody's living room or bedroom.
World Space: This is an in-game coordinate system for defining object positions relative to the center of the game world.
Avatar: This is the character representation of the player within the virtual reality environment.
Player: The human being in physical space controlling an avatar within VR.
Motion Controller: A pair of handheld hardware input devices used to track hand positions and orientations over time.
HMD: Stands for "Head-Mounted Display", which is the VR goggles people wear.
The priorities of a VR developer and designer
1. Do NOT make the player sick.
Motion sickness is a real thing in VR. I've experienced it many times, and I can make other people experience it as well. It is nauseating and BAD. Do everything you can to avoid it. I have nothing but contempt for developers who intentionally try to cause people to get motion sick, even if it's a part of their 'game design'. I have a working theory on what exactly causes the motion sickness, but I'll start by talking about a few popular VR experiences and what caused the motion sickness.
The first VR game I tried was a mech warrior style game called "VOX Machinae", where you are a pilot driving around a mech. With the original mech warrior, you had jump jets which allowed you to accelerate into the sky for a short period of time. This game copied that idea. I have pretty good spatial awareness and can handle small bursts of acceleration, but in this game, you could be airborne for many seconds and you also could change your velocity in midair. Then you land and resume walking. All of these things caused me to feel motion sick for various reasons:
The initial jump jet thrust into the air was no problem for me. Changing my flight direction in midair was a huge problem though. When you're in flight, you don't have anything to use as a visual reference for your delta velocity. You see the ground below you moving slightly and your brain tries really hard to precisely nail down where it's spatially located, but there isn't any nearby visual reference to use for relative changes. If the designer insists on using jump jets, what they should do is place small detail objects in the air (dust motes, rain drops, particles, etc). Avoid letting players change velocity in midair.
The second part which caused me discomfort was the actual landing, when the mech hits the ground. I don't remember exactly, but I think the mech had a bit of bounce-back when it landed, where the legs would insulate most of the force of the landing. Visually, you'd get a bit of a bob. Remember that braking is also a form of acceleration, and acceleration is a big culprit of motion sickness.
One other notable VR experience was a game called "Sightline: The Chair". It was supposed to be a comfortable experience, but the ending was stomach churning. The gist of the game is that you sit in a chair and look around. The world changes when you're not looking at it, so you gradually move from a forest to a city, to who knows what. At the very end of the experience, you are sitting in a chair at the top of skyscraper scaffolding, on a flimsy board. I don't necessarily have a fear of heights, but looking down was scary because I knew what was about to happen next: I would fall. The scaffolding collapses, and not only does the chair you're in fall down, it SPINS BACKWARDS. Do NOT do this. NO SPINNING THE PLAYER! I didn't even let the falling sequence finish, I just closed my eyes and immediately ended the "game".
Someone had published a rollercoaster VR experience which they built rather quickly in Unreal Engine 4. I knew it would make me motion sick, but I was studying the particular causes of motion sickness and had to try it out for science (VR devs suffer for their players!). The rollercoaster experience took place in a giant bedroom and followed a track with curves, dips, rises, and other motions. Essentially, they are taking your camera and moving it on rails at different speeds, so I was expecting accelerations and thus sickness. The first time I went through the rollercoaster, I felt somewhat bad afterwards. I then did it again, and felt much worse. The lesson here is that motion sickness is a compounding, accruing effect which can last long after the VR experience is over. If you have even a little bit of content which causes people to get motion sick, it will build up over time and people will gradually start feeling worse and worse.
I think it's also worth noting that a VR developer will need a very high end computer. I went to a hackathon last summer and had a pretty nauseating experience. First, I drank one beer and was slightly buzzed. First mistake. Then, I went to see an enthusiast's demo on an Oculus DK1, which is an old relic by today's standards. Second mistake. I decided to stand up. Third mistake. He ran the demo on a crappy laptop. Fourth mistake. Then he ran some crappy demo with lots of funky movements and bad frame rates. Fifth mistake. I could only get about halfway through before I started feeling like I was about to fall over on my face, so I had to stop. Don't do any of this! It's not worth it.
Anyways, let's talk about the game I'm developing and what I discovered to be trouble spots. The game begins at the top of a wizard tower. You can go between levels in the tower by walking down flights of stairs. I have two types of stairways:
Stairs which descend straight down
Stairs which curve and descend
The curved stairs generally cause SPINNING and spinning generally causes motion sickness. The stairs which take players between levels of the tower are okay, but there's a chance that players can jump down instead of taking one stair at a time (which potentially causes motion sickness). So, to protect the player from themselves, I put banister railings and furniture in place to block the player from jumping down directly. They could still try if they were determined, but I did my part to promote wizard safety and prevent broken knees. I find it helps to imagine that an occupational health and safety inspector is going to come look at the environment I built and tell me where I need to put safeguards to protect the player from unintentional hazards. Of course, that's not actually going to happen, but it gets you thinking about your world as if people actually had to live in it and how they'd design it for their own health and safety.
The other surprising cause of motion sickness is the terrain and movement speed of the player. Your terrain should generally be SMOOTH. From a level designer's standpoint, it's tempting to add a bit of roughness to the terrain to make it look less artificial, but every bump in the terrain causes the player's VR headset to go up and down, and this is a small form of acceleration, which causes accruing motion sickness over time. Smooth out your terrain with a smoothing brush. If you want hills or valleys, make the slopes gradual and smooth. The faster a player travels over any object which causes the camera to go up and down (stairs, rocks, logs, etc), the more likely they are to experience motion sickness. To hide the artificial look of smooth terrain, use foliage and other props to cover it up.
Keep in mind that while you are building a VR game and designing things within it, you will be building up a tolerance to motion sickness. You become a bad test subject for it. Try to find willing guinea pigs who have very little experience with VR. When you find these willing guinea pigs, be very sure to do a pre-screening of their current state! And, very importantly, keep a barf bag nearby! We had two incidents during testing:
A player had been feeling sick and nauseous before playing our VR game. The VR exacerbated the sickness and he had to stop.
A player had drunk bad juice or something and it wasn't agreeing with her stomach. She didn't say anything about it, but had to stop the VR game and ran out and puked into a garbage bag.
Was it our game which caused sickness, or were these people feeling sick prior to playing? It was impossible to isolate the true cause because we weren't conducting a highly controlled test. This is why you pre-screen people so that you can control your studies a bit better.
At the end of the day, VR developers owe it to their loyal players to make their VR experience the best, most comfortable experience possible. Uncomfortable experiences, no matter how great the game is, will cause people to stop playing the game. Nobody will see your final boss monster or know how your epic story ends if they stop playing after 5 minutes due to motion sickness.
2. Design to enhance player immersion, avoid breaking it.
Virtual Reality is not a new form of cinema and it's not gaming. VR is also not a set of fancy, expensive goggles people wear, though the goggles are a necessary component of creating virtual reality. Virtual Reality, and the goal of a VR developer, is to transfer the wearer and their sense of being (presence) into a new artificial world and do our best to convince them that it is real. Our job is to convince players that what they're seeing, hearing, feeling, touching, and sensing is real. Of course, we all know it isn't 'real', but it can be very convincing, almost to a frightening level of immersion. So, a player lends us many of their senses and their suspension of disbelief, and in exchange, we give them a really fun, immersive experience. The second most important goal of a VR developer is to create and maintain that sense of immersion, that feeling of actually being somewhere totally different, of being someone else entirely.
Generally speaking, the virtual reality player's game experience should match their expectations of how they think the environment should look and react. If you create a scene with a tree in it, players are going to walk up to that tree, look all around it, and expect it to look like a tree from every angle. That tree should be appropriately scaled to the player avatar, and should not glitch in any way. Remember that you don't control the camera position and orientation, the player does. So, consider the possibility that the player will look at your objects from any angle and distance. If the object mesh is poorly done, immersion is broken. If the object texture is wrong, immersion is broken. If the object UV coordinates and skinning are wrong and you've got seams, immersion is broken. If players can walk through what they expect to be a solid object, immersion is broken.
You can also have immersion breaking sounds, and narrative dialogue which "breaks the fourth wall". There isn't a definitive list of do's and don'ts for immersion in VR. My approach is to use more of an imaginative heuristic, where I close my eyes, become the character standing in this world, and imagine how it needs to look, sound, and behave. Players will try to do things they expect won't work, so a big part of "VR Magic" is reacting to these unexpected player tests of VR. For example, if you have a pet animal, players will try to play with it and pet it. If the player pets it, the animal should react positively. Valve nailed it with their new mechanical dog. You can pick up a stick in the game world and throw it. The mechanical dog will bark, run after it, and bring it back. Then you can roll the mechanical dog over and rub its belly, which causes its tail to wag and legs to wiggle in a very cute way.
Whatever you do build in VR, test it! Test it as often as you can, and get as many other people as you can to try it out. Watch carefully how they play, what they do, where they get stuck, how they test the environment response to their expectations, etc. No VR experience should ever be built in isolation or a vacuum.
3. Hit Your HMD Frame Rate
For a lot of my development, I mostly ignored this rule, particularly for the Oculus Rift. Its hardware can smooth between frames, so if you drop below the target frame rate, the game is still playable on the Rift. I felt comfortable at 45 frames per second and immersion didn't break. However, switching to the HTC Vive was a different story. Its screen refresh rate is fixed at 90 hertz, so if you are running anywhere below 90fps, you WILL see judder when you move your head around, and this will break immersion and cause motion sickness to build up. When you're building your VR game, profile often and fix latency-causing issues in your build. Always be hitting your frame rate.
Some players are much more sensitive to framerate latency, so if you aren't hitting your frame rate and they're reporting motion sickness, you can't rule out framerate as being a cause of it, which causes problem isolation to be much more difficult.
4. Full Body Avatars
I may be in a very small camp of VR developers advocating for full body avatars, but I think they are super valuable for producing an immersive and compelling VR experience. This is particular to first person perspective games, so platformer style games may safely ignore it. The goal in creating a full body avatar is to give the player a body to call their own. This can become a powerful experience and creates lots of interesting opportunities. It is very magical to be able to look down and see your arm, wiggle your fingers in real life, and see the avatar wiggling its fingers exactly the same way. To build a sense of self-identity, we place a mirror at the very beginning of our VR game. You can see your avatar looking back at you, and as you move your head from left to right, up and down, you see the corresponding avatar head movements reflected in the mirror. What's particularly interesting about immersive full body avatars is that the things which happen to the avatar character feel like they're happening to your own self, and you can experience the world through the perspective of a body which is very different from your own. If you're a man, you can be a woman. If you're a woman, you can be a man. You can change skin color to examine different racial and cultural norms. You can become a zombie and experience what it's like to see the world through their eyes. You're not just in someone else's shoes, you're in their body.
Beyond creating character empathy, this also creates a very exciting opportunity to create first hand experiences which are impossible in any other form of media: Imagine a non player character coming up to you and giving you a nice warm hug. The only reason you would believe it wasn't real is because you can't feel the slight constriction around your torso which real hugs give -- but emotionally, and from the perspective of personal intimate space, they are the very same. When you possess a full body avatar, you also build a sense of 'personal space', where if something invades that bubble, you feel it yourself -- if it's something dangerous or intimidating, you reel back in fear. If it's something familiar and intimate, you feel warm and happy.
Aside from just being a form of self-identification, a full body avatar also works as a communication device between the player and the game. If the player is walking, they should not only see the world moving around them relative to their movement, but also be able to look down and see their feet moving in the direction of travel. If the player is hurt, they can look at their arms or body to see their damage (blood? torn clothing? wounds?). Our best practice is to keep the body mesh separate from the head mesh. The head mesh is not rendered for the owning player, but other players and cameras can see it. The reason you want to hide the head mesh from the owning player is that you don't want the player to see the innards of their own head (such as eyeballs, mouth, hair, etc). If you want, though, you can place a hat on the player's head which enters their peripheral vision.
5. Tone down the speed and intensity
VR games are immersive and players feel like they're physically in your world. This alone magnifies the intensity of any experience players are going to have, so if you're used to designing highly intense, fast paced game play, you're going to want to dial it down to at least half. I think that it's actually possible to give people PTSD from VR, so be careful. You also want to be mindful of common fears people have, such as spiders, and keep those to a minimum. Modern games also tend to have very fast walk and run speeds designed to keep players in the action. It feels good and polished on a traditional monitor, but if you do this in VR, you feel like you're running 40 miles per hour.
Horror and psychological thrillers are going to be popular go to genres for VR game studios trying to get a strong reaction out of players. However, I will forever condemn anyone who puts jump scares into their VR game. If you do this, I will hate you and I won't even try your game, no matter how 'good' people say it is. Jump scares are a cheap gimmick used to get a momentary rise out of someone and they're used by design hacks who've got nothing better up their designer sleeves. Jump scares are about as horrible as comic sans or the papyrus font. They're not fun, they're not scary, and they're not interesting. Don't use them.
When it comes to locomotion speeds in a room scale environment, players are going to be walking around at a natural walking pace which is comfortable for their environment and the dimensions of their play area. Generally, this is going to be quite slow compared to what we're used to in traditional video games! You may feel tempted to change the ratio of real world movement to game world movement, but through lots of first hand testing, I can assure you that this is actually a bad idea. We NEED our physical movement and our visual appearance of movement to have a 1:1 ratio. This sets the tone for what the movement speed of the avatar should be if you're planning to artificially move them as well (through input controls): the avatar should move at a speed which is as close as possible to the player's natural, comfortable walking speed. This value can be empirically derived through play testing. The technique I used to get my value is as follows:
Make sure your avatar movement has a 1:1 ratio with player movement. If the player moves forward 1 meter, the avatar should move forward 1 meter as well.
Start at one end of the play space and physically walk to the other end, and measure the approximate amount of time it took. This is your own walking speed.
Measure the distance your avatar covered over this period of time. This is the distance your avatar needs to cover if it's being moved through input controls. Divide this distance by the time it took you to cover it in your room environment, and you'll have an approximate speed for your artificial movement.
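The steps above boil down to a single division. Here's a minimal sketch, assuming you've already measured the walk distance and time during a calibration walk (the function name and the numbers are illustrative, not from any engine API):

```python
# Sketch of deriving an artificial movement speed from a calibration walk.
# Names and numbers are illustrative, not from any engine or SDK.

def derive_artificial_speed(walk_distance_m: float, walk_time_s: float) -> float:
    """Return the avatar's input-driven movement speed in meters per second.

    walk_distance_m: distance the avatar covered while the player
                     physically crossed the play space (1:1 tracking).
    walk_time_s:     how long the physical walk took.
    """
    return walk_distance_m / walk_time_s

# Example: crossing a 4 m play space in 3.2 seconds gives 1.25 m/s,
# close to a relaxed real-world walking pace.
speed = derive_artificial_speed(4.0, 3.2)
```

Averaging several walks across a few testers will give you a more robust value than a single measurement.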
The speed of your avatar's movements and player walking speeds will have a huge impact on your level design, game pacing, monster movement speeds, and corresponding animations. You'll generally want to make your levels tighter and less expansive, so players aren't spending 15 seconds walking uneventfully through a corridor or along a forest path. The walking speeds and awareness of surroundings also put a huge limit on how much action a player can handle simultaneously. With traditional games, we can constantly be walking and running around while shooting guns and slinging spells, and using the mouse to whip around in an instant, but with room scale VR, we naturally tend to either be exclusively moving or performing an action, but rarely both at the same time. From user testing, people can generally only focus effectively on about one or two monsters simultaneously in VR, and even this delivers a satisfyingly intense experience. Adding additional monsters would only overwhelm players and their capacity to handle them effectively, so keep these considerations in mind when designing monster encounters, particularly if monsters can approach the players from all directions.
The last point to consider carefully is avatar death and the impact it has on the player. In traditional forms of media, we're a bit more disconnected from our characters dying because it's not happening to us personally. When it happens to you yourself, it feels a bit more shocking and we tend to have a bit more of an instinctual self-preservation type response. So, the emotional reaction to our own death is a bit stronger in VR. The rule of thumb I'm using is to lower the intensity of the death experience proportionate to its frequency (via design). If death rarely happens, you can allow it to be somewhat jarring. If death happens frequently, you want to tone down the intensity. Whatever technique you use for this, keep a critical eye towards preserving immersion and presence.
Unique Roomscale VR Considerations
I've found that designing and building for seated VR is much simpler than standing or room scale VR. Room scale VR is much more fun and immersive, but that comes with an added layer of complexity for things you have to account for and handle. I'll briefly go through the challenges for room scale VR:
Measure the height of the player
Players come in a range of heights, from 130cm to 200cm. If a player is standing in their living room with a VR headset, their physical height should have no bearing on their height in game. So, before a player gets into the game, you should have a calibration step where the player is instructed to stand up straight and tall, and then you measure their height. You can then compare the height of the player against the height of the player avatar and figure out an eye offset and a height ratio. The player will see the VR world through the eyes of their avatar, and if you design the world for the avatar, you can be assured that everyone has the same play experience, since you've calibrated the player height to the avatar height. You also know that if a player's head goes down 10cm, the proportion of their skeletal movement is going to differ based on how tall they are, and you can use this information to animate the avatar proportionate to the player's skeletal position.
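A sketch of what that calibration step might compute. The function name and the eye-height factor are illustrative assumptions, not values from any SDK; you'd tune them against your own avatar proportions:

```python
# Hypothetical calibration: the player stands up straight while we sample
# the HMD height, then we derive a ratio for mapping head motion onto the
# avatar. The 0.94 eye-height factor (eyes sit slightly below the top of
# the head) is an illustrative guess, not a standard constant.

EYE_HEIGHT_FACTOR = 0.94

def calibrate(hmd_height_m: float, avatar_height_m: float):
    """Return (height_ratio, eye_offset) from a standing-tall HMD sample."""
    player_height = hmd_height_m / EYE_HEIGHT_FACTOR
    height_ratio = avatar_height_m / player_height          # scales head motion
    eye_offset = avatar_height_m * EYE_HEIGHT_FACTOR - hmd_height_m
    return height_ratio, eye_offset

# A 1.60 m player in a 1.80 m avatar gets a ratio of 1.125, so crouching
# 10 cm in the room crouches the avatar's eyes about 11 cm in game.
ratio, offset = calibrate(1.60 * EYE_HEIGHT_FACTOR, 1.80)
```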
Derive the player torso orientation
This is a surprisingly hard problem. If you have an avatar representing the player (which you should!), then you need to know where to place that avatar and what pose to put it in. The VR avatar should ideally match the player's body. To find the player's torso, you have three devices which each return a position and orientation value every frame: you know where the player's head is located, where their left and right hands are, and the orientations of each. When deriving the center position and rotation of the torso, keep in mind that a player may be looking over their left or right shoulder and may have their arms in any crazy position. Fortunately, players themselves have some physical constraints you can rely on: unless the player is an owl, they can't rotate their head beyond their shoulders, so the head yaw direction is always going to be within 90 degrees of the torso orientation. As long as the player keeps the left motion controller in the left hand and the right in the right hand, you can also make some inferences about the physical capabilities of each motion controller. If the pitch of a motion controller is beyond +/- 90 degrees, then the arm is bending at the elbow and the controller is probably upside down. You can get the "forward" vector of both motion controllers, whether they're upside down or not, and use that direction as a factor for determining the actual torso position and orientation. The other important consideration is that a player can be looking down, tilting their head back, or rolling their head left or right, all of which change the position of the head relative to the torso. I tested my own physical limits at the extremes, captured my head roll and pitch values and the position offsets, and then used an inverse lerp to determine the head offset from the torso position. This assumes, of course, that the player's head is always attached to their body.
You can also add in some logic to measure the distance of the motion controllers from the head mounted display device. If the controllers are outside of a persons physical arm length, you can assume that they aren't actually holding the motion controllers and can use some logic to help the player see where these motion controllers are in their play space.
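The distance check described above might look like this. The 1.0 m reach threshold is a guessed value you would tune against your height calibration, not a standard constant:

```python
import math

# Sanity check: if a controller is farther from the HMD than any plausible
# arm reach, assume the player has put it down and it no longer represents
# a hand. The 1.0 m max reach is an illustrative assumption to tune.

def controller_is_held(hmd_pos, controller_pos, max_reach_m=1.0):
    """Positions are (x, y, z) tuples in meters."""
    return math.dist(hmd_pos, controller_pos) <= max_reach_m
```

When this returns False, you could switch to rendering a marker at the controller's tracked position so the player can find it in their play space.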
When you know the forward direction of the motion controllers and the head rotation, you can get a rough approximation of the actual torso position. I just average together the head yaw and the yaws of the motion controllers to get the torso yaw, but this could probably be improved a lot. The torso position is going to be some fixed value below the head position, accounting for the head rotation offsets, the player's calibrated height, and their proportions.
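The yaw-averaging step can be sketched like this. One subtlety worth coding for: averaging the angles as unit vectors avoids the wrap-around problem where, say, 350 and 10 degrees would naively average to 180. The function name is illustrative:

```python
import math

# Torso yaw estimate: average the head yaw with both controller yaws.
# Angles are averaged as unit vectors (circular mean) so that yaws on
# either side of the 0/360 boundary average correctly.

def estimate_torso_yaw(head_yaw, left_yaw, right_yaw):
    """All yaws in radians; returns the circular mean in radians."""
    yaws = (head_yaw, left_yaw, right_yaw)
    x = sum(math.cos(a) for a in yaws)
    y = sum(math.sin(a) for a in yaws)
    return math.atan2(y, x)
```

A weighted version (trusting the head more than the hands, for example) would be a natural refinement to play-test.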
Why is the player torso orientation important? Well, if the player walks forward, they walk in the direction their torso is facing. You want to let players look over their shoulder and look around while walking straight forward.
Modeling and Animation in VR:
If you're using a full body avatar to represent the player, your animation requirements are going to be a lot lighter. A lot of the character's bones are going to be driven by the player's body position. You'll almost certainly have to create standing, walking, and running animations and blend between the three. There are two VERY important bits to keep in mind here:
1. Do NOT use head bobbing. If the camera is attached to the player avatar in any way, this causes unnecessary motion, which causes sickness.
2. The poses should have the shoulders squared and the torso facing forward at all times. Don't slant the body orientation. The poses should ideally also have the arms hanging at the sides. The arms will stay out of sight until the player brings the motion controllers upwards. The reason you don't want the torso to be slanted is because when the player reaches both arms straight out, the avatar arm lengths need to match the player arm lengths -- if the avatar torso is slanted, one shoulder will be further back and one arm will be further out than the other.
Since you know the position of the player's hands and head, you can use inverse kinematics (IK) to drive a lot of the avatar skeleton. Through trial and error, we found that the palm position is slightly offset from the motion controller position, but this varies by hardware device. We use IK to place the palm at the motion controller position, and that drives the elbow bone and shoulder rotations.
You'll also have a problem where players will use their hands to move through physically blocking objects in VR. There's nothing in physical reality stopping their hand from moving through a virtual reality object, but you can block the avatar's hand from passing through it by using a line trace between the avatar's palm position and the motion controller's palm position, then setting the avatar palm position to the impact point. So, rather than having a hand phase through a wall or table, you can have these virtual objects block it in a convincing way.
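A toy sketch of the palm-clamping idea. A single axis-aligned wall plane stands in for a real engine line trace here, and all the names and geometry are illustrative:

```python
# Clamp the avatar's palm to the first obstruction between its last valid
# position and the tracked controller position. A wall plane at x = wall_x
# stands in for a real engine line trace / physics query.

def clamp_palm(prev_palm, controller_pos, wall_x):
    """Positions are (x, y, z) tuples; the wall blocks travel in +x."""
    px, py, pz = prev_palm
    cx, cy, cz = controller_pos
    if px <= wall_x < cx:                      # segment crosses the wall
        t = (wall_x - px) / (cx - px)          # parametric hit point
        return (wall_x, py + t * (cy - py), pz + t * (cz - pz))
    return controller_pos                      # unobstructed: follow the hand

# Reaching from x=0.2 through a wall at x=0.5 stops the palm at the wall.
palm = clamp_palm((0.2, 1.0, 0.0), (0.8, 1.1, 0.0), 0.5)
```

In an engine, you'd replace the plane intersection with the engine's trace call and clamp to its reported impact point.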
Artificial Intelligence in VR:
One surprise we've found is how much "life" good artificial intelligence and animation brings to characters in game. This is an important part of creating a convincing environment which conforms to the expectations of the player. When writing and designing the AI for a character, I try to put myself into the position of the character and think about what information and knowledge it has and try to respond to it in the most intelligent way possible. There will be *some* need for cheating, but you can get a lot of immersion and presence mileage out of well done AI and response to player actions. If you're having trouble figuring out the AI behaviors for a particular character, you can take control over that character and move around the world and figure out how and when to use its various actions. This can also lead to some interesting multiplayer game modes if you have the extra time.
Sound in VR:
You actually don't want to get too fancy with sounds. You want to record and play your sound effects in MONO, but when you place them within your VR environment, you want those sounds to have the correct position and attenuation relative to the player's head. Most engines should already handle this for you. One other consideration you'll want to look at is the environment the sound is playing in. Do you have echoes or reverberations from sound bouncing off the walls in VR? You'll also want to be careful with your sound attenuation settings. Some things, like a burning torch, can play a burning, crackling sound, but the attenuation volume could follow a logarithmic scale instead of a fixed linear scale. A lot of these things will require experimentation and testing.
One other important consideration is thinking about where exactly a sound is coming from in 3D space. If you have a large character who speaks, does the voice emanate from their mouth position? their throat position? their body position? What happens if this sound origin moves very near the players head but the character does not? I haven't tried this yet, but one possibility is to play the sound from multiple positions at the same time.
It's also worth underscoring how important sound is within VR. Remember, VR is not just an HMD. It's an artificial world which immerses the player's senses. Everything that can have a sound effect should have a believable sound effect. I would suggest that while people think VR is all about visual effects, sound makes up 50% of the total experience.
When it comes to narrative and voice overs, the production process is relatively unchanged from other forms of media. Whatever you do, just test everything out to make sure it makes sense and fits the context of the experience you're creating.
Locomotion in VR:
This is probably one of the most difficult problems roomscale VR developers face. The dilemma is that a player is going to be playing in a room with a fixed size, such as a living room. Let's pretend that it's 5 meters by 5 meters. When the player moves through this play space, their avatar should move through it with a corresponding 1:1 ratio. What happens if the game world is larger than the size of the player's living room? Maybe your game world is 1km by 1km, but the player's living room is 5m by 5m. The solution you choose is going to depend on the game you're creating and how it's designed, so there isn't a 'silver bullet' which works for every game.
Here are my own requirements for a locomotion solution:
It has to be intuitive, easy, and natural to use. Player instruction should be minimal.
It can't use hardware the player doesn't have. In other words, use out-of-the-box hardware.
It can't make players sick.
It should not detract from the gameplay.
It should not break the game design.
These are some of the locomotion solutions:
Option 1: Design around it. Some notable VR developers have decided that the playable VR area will be exactly the size of the player's play area. "Job Simulator" dynamically resizes the world to fit your playable area, but the playable area isn't much larger than a cubicle. "Hover Junkers" designs around the locomotion problem by having the player stand on a moving platform the size of their playable area. "Fantastic Contraption" does the same thing. These are fine workarounds, and if you never have to solve a locomotion problem... it's never going to be a problem!
Option 2: Teleportation. This seems to be the industry standard for moving around a game world larger than the living room. There are many variations of this technique, but they all work essentially the same way: the player points the motion controller at where they want to go, presses a button, and instantly moves there. One benefit of this method is that it causes very little motion sickness, and you can keep the player's avatar within a valid position in the game world. One huge drawback is that it can break certain game mechanics -- being surrounded by bad guys is much less intense if you can simply teleport out of trouble. You also have a potential problem where players can teleport next to a wall and then walk through it in room space.
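A hedged sketch of the "valid position" check this implies (every name here is made up, not any engine's API): before accepting the teleport, require that the surface the controller points at is roughly horizontal, so the player can't land on walls or steep slopes.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical engine hook, shown for context only:
// bool Raycast(const Vec3& origin, const Vec3& dir,
//              Vec3* hitPoint, Vec3* hitNormal);

// Accept a teleport target only if the hit surface is walkable: its normal
// must point mostly "up" (here, +z), i.e. the slope angle between the
// normal and vertical must stay under maxSlopeDegrees.
bool IsWalkable(const Vec3& surfaceNormal, float maxSlopeDegrees)
{
    const float cosLimit =
        std::cos(maxSlopeDegrees * 3.14159265f / 180.0f);
    return surfaceNormal.z >= cosLimit;
}
```

A fuller version would also check that the destination leaves enough clearance for the avatar's capsule, which helps with the teleport-next-to-a-wall problem.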
Option 3: Rotate the game world. This is a novel technique someone came up with recently: you essentially grab the game world and rotate it around yourself. As you reach the edge of your living room, you turn 180 degrees and then grab the game world and rotate it 180 degrees as well. This *works*, but I anticipate several drawbacks in practice. First, the player has to constantly interrupt their game to manage their position state within VR, which is very immersion breaking. Second, if the player's living room is very small, they will have to grab and rotate the world very frequently -- what portion of the game would players spend just walking around and rotating the world? Third, when the player grabs and rotates the world, they effectively stop moving, so they're constantly stopping and going in VR. This won't work well for games where you have to run away from something, but could work well for casual environment exploration games.
Option 4: Use an additional hardware input device, such as the Virtuix Omni. In principle this is ideal. You don't have to know the direction of the player's torso because locomotion is done with the player's own feet, and it's a familiar locomotion system which requires no training and leaves the hands free. However, there are three critical drawbacks. First, *most* people are not going to have this hardware, so you have to design your VR game for the lowest common denominator in hardware support. Second, I believe it's going to be a lot more physically tiring to constantly run your legs (would anyone feel like a hamster in a hamster wheel?). This puts physical endurance limits on how long a player can play your game (12-hour gaming sessions are going to be rare). Third, the hardware holds your hips in place, so ducking and jumping are not going to be easy. Aside from these issues, this would be perfect, and I look forward to a future where this hardware is prolific and polished.
Option 5: "Walk-a-motion". This is the locomotion technique I recently invented, so I'm a bit biased. I walk about a mile to work every day, and I noticed that when I walk, I swing my arms at my sides. So, my approach is to use arm swinging as a locomotion technique for VR. The speed at which you swing your arms determines your movement speed, in tiers: slow, leisurely swings make you walk at a constant slow pace; faster swings increase your walking speed; pumping your arms much faster makes your character run. This moves the avatar in the direction the player's torso faces, so it's important to know the torso orientation independently of the head orientation. The advantage of this system is that it acts as an input command for the avatar to walk forward, so you automatically get in-game collision detection. To change the avatar's walk direction, you just turn your own torso in the direction you want to walk. You can still walk around within your play area as usual, though it can become disorienting to walk backwards while swinging your arms to move forwards. This does require you to use your arms to move, which significantly limits your ability to "run and gun" in certain games. It also looks kind of stupid to stand in place and swing your arms furiously, but you already look kind of silly wearing an HMD anyway. It's a lot less tiring than running your feet on a treadmill, but also slightly less natural. No additional hardware is required, however, and with no teleportation the player can actually be chased by monsters or run off cliffs.
To get this to work, I have two tracks on either side of the player character. I take the motion controller positions and project them onto these two tracks (using dot products). If the left-hand value is decreasing while the right-hand value is increasing, or vice versa, and the hands are below the waist, we read this as a movement input. I also keep the hand positions over the last 30 frames and average them to smooth out the inputs, so we move based on a running average instead of frame-by-frame inputs. Since we're running anywhere between 75-90 frames per second, this is very acceptable and responsive. It's worth noting that natural arm swings don't actually move in a perfect arc centered on the torso: through testing, I found that my arms move forward about twice as far as they move backwards, and this informs where you place the tracks. I've also experimented with calibrating the arm swings per player, but there is a danger that the player will swing their arms around wildly and completely throw off the calibration. You'll want to keep the movement tracks at the player's sides, so you'll have to either read the motion controller inputs in local space or transform the tracks into a coordinate space relative to the avatar.
A future optimization could be to use traced hand arcs and project the motion controller positions onto them, but after trying to implement it, I realized it was additional complexity without a significant gain.
Option 6: Push-button locomotion. This is by far the simplest locomotion solution: you face the direction you want your avatar to travel and push a button on your motion controller. While it's simple to implement, it has a few limitations. First, you're spending a button on moving forward, and the motion controllers don't have many buttons, so this is quite a sacrifice. Second, the button has two states, up or down, so you're either moving at maximum speed or not moving at all. WASD keyboard controls have the same limitation, but at least it's familiar. If you want the player to use a gamepad instead of motion controllers, you can also use the thumbsticks for lateral and backwards movement. However, I don't recommend gamepads for roomscale VR: the cords are generally not long enough, and you lose the new capabilities of motion controllers.
Option 7: Move on rails: Some games will have the avatar moving on a preset, fixed pathway. Maybe your avatar is walking down a trail at a constant speed, and you only control where they look and what they do? This can work well for certain games such as rail shooters, but it does mean that the freedom of movement and control is taken away from the player.
Option 8: The player rides something that moves. In this case, you might be riding a horse and guiding its movement by pulling on the reins, or riding a magic carpet and steering it by rotating a sphere of some sort. These are really good alternative solutions, though there is one big limitation: you can't really use them convincingly indoors.
Option 9: A combination of everything, plus fakery. If you are very careful about how you design your levels and environments, you can fake it and create the appearance of locomotion where there really isn't any. For example, if the player walks around inside a building, don't let the building's dimensions exceed the play space. If the player exits the building, perhaps they have to get on a horse to cross the street to the next building, or get in a car and drive to the next town over. Maybe when a player enters a new building, the interior is oriented to align with the player's location so that you maximize the walkable space. The trick is to figure out clever ways to keep the player from moving outside the bounds of their play space while giving the appearance of moving vast distances in the VR world -- and to do that, you want to minimize the actual amount of walking the player has to do.
Walking through walls
This problem has challenged a lot of roomscale VR developers. Players know the virtual reality environment is not real, so if there is a blocking obstacle or geometry of some sort and the player can move by walking around their living room, there is nothing stopping them from simply walking through the object in virtual reality. This means walls are merely suggestions. Doors are just decorations, locked or not. Monsters that would block your path can simply be passed through. In a sense, the player is a ghost who can walk through anything in VR, because nothing physically stops them in their play space. This can be dangerous as well: a player in a tall building can decide to walk through the wall, exit the building, and fall (motion sickness!). Some developers create volumes which fade the camera to black when the player passes through blocking geometry. Other VR developers acknowledge the problem but claim it's against a player's psychological instincts to pass through blocking geometry, so they ignore it and let players stop themselves. Through experimentation, I found that intentionally walking through walls in VR has a secondary danger: you forget which walls are virtual and which ones are not. (SteamVR has a chaperone system which draws a grid within VR to indicate where your real-world walls are located.)
I spent a week trying to figure out how to solve this problem, and I can finally say that I have solved it. I can also confidently say that I am an idiot, because it took me so long to find such a simple solution. Let's back up for a moment and examine where we've all been going wrong. The "wrong" thing to do is to read the HMD position every frame and set the avatar to the corresponding position in world space. For example, if your HMD is at [0,0,0] on frame #0 and moves to [10,0,0] on frame #1, you don't set the avatar position to the equivalent world space coordinates. That's what I was doing, and it was wrong! What you actually want to do is get the HMD's movement delta each frame and apply it as an accrued movement input to your avatar. In the example above, the delta is the current frame's HMD position [10,0,0] minus the last frame's [0,0,0], giving [10,0,0]. You then give your avatar a movement input along the resulting direction vector. Head movement becomes a movement input, no different from a WASD keyboard input. If the avatar is blocked by geometry, it won't pass through it, and you can safely pin the HMD camera to the character's shoulders so that it doesn't clip through the geometry either. In effect, you can't stop the player from physically moving forward in their living room, but you can stop the avatar and the HMD camera from moving forward with them. It's slightly unsettling at first, but players quickly learn that they can't move through solid objects in VR and stop trying. Problem solved. It took me a week to figure out where I went wrong, and the solution was so elegantly simple that I felt like the world's dumbest developer (my other attempts were ten times more complicated, but they were a necessary step on the path to the correct solution).
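A minimal sketch of that per-frame delta idea (types and names here are illustrative, not any particular engine's API): instead of snapping the avatar to the HMD's position, hand the frame-to-frame delta to the normal movement system, so collision blocks the avatar exactly as it would block a WASD input.

```cpp
struct Vec3 { float x, y, z; };

Vec3 operator-(const Vec3& a, const Vec3& b)
{
    return Vec3{a.x - b.x, a.y - b.y, a.z - b.z};
}

// Per-frame HMD-driven locomotion: Update() returns the movement *input*
// for this frame rather than an absolute position, so the character
// controller's collision response still applies.
struct HmdLocomotion
{
    Vec3 lastHmdPos{0.0f, 0.0f, 0.0f};

    Vec3 Update(const Vec3& currentHmdPos)
    {
        Vec3 delta = currentHmdPos - lastHmdPos;
        lastHmdPos = currentHmdPos;
        // Feed `delta` into the engine's movement input, e.g. something
        // like AddMovementInput(delta) -- hypothetical call.
        return delta;
    }
};
```

The key property is that the returned delta is a request, not a teleport: if the avatar is pressed against a wall, the movement system rejects the component of the delta that would push through it.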
Conclusion and final thoughts
VR is a very new medium and there are no established hard rules to follow. There are very few (if any) experts, so take everyone's word with a grain of salt. The best way to find out what works is to try things and see. I find it saves a lot of time to spend a good ten minutes thinking through a design in my imagination and playing it out in my head before trying to implement it; it's much faster to anticipate problems in the conceptual phase and fix them before you start implementing. That said, in my experience you really can't do a lot of "up front" design for VR -- there's a high risk that whatever you come up with just won't work right in VR. You'll want a fast-iteration software development life cycle and frequent testing. You'll also want to get as many people as you can to try your VR experience so that you can find trouble areas and blind spots and figure out where your design assumptions are wrong.
You'll also want to monitor your own health carefully. Developing in VR and iterating rapidly between VR and game dev can cause wooziness, and those woozy feelings can slow the pace of your development as you try to mentally recover from the effects. Take breaks!
I have spent the last year and a half developing a game in my spare time in Unity, and I am releasing it soon on Steam. Ant Empire is a strategic remake of some older games, influenced by titles such as Ant Empire and Civilization.
I am currently running a Kickstarter to help fund an AI before launch.
I have attached some images (I tried gifs, but they were too large) to show the current state of Ant Empire, which is nearly complete.
So, initially I was planning to create a base class and some inherited classes like weapon/armour/etc., with each class holding an enum that specifies its type, and everything was going fine until I hit "usable items".
I ended up creating a UsableItem class and tons of inherited classes, like Drink/Apple/SuperApple/MagickPotato/Potion/Landmine/(whatever the player can use), each with unique behaviour. I planned to store items in an SQLite database, but I discovered that there are not many ways of creating variables (pointers) whose type is determined at runtime (and that preferably get their stats/model/icon/etc. from the DB). So, I think I need some variation of the Factory pattern, but I have no idea how I should implement it for this particular case (giant switch/case 😂).
It would be really nice if you could give me some advice on how to manage this kind of problem, or maybe on how I should redesign the inventory.
Inventory storage is an array of pointers. I'm working with CryEngine V, so RTTI can't be used.
class InventoryItem
{
public:
    virtual ~InventoryItem() = default;
    virtual ItemType GetType() = 0;
    virtual string GetName() = 0;
    virtual string GetIcon() = 0;
    virtual void Destroy()
    {
        // TODO: Notify inventory storage
    }
};

class UsableItem : public InventoryItem
{
public:
    virtual CryMT::vector<Usage> GetUsages() = 0;
    virtual void UseItem(int usage) = 0;
};

class TestItem : public UsableItem
{
    int Counter = 0;

    ItemType GetType() override { /* ... */ }
    string GetName() override { /* ... */ }
    string GetIcon() override { /* ... */ }
    CryMT::vector<Usage> GetUsages() override { /* ... */ }

    void UseItem(int usage) override
    {
        CryMT::vector<Usage> uses = GetUsages();
        for (int i = 0; i < uses.size(); i++)
        {
            // ...
        }
    }
};
2019 Asia VR&AR Fair&Summit (VR&AR Fair 2019)
Date: May 9-11, 2019
Venue: China Import&Export Fair Complex, Guangzhou, China
Address: No. 382, Yuejiang Zhong Road, Guangzhou, China
Hosted by: Guangdong Grandeur International Exhibition Group
Co-organized by: Guangzhou Virtual Reality Industry Association, Guangdong VR Industry Alliance, Shenzhen Virtual Reality Industry Federation, Shenzhen Virtual Reality Industry Association
As a thematic exhibition of the China Guangzhou International Leisure & Recreation Expo (GILE), the VR&AR Fair has been held successfully for two consecutive years (twice a year, in Guangzhou and Wuhan), becoming the one and only professional demonstration and trade platform for the VR&AR industry in Guangzhou. Over the years, it has gathered numerous well-known companies from home and abroad to display their latest products and technologies, including JD.COM, 3Glassess, DP VR, Leke VR, PiXYZ Software (France), ICAROS (Germany), Ai6tech (Taiwan), VRway, Hirain, Royole, Super Captain, TPCAST VR, Shenlinqijing, Foldspace, NineD, etc.
VR&AR Fair 2019 is expected to host over 250 exhibitors and will be co-located with the 2019 Asia Amusement & Attractions Expo (AAA 2019) on a show floor of 100,000 sq.m. It will cover sectors such as VR helmet accessories, VR all-in-one machines, interactive multimedia products, immersive games and devices, immersive digital cinemas, virtual walk-through products, auto-stereoscopic (glasses-free) displays, 3D-9D cinema devices, multi-touch devices, AR equipment, AR games, environment modeling technology, realistic real-time sensor rendering technology, amusement equipment, theme park facilities, etc.
2019 Asia Science Museum & Exhibition Hall Facilities Expo
2019 Asia Multimedia Technology & Interactive Projection Expo
The 10th Asia Theater & Filming Equipment Fair 2019
2019 Asia Amusement & Attraction Expo
Why VR&AR Fair 2019?
1. Win face-to-face business opportunities
2. Seek professional buyers
3. Come into contact with business decision makers
4. Maintain existing clients and find new clients
5. Effectively improve brand image & awareness
6. Accurately position your company and brand
7. Grab market shares
8. Get access to the latest industrial development
9. Vigorously expand product and service scope
10. Establish extensive agent and investment attraction network
Discover Your Opportunity at VR&AR Fair 2019!
Director of Guangdong Grandeur International Exhibition Group Tel: 82-20-29806525
Greetings fellow gaming enthusiasts!
I am the Product Manager for a VR game called Funny Farm VR.
We already have a proof-of-concept build of the game published on the iOS, Android, Oculus Rift, Oculus Go, GearVR, Pico and Niburu platforms. Not a bad start, eh?
It's a 3D VR game built on the Unity platform. We are now looking to further develop the game concept and build on what we have to achieve the following:
Introduce a game economy (power ups / rewards / currency)
Add more levels
We need 1 or 2 developers with experience in developing on the Unity 3D platform as well as a 3D artist and animator.
This is a great opportunity to work on a fun game and inject your own ideas / personality into it.
If you're interested in getting involved drop me a message and I'll get in touch.
Looking forward to putting a team together!
The Effekseer Project develops "Effekseer," open-source software for creating visual effects for games. On September 13,
I released "Effekseer 1.4," the latest major version.
Effekseer is a tool to create various visual effects used in games and others.
With Effekseer, you can easily create visual effects such as explosions, light emission, and particles simply by specifying different parameters.
Effekseer's effect creation tool works on Windows and macOS.
The created visual effects can be viewed on Windows, macOS, Linux, iOS, Android and other environments with DirectX, OpenGL and so on.
In addition, there are plugins / libraries for game engines such as Unity and UnrealEngine4 to view visual effects.
Effekseer 1.4 is an updated version of Effekseer 1.3 released in November 2017.
This update contains the following changes:
A renewed UI.
macOS support for the tool.
A function to read FBX files with animation.
Parameters to handle collisions between effects and objects.
Parameters for easier control of effects.
In addition, I have improved the plugins/libraries for Unity, UnrealEngine4, and Cocos2d-x.
Besides that, more than 40 new sample effects have been added and many bugs have been fixed.
Effekseer 1.4 is available on the project website.
The license for the software is the MIT license.