Search the Community

Showing results for tags 'VR'.




Found 61 results

  1. I wanted to start by sharing an amazing making-of video of medical-environment visual effects created with 3dsMax and thinkingParticles, which can be seen here: cebas vimeopro. Please let me know: 1) Is the 3dsMax environment compatible with the main game engines, Unreal and Unity? 2) Has anyone made games that use medical molecular environments? If you wish to get to know the story, see here: Random42 Feature Exclusive Story. Also, follow facebook.com/ScientificVisuals for an upcoming tutorial by Callum Welsh on how to make a blood repair/clotting visual effect.
  2. Welcome to Day 38! Today, we're going to talk about the limitations of mobile VR and make some changes in our game to fix things. We've already started to fix some things, specifically adding event triggers to our enemies, but there are still many more things to solve! Here's a quick list of things I want to tackle from what we encountered 2 days ago.

From technical limitations:
• We can't move
• We only have one input, which is clicking

Some actual technical problems:
• The enemies are all black
• We don't have any of our UIs anymore

We're going to address these problems over the next couple of days. Today, we're going to focus on the technical limitations of mobile VR. Today's priorities are:
• Discussing how to change our game design to accommodate our new limitations
• Implementing our new designs

Edit, Important Note: After playing around with the Cardboard in Unity today and looking at this article about Google Cardboard's inputs, it seems that we don't have to use the Google VR SDK. Unity already has most of the internal integration necessary to make a VR app, and everything we had already works. The reason I initially thought there was a problem is because of how we did raycasting: our raycasting code targeted where our mouse/finger was touching, not the middle of the screen! More on this later.

Step 1: Changing the Game to Fit Our Mobile Limitations

As mentioned before, on the Google Cardboard we have 3 limitations:
• We can't move our character's position
• We only have tapping as an input to interact with the game
• Our cursor will always be in the middle of the screen

Even for the Daydream Viewer, we will have the first 2 limitations. However, with the new Daydream standalone device coming out, we'll have world-space tracking, finally allowing us to track the player's movements without requiring external devices like the Vive does! Anyways, back on topic. Considering these 3 limitations, here are my thoughts on what needs to be changed in our game:
• Because we can't move, we should place our character in a more central location for the enemies to reach us
• Because we can no longer run away, we should make the enemies weaker so that we don't get swarmed
• Because we only have one input, we can shoot but we can't reload, so we should get rid of the reload system

Essentially, we're going to create a shooter with our player in the center and enemies coming from all around us.

Step 2: Implementing Our New Designs

Now that we have everything we want to do planned, let's get started on the actual implementation!

Step 2.1: Placing the Character in the Middle

Let's place the character in the middle of where our spawn points are set. After playing around with it, I think the best spot would be at Position: (100, 1, 95).
• Select Player in our hierarchy.
• In the Transform component, set our Position to be X: 100, Y: 1, Z: 95.

Step 2.2: Making the Enemies Weaker

Next up, let's make the enemies weaker. In the Enemy Health script component attached to our Knight, Bandit, and Zombie prefabs, let's change their health values. In order of size from largest to smallest, we have: Zombie > Knight > Bandit. Let's set the health to match:
• Zombie: 4 HP
• Knight: 2 HP
• Bandit: 1 HP

Here's how we change the health:
• In Assets > Prefabs select our prefabs; in this case, let's choose Zombie.
• In the Inspector, select the Enemy Health (Script) component and change Health to be 4.
• Make the same change to the other 2 prefabs.
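For reference, here's a minimal sketch of what an Enemy Health script along these lines might look like. Only the Health field and the TakeDamage(float) function are the names used in this series; the death handling is an illustrative assumption:

    using UnityEngine;

    // Minimal sketch of an enemy health component. Health is the value we set
    // in the Inspector above (4/2/1); everything past the health check is assumed.
    public class EnemyHealth : MonoBehaviour {
        public float Health = 2;

        public void TakeDamage(float damage) {
            Health -= damage;
            if (Health <= 0) {
                // Placeholder: a full script would play a death animation, award score, etc.
                Destroy(gameObject);
            }
        }
    }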
Step 2.3: Remove Our Ammo System

Now it's time to go back to the Player Shooting Controller (Script) component that we disabled yesterday. I want to keep the animation and sound effects that we had when shooting our gun; however, I'm going to get rid of the ammo and the need to reload. Here are my changes:

    using UnityEngine;
    using System.Collections;

    public class PlayerShootingController : MonoBehaviour {
        public float Range = 100;
        public float ShootingDelay = 0.1f;
        public AudioClip ShotSfxClips;
        public Transform GunEndPoint;
        //public float MaxAmmo = 10f;

        private Camera _camera;
        private ParticleSystem _particle;
        private LayerMask _shootableMask;
        private float _timer;
        private AudioSource _audioSource;
        private Animator _animator;
        private bool _isShooting;
        //private bool _isReloading;
        //private LineRenderer _lineRenderer;
        //private float _currentAmmo;
        //private ScreenManager _screenManager;

        void Start () {
            _camera = Camera.main;
            _particle = GetComponentInChildren<ParticleSystem>();
            Cursor.lockState = CursorLockMode.Locked;
            _shootableMask = LayerMask.GetMask("Shootable");
            _timer = 0;
            SetupSound();
            _animator = GetComponent<Animator>();
            _isShooting = false;
            //_isReloading = false;
            //_lineRenderer = GetComponent<LineRenderer>();
            //_currentAmmo = MaxAmmo + 10;
            //_screenManager = GameObject.FindWithTag("ScreenManager").GetComponent<ScreenManager>();
        }

        void Update () {
            _timer += Time.deltaTime;

            // Create a vector at the center of our camera's viewport
            //Vector3 lineOrigin = _camera.ViewportToWorldPoint(new Vector3(0.5f, 0.5f, 0.0f));
            // Draw a line in the Scene View from the point lineOrigin in the direction of
            // fpsCam.transform.forward * weaponRange, using the color green
            //Debug.DrawRay(lineOrigin, _camera.transform.forward * Range, Color.green);

            if (Input.GetButton("Fire1") && _timer >= ShootingDelay /*&& !_isReloading && _currentAmmo > 0*/) {
                Shoot();
                if (!_isShooting) {
                    TriggerShootingAnimation();
                }
            } else if (!Input.GetButton("Fire1") /*|| _currentAmmo <= 0*/) {
                StopShooting();
                if (_isShooting) {
                    TriggerShootingAnimation();
                }
            }

            /*if (Input.GetKeyDown(KeyCode.R)) {
                StartReloading();
            }*/
        }

        private void StartReloading() {
            _animator.SetTrigger("DoReload");
            StopShooting();
            _isShooting = false;
            //_isReloading = true;
        }

        private void TriggerShootingAnimation() {
            _isShooting = !_isShooting;
            _animator.SetTrigger("Shoot");
            //print("trigger shoot animation");
        }

        private void StopShooting() {
            _audioSource.Stop();
            _particle.Stop();
        }

        public void Shoot() {
            //print("shoot called");
            _timer = 0;
            Ray ray = _camera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f)); //_camera.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit = new RaycastHit();
            _audioSource.Play();
            _particle.Play();
            //_currentAmmo--;
            //_screenManager.UpdateAmmoText(_currentAmmo, MaxAmmo);
            //_lineRenderer.SetPosition(0, GunEndPoint.position);
            //StartCoroutine(FireLine());

            if (Physics.Raycast(ray, out hit, Range, _shootableMask)) {
                print("hit " + hit.collider.gameObject);
                //_lineRenderer.SetPosition(1, hit.point);
                //EnemyHealth health = hit.collider.GetComponent<EnemyHealth>();
                EnemyMovement enemyMovement = hit.collider.GetComponent<EnemyMovement>();
                if (enemyMovement != null) {
                    enemyMovement.KnockBack();
                }
                /*if (health != null) {
                    health.TakeDamage(1);
                }*/
            }
            /*else {
                _lineRenderer.SetPosition(1, ray.GetPoint(Range));
            }*/
        }

        // called when the reload animation finishes
        /*public void ReloadFinish() {
            _isReloading = false;
            _currentAmmo = MaxAmmo;
            _screenManager.UpdateAmmoText(_currentAmmo, MaxAmmo);
        }*/

        private void SetupSound() {
            _audioSource = gameObject.AddComponent<AudioSource>();
            _audioSource.volume = 0.2f;
            _audioSource.clip = ShotSfxClips;
        }

        public void GameOver() {
            _animator.SetTrigger("GameOver");
            StopShooting();
            print("game over called");
        }
    }

I've kept what I commented out above; here's the clean version of our script:

    using UnityEngine;
    using System.Collections;

    public class PlayerShootingController : MonoBehaviour {
        public float Range = 100;
        public float ShootingDelay = 0.1f;
        public AudioClip ShotSfxClips;
        public Transform GunEndPoint;

        private Camera _camera;
        private ParticleSystem _particle;
        private LayerMask _shootableMask;
        private float _timer;
        private AudioSource _audioSource;
        private Animator _animator;
        private bool _isShooting;

        void Start () {
            _camera = Camera.main;
            _particle = GetComponentInChildren<ParticleSystem>();
            Cursor.lockState = CursorLockMode.Locked;
            _shootableMask = LayerMask.GetMask("Shootable");
            _timer = 0;
            SetupSound();
            _animator = GetComponent<Animator>();
            _isShooting = false;
        }

        void Update () {
            _timer += Time.deltaTime;

            if (Input.GetButton("Fire1") && _timer >= ShootingDelay) {
                Shoot();
                if (!_isShooting) {
                    TriggerShootingAnimation();
                }
            } else if (!Input.GetButton("Fire1")) {
                StopShooting();
                if (_isShooting) {
                    TriggerShootingAnimation();
                }
            }
        }

        private void TriggerShootingAnimation() {
            _isShooting = !_isShooting;
            _animator.SetTrigger("Shoot");
        }

        private void StopShooting() {
            _audioSource.Stop();
            _particle.Stop();
        }

        public void Shoot() {
            _timer = 0;
            // Fire the ray from the center of the camera's viewport instead of the mouse position.
            Ray ray = _camera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
            RaycastHit hit = new RaycastHit();
            _audioSource.Play();
            _particle.Play();

            if (Physics.Raycast(ray, out hit, Range, _shootableMask)) {
                print("hit " + hit.collider.gameObject);
                EnemyMovement enemyMovement = hit.collider.GetComponent<EnemyMovement>();
                if (enemyMovement != null) {
                    enemyMovement.KnockBack();
                }
            }
        }

        private void SetupSound() {
            _audioSource = gameObject.AddComponent<AudioSource>();
            _audioSource.volume = 0.2f;
            _audioSource.clip = ShotSfxClips;
        }

        public void GameOver() {
            _animator.SetTrigger("GameOver");
            StopShooting();
            print("game over called");
        }
    }

Looking Through the Changes

We removed a lot of the code that was part of the reloading system, basically any mention of ammo and reloading; however, I kept everything involved with the shooting animation, shooting sound effects, and shooting rate. There were only 2 changes made:
• I changed the input we use to shoot from GetMouseButton to GetButton("Fire1"). I believe this is the same thing, but I'm making the change anyway. Either option returns true when we're touching the screen on our mobile device.
• I also changed the Ray in our raycasting system. Previously we cast a ray from wherever our mouse was located, which on desktop we had locked to the center of the screen. On mobile, nothing fixes the cursor to the middle, so we needed a new way to target the center: instead of firing the raycast from the mouse position, we now fire it from the middle of our camera's viewport, which fixes the problem on our mobile device.

Go ahead and play the game now. We should have a playable game. There are 2 things that will happen when we shoot:
• We'll shoot a raycast, and if it hits an enemy, they'll be pushed back
• The enemy's event trigger will detect that we clicked down on the enemy, so they'll take some damage

At this point, we have a problem: if we hold down on the screen, we'll push the enemy back, but they'll only be hit once!
That's because we only have an event that handles OnClick; nothing fires while the user is still holding their selection on the enemy. We're going to fix this problem tomorrow (one possible approach is sketched at the end of this post), but I've done a lot of investigation work with raycasts today and want to take a break!

Step 2.4: Changing the ScreenManager Script

One more thing we need to do before we leave. The Unity compiler will complain about a missing reference in our ScreenManager, specifically the MaxAmmo variable that we got rid of. Let's just get rid of it:

    using UnityEngine;
    using UnityEngine.UI;

    public class ScreenManager : MonoBehaviour {
        public Text AmmoText;

        void Start() {
            PlayerShootingController shootingController = Camera.main.GetComponentInChildren<PlayerShootingController>();
            //UpdateAmmoText(shootingController.MaxAmmo, shootingController.MaxAmmo);
        }

        public void UpdateAmmoText(float currentAmmo, float maxAmmo) {
            AmmoText.text = currentAmmo + "/" + maxAmmo;
        }
    }

And we're good to go! Technically speaking, we won't be using this script anymore either.

Conclusion

And another day's worth of work has ended! I learned a lot about VR today, such as: we don't need ANYTHING that the Google VR SDK provides! Unity as a game engine already gives us everything we need to make a VR experience; Google's SDK is more of a utility kit that helps make implementation easier. The TLDR of what I learned today is that we aren't locked into using Google's raycasting scripts; we don't need them, and we can continue to use what we already have. However, for the sake of learning, I'm going to continue re-implementing our simple FPS with the Google Cardboard assets! We'll continue tomorrow on Day 39! See you then! Day 37 | 100 Days of VR | Day 39 Home
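An aside on the held-down problem above: one possible way to receive press-and-hold input (a sketch under assumptions, not necessarily the fix the series adopts on Day 39) is to implement the Unity EventSystems pointer-down/up handler interfaces and apply damage on a timer while the pointer is held:

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Hypothetical sketch: damages this enemy repeatedly while the pointer is held
    // on it, instead of only once per click. Assumes an EnemyHealth component with
    // TakeDamage(float), as used in this series.
    public class EnemyHoldDamage : MonoBehaviour, IPointerDownHandler, IPointerUpHandler {
        public float DamagePerTick = 1;
        public float TickDelay = 0.5f;

        private bool _isHeld;
        private float _timer;

        public void OnPointerDown(PointerEventData eventData) {
            _isHeld = true;
            _timer = TickDelay; // fire the first tick immediately
        }

        public void OnPointerUp(PointerEventData eventData) {
            _isHeld = false;
        }

        void Update() {
            if (!_isHeld) { return; }
            _timer += Time.deltaTime;
            if (_timer >= TickDelay) {
                _timer = 0;
                GetComponent<EnemyHealth>().TakeDamage(DamagePerTick);
            }
        }
    }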
  3. Writer's Note: Sorry for the lack of updates, I'm not dead yet! I went on a short vacation and got horribly sick. I'm hoping to continue as I was going before. Welcome to Day 37! Today, things are finally going to get serious in working with VR! Currently, there are a lot of problems with the app from when we launch it. Some are just limitations and others are actual problems. However, before we go in and fix everything that we encountered yesterday, today we're going to add the Google VR SDK into our game. Today we're going to:
• Set up wireless debugging so we can both debug and receive console print statements from our game
• Remove the old scripts that we'll replace with VR
• Add the Google VR SDK into our game
• Set up a simple Event Trigger

Today, we're going to start working in VR, so let's not waste time and get to it!

Step 1: Setting Up Wireless Debugging/Consoles

Before we do anything, one of the most important things we should do is set up remote debugging, or at the very least the ability to have our console statements sent to Unity untethered. Currently, in development mode, we only get console logs from our Android device if our phone is connected to our computer. That wire would become too limiting if we had to do something like spin around in a circle. To fix this, we're going to set up wireless debugging so our phone can send data remotely to Unity. We're going to follow Unity's documentation on attaching a MonoDevelop debugger to an Android device. The instructions are straightforward, so I'll just leave the link to them. In our current state, because we have no way of restarting the game from inside the game, we must rebuild and run every single time we want to see the console wirelessly. Once we re-add the ability to restart the game, wireless debugging will be more useful.

Step 2: Removing Old Scripts that Need VR Replacements

It's finally time to start working in Unity! Before we do anything, let's think about what the Google VR SDK gives us and what we must get rid of in our current system that conflicts with it. The main things the Google VR SDK provides are:
• The ability to move the camera with our head
• Its own raycasting system

What we need to remove from our game:
• The ability to move our character
• The ability to move our camera
• The ability to shoot
• The crosshair UI

Luckily for us, this process is going to be fast and easy. First, let's remove our ability to move:
• In our game hierarchy, select the Player game object.
• Select the little cog in the top right-hand corner of the Player Controller (Script) and select Remove Component.

Next, let's get rid of the camera following our mouse:
• Select Player > Main Camera.
• Remove our Mouse Camera Controller (Script) component.

After that, let's get rid of the shooting script. We're going to come back later and re-purpose this script, but that's going to be for a different day:
• Select Player > Main Camera > MachineGun_00.
• Disable the Player Shooting Controller (Script). We're still going to need this.

Finally, let's get rid of the crosshair. As you recall, when we add the VR SDK, we get a gaze prefab that already adds a cursor for us.
• Select HUD > Crosshair and delete it from our hierarchy.

When we're done, we'll have a completely unplayable game! Yay….

Step 3: Adding the Google VR SDK

Recalling the Google Cardboard demo, for our game we'll need to add:
• GvrEditorEmulator, to simulate head movement
• GvrEventSystem, to use Google's event system for dealing with raycasting
• GvrReticlePointer, for our gaze cursor
• GvrPointerPhysicsRaycaster, the raycaster that Google VR uses to hit other objects

The setup for this will also be very straightforward:
• Drag GvrEditorEmulator from Assets > GoogleVR > Prefabs > GvrEditorEmulator into the hierarchy.
• Drag GvrEventSystem from Assets > GoogleVR > Prefabs > EventSystem into the hierarchy.
• Drag GvrReticlePointer from Assets > GoogleVR > Prefabs > Cardboard to be a child of Main Camera.
• Select GvrPointerPhysicsRaycaster.cs from Assets > GoogleVR > Scripts and attach it to our Main Camera.

Now with these prefabs and scripts in, we can rotate and look around our game by holding Alt. We can also shoot raycasts with our VR raycaster; however, right now our enemies don't have an Event Trigger set up that will detect them getting hit. Let's do that!

Step 4: Setting Up an Event Trigger

Before we end today, I want to make a simple event trigger that allows us to defeat an enemy. Luckily for us, we already have the function available! Specifically, inside our Enemy Health script, we have a function that we call to damage an enemy. Let's set this up. For now, we're only going to change our Knight enemy. Here's what we're going to do:
• Select our Knight prefab in Assets > Prefab > Knight.
• Add an Event Trigger component to our prefab.
• Click Add New Event Type to select what type of event we want to listen for. Select PointerClick.
• Now click + to add the object whose scripts we want to access. Drag our Knight prefab into the empty Object slot.
• Then we need to select the function to call: EnemyHealth > TakeDamage(float).
• Set the float value we pass in to 1.

When we play our game now, if our gaze focuses on an enemy and we click, we'll shoot him! There are still things we're missing, like the push back, but we can start focusing on the rest of that tomorrow! Now let's do the same to the rest of our prefabs: Bandit and Zombie! (A code equivalent of this Event Trigger setup is sketched after this post.)

Conclusion

There we have it! Our first dive into doing some work with VR. It turns out that right now there's a lot less code that needs to be written; instead, a lot of it is just putting prefabs and scripts in the correct locations so our game will work. Either way, now we have a game that is playable. Tomorrow, we're going to discuss what changes we should make for a better VR experience. Or at the very least, to make it as good as it was before we tried to VR-ify it! Phew, it's been a long day, I'll see you all tomorrow on day 38! Day 36 | 100 Days of VR | Day 38 Home
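As referenced in Step 4 above, the same Event Trigger wiring can also be done from code instead of through the Inspector. A minimal sketch (the component name is hypothetical; EnemyHealth.TakeDamage(float) is the function set up above):

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Hypothetical code equivalent of the Inspector steps in Step 4: adds an
    // Event Trigger that listens for PointerClick and calls EnemyHealth.TakeDamage(1).
    public class EnemyClickSetup : MonoBehaviour {
        void Start() {
            EventTrigger trigger = gameObject.AddComponent<EventTrigger>();
            EventTrigger.Entry entry = new EventTrigger.Entry();
            entry.eventID = EventTriggerType.PointerClick;
            entry.callback.AddListener((eventData) => {
                GetComponent<EnemyHealth>().TakeDamage(1);
            });
            trigger.triggers.Add(entry);
        }
    }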
  4. For the past few years in a row, mobile games have dominated the app stores in revenue, download numbers and engagement. No other app niche has shown such a huge level of activity and such soaring numbers as mobile games. Naturally, mobile games have also been the most happening niche in respect of new technologies and innovations. From Augmented Reality and Virtual Reality games to wearable gaming, new technologies are continuing to shape the gaming experience. Mobile game marketing has also come of age and is now more mature than ever before. The era of so-called 'freemium' games, gated features and in-app monetisation tactics looks commonplace, and every game marketer is on the lookout for a new way to market and generate revenue. Considering all these evolving attributes, 2018 promises to be a happening year for mobile games. Let us introduce some of the key trends that will shape mobile game development in 2018.

1. VR and AR gaming. When Pokémon GO was released in 2016 and literally took the world by storm with its never-before-seen gaming experience, many experts didn't think twice about marking that year as the rise of VR and AR games. But that was just the beginning of the kind of mobile games that will continue to dominate the gaming scene for mobile and wearable users in the years to come. The success of Pokémon GO actually became a motivating factor for game developers worldwide, and such reality-defining games continued to appear, making the scene even more competitive. Though 2017 did not see any new era-defining AR or VR games like Pokémon GO, both technologies have been consolidated and have become mainstream as the gaming technologies of the future.

2. Mobile games for the elderly. Certainly, we no longer consider mobile games to be a child's plaything. They are equally for elderly people, grown-up youths and people of all ages. For the past several years we have seen hordes of game apps come up targeted at the elderly population or people outside of the common game-playing demographics. In 2017, there were hundreds of games for the elderly, for working men and women, and for all other age groups. With many such games enjoying the favour of a devoted game-playing audience, this trend is not going to stay dormant in the time to come.

3. Wearable gaming. If we are not terribly wrong, wearable devices can easily be predicted to be the next mass-mobilising device platform after smartphones. Naturally, mobile gaming is also supposed to shift some of its load to wearable gaming apps. Even if mobile games remain popular because of their large-screen gaming experience, the quick-to-play gaming experience of smartwatches will continue to gain in popularity. Moreover, by offering an extended gaming experience from the mobile device to smart wearables, they will help people stay engaged with a game irrespective of the device.

4. Social gaming. Social gaming is already a hot trend, as most mobile games are keen to promote the gaming experience by allowing players to invite their friends onboard for gameplay. From a game of pool to the most complex strategy games, most games these days allow players to invite their friends on social media. Moreover, quick social registration for mobile games is already helping games gain access to more people through the social contacts of the player. By incentivising social gaming in many ways, a game app can further push players to engage their friends on social media.

5. Game consoles getting outdated. Game consoles like the PlayStation and Xbox are still there, and it is not a coincidence that they are actually getting better with the rise of mobile gaming. In spite of getting better and richer to compete with mobile games, gaming consoles, because of their expensive price tags and difficulty of handling, will attract fewer and fewer people from the game-playing audience. Mobile gaming, with its high-end sophisticated experience and extreme ease of use, will continue to hold a charm that no other gaming device really can. With the unprecedented rise of mobile gaming in recent times, game consoles are only becoming less competitive options for the gaming audience.

6. Custom game features. We are already aware of the power of customisation for engaging an audience. Custom features make a player feel important, and that is something almost everyone likes. It is no different when it comes to mobile games. Mobile games that allow players to choose the features they find enjoyable for their game-playing experience will obviously give them more reasons to stick to a particular game. Custom game features that allow players to shape their own gaming experience have been very popular with mobile games this year.

7. Multichannel access. Average smartphone users have access to several gaming devices simultaneously, which is why allowing unperturbed gameplay across multiple devices and platforms has become very important. Game developers also find it helpful to build cross-platform games to boost the chances of getting discovered easily across app stores. While engaging game players continuously irrespective of the device in use is one of the most important considerations for the marketing success of a game, allowing unperturbed streaming of the game across devices is crucial.

8. A renewed interest in retro games. There has been a renewed interest in old-style mobile games, at least in their look and feel. Dubbed retro games, this new breed of games is reintroducing the look and feel of older mobile games. This new approach, allowing young players to have the gaming experience of a different generation, became quite popular throughout the year.

In summation: in 2017 we saw several definitive game trends unfurl, opening new scope for marketers and developers. Among these trends, the ones mentioned above seem to have a further-reaching echo than the rest.
  5. Oculus CTO John Carmack has posted the latest Public VR Critique: Daedalus, running on the Gear VR. Daedalus is a platformer and exploration game set in a surrealist world. Carmack appears to really enjoy the game, mentioning that at "OC4, one of the VR apps presented for my review session was Daedalus...I played through the first couple training segments on stage, giving some of my common feedback, but what was noteworthy was that I didn't want to quit - I had to make myself stop to get on to the rest of the slate lined up." In the critique, Carmack covers Daedalus' gameplay, artistic feel, and the pros and cons of the art style in mobile VR. He also reviews performance on the Gear VR. Check out the full critique here.
  6. Welcome back to Day 36! Yesterday we set up our mobile device to be able to play our VR app, and there's nothing quite like that experience, especially if you're the one that "made it". If you've made it this far in the journey but you haven't tried using the Cardboard or other VR devices, I highly recommend trying it once! Now… with my pitch out of the way, let's talk about what we're going to do today! We've finally started working in VR; today, we're going to try to convert our simple FPS into a VR experience. This will be a multi-day process. Today, I want to:
• Do some asset clean-up for our app so we don't have to spend forever building
• Set up our Google VR tools
• Play our game

Getting the Project GitHub Repository

Before we get started: I realize that not everyone (or anyone) followed me step by step to get to Day 36. Our simple FPS game has reached a complete prototype, so I've decided to make a GitHub repository of the game before we start adding the VR components in, specifically after doing the manual clean-up work in Step 1. Now anyone can start following along to build our VR application. You can find my Simple VR First Person Shooter GitHub repository here.

Step 1: (FAILED/SKIP) Clean Up Unused Assets

If you haven't tried switching to the Android platform in our simple FPS game, you might notice that it… takes… FOREVER. I've tried building the app (and I waited 10+ minutes). Thank goodness for the Unity previewer; I would not know what to do if I had to build my app on my phone every single time! Luckily, I found that Google/Unity does have a solution for this: Google has Instant Preview, which loads the game onto our phone while it also plays in the editor. The bad news is that this only appears to be for the Daydream Viewer, so I'm going to hold off for now. However, let's see what we can do to optimize this! When I say optimize, I really mean get rid of our unused assets! Looking at what we have, we have 1 GB worth of files! That's not good!

IMPORTANT NOTE: Unfortunately, this didn't exactly work. I tried to export all our dependencies and then import them into a new project, and there were some problems. It turns out things like Layers and Tags do not get preserved, so if we wanted everything to work, we had to manually add everything back in. Instead, I used the files I exported into a new project as a reference and then manually removed assets from a copy of our project (that's what I get for not having source control!). Also, from comparing build times before and after, I believe that unused assets DO NOT affect our build time or runtime, so the only useful things that came out of Step 1 were that I:
• Cleared some clutter so we can find files more easily now
• Made the project smaller so people can download it from GitHub faster, so not a complete loss!

Step 1.1: Exporting Our Assets

Let's get rid of our old unused files! How? A quick search turns up a way to get rid of unused assets in Unity. All we need to do is:
• Select our scenes, which in this case is just Main
• Right-click and click "Select Dependencies"
• Export our assets by going to Assets > Export Package… and save the package somewhere. Warning: this will not export Editor scripts and plugins, which in our current situation is not a problem.

Now at this point, we have 2 choices:
• Delete everything and re-import the assets that we exported, or…
• Create a new project and import our exported project

I'm going to take the 2nd choice and make a new project.

Step 1.2: Importing Our Exported Assets

We're going to create a new project and import everything we just exported. To do that:
• Select File > New Project…
• Call your project whatever you want, but I'm going to call mine First Person Shooter VR

And now we'll have a fresh new Unity project. Now we need to re-import everything we exported:
• Go to Assets > Import Package > Custom Package and find the .unitypackage that we just created
• Afterwards, just choose to re-import everything

Step 2: Set Up Our VR Tools and Settings

The next thing we need to do after we have our new project is import the Google VR SDK and configure Unity. I'm going to be lazy and just refer us to Day 35: Setting Up Google Cardboard In Unity. Do the exact same thing in downloading the SDK and setting up the Unity configuration. Note: in the repo, I already included Google VR SDK 1.100.1 and the necessary changes to the Player Settings. I assume the Player Settings are project-based and not computer-based, but if they're not, follow the instructions in Day 35.

Step 3: Playing Our Game

At this point, we should be almost ready to play our game! So far we have:
• Imported the Google VR SDK
• Switched to the Android platform
• Configured our Player Settings to the appropriate settings to run a Google Cardboard app

The last thing we need to do that's specific to our game project: when we try to build in Build Settings…, we run into a problem asking us to resolve incompatibilities between the Color Space and the current settings. To fix this, we just need to change our Color Space from Linear to Gamma (a scripted version of this fix is sketched after this post). To do that:
• Go to File > Build Settings > Player Settings > Other Settings > Color Space
• Change Linear to Gamma

With this setting in, we're ready to build our game. I recommend running a development build of the app. Contrary to what the name suggests, a development build DOES NOT make our build faster; instead, it gives us access to useful debugging settings, like the Profiler and Script Debugging. Now once you're ready, let's build our game!
• Make sure your phone is connected to your computer
• Go to File > Build Settings…
• Enable Development Build
• Click Build And Run

You can save the APK anywhere. Now enjoy the long wait~!

Conclusion

That's it for today! Today we cleaned up a bit of our project and then set up the game so that we can run our app directly from our phone. The build is long and horrendous, which is unfortunate. There are a couple of solutions available, but I'm going to look at them some other day. We can also play the game directly from Unity. If we were to play the game right now, we'd encounter problems. From technical limitations:
• We can't move
• We can't reload anymore

To actual technical problems:
• The enemies are all black
• We don't have any of our UIs anymore

I'll have to investigate these things and solve them one at a time, but I can finally say: we're here! We're finally working in VR! That's what we're going to tackle over the next couple of days. It's going to be a fun one! Day 35 | 100 Days of VR | Day 37 Home
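As mentioned in Step 3 above, the Color Space and development-build changes can also be applied from an editor script rather than through the menus. A minimal sketch (the class and menu item are hypothetical; PlayerSettings and EditorUserBuildSettings are the standard UnityEditor APIs behind those menus):

    // Editor-only helper: place this file under an "Editor" folder.
    using UnityEditor;
    using UnityEngine;

    public static class CardboardBuildSetup {
        [MenuItem("Tools/Apply Cardboard Build Settings")]
        static void Apply() {
            // Resolve the build incompatibility by switching Linear to Gamma.
            PlayerSettings.colorSpace = ColorSpace.Gamma;
            // Development builds enable the Profiler and Script Debugging.
            EditorUserBuildSettings.development = true;
            Debug.Log("Applied Cardboard build settings.");
        }
    }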
  7. Yesterday we looked at how we can work with VR, went through the basic demo, and understood how everything worked. Today, we're going to look at how we can install our game directly onto the phone. To do everything today, we need to have:
• A phone that supports Android API level 19 (KitKat)
• A USB to Micro-USB (or Type-C for newer devices) cable
• (Optional) Google Cardboard

Today we're going to:
• Install the Android SDK so we can build and run our app
• Install a USB driver so our computer can detect our phone
• Set up our phone to be in developer mode
• Build and run the demo app on our phone

With all that being said, let's get started! Today we'll be following Unity's Android SDK setup guide.

Step 1: Install the Necessary Android Software

Since we're building our VR app for Android, we need the required software to compile, build, and run our app on our phone:
• Download and install the latest Java SDK to run Android Studio
• Download and install Android Studio

You might have to restart your computer for it to recognize the new Java SDK that you installed. When we're done downloading and installing Android Studio (which will take a LONG time), we want to open the SDK Manager; in a new project, we can find the SDK Manager under Configure. Under SDK Platforms, select the platform we want to support; in this case it's Android 4.4 for Cardboard and Android 7.0 for Daydream, though I believe if you install the newest version, it'll work for both. Under SDK Tools, install:
• Android SDK Platform-Tools
• Android SDK Tools
• Google USB Driver, if you have a Nexus device

With all of this, we should now have everything we need to build our game onto our Android device.

Step 2: Install a USB Driver to Detect Our Phone

The next part (and probably the part I hate the most) is installing a USB driver that allows our computer to detect our phone. Go to Google's documentation on where to find the appropriate OEM USB driver for your phone and install it. With any luck, your computer should successfully recognize your phone when you plug it in. If not, I refer you to Google the problem, as there are too many possibilities for what could have gone wrong.

Step 3: Change Your Phone to Developer Mode

Now that our computer can connect to our mobile device, the final thing we need to do is put our phone in developer mode so Unity (or Android) can create the app and install it on our phone. The instructions to enable Developer Mode vary depending on your phone, and a quick Google search should give you what you need. However, the most common approach these days is:
• Go to Settings > About phone > Build Number
• Tap the build number 7 times to enable Developer Mode
• Now under Settings, you should find Developer options. Go into Settings > Developer options and turn on USB Debugging

Hopefully, with this step completed, we can finally move on to our configuration in Unity!

Step 4: Configuring Unity to Build and Run Our Android App

Now that our phone is ready, it's time to finally build our game in Unity:
• Make sure that your phone is connected to your computer
• In Unity, go to File > Build & Run to create an APK file (our app) and install it on our phone

In a perfect world, that's it, we're done: enjoy our VR game! Unfortunately, there are always problems we might encounter:
• Your API is at the wrong level
• You're missing a Bundle Identifier
• Failed to compile resources: "major version 52 is newer than 51, the highest major version supported by this compiler"

The 1st and 2nd problems can be resolved easily. The first problem arises because we need to target a minimum version of Android that has the software we need to run our VR application. In Player Settings under Other Settings, in Minimum API Level, select API Level 19 for Google Cardboard support or API Level 24 for Google Daydream. If you choose API Level 24, just make sure that your phone can run Daydream! For the second problem, every Android app has a unique identifier that Google uses to identify the app. The error we're getting is Unity telling us that we're using the default identifier and should change it. In Player Settings under Other Settings, in Package Name, change the string to something else. Just make sure you follow the convention of <companyname>.<appname>. In our case, it doesn't matter what it is; we can put anything we want. (A scripted version of these two fixes is sketched after this post.)

Now for the third and final problem. This one is more interesting. Most likely your error is something like this:

    Failed to compile resources with the following parameters:
    -bootclasspath "C:/Users/JoshDesktop/AppData/Local/Android/android-sdk\platforms\android-24\android.jar"
    -d "C:\Users\JoshDesktop\git\Cardboard\Temp\StagingArea\bin\classes" -source 1.6 -target 1.6 -encoding UTF-8
    "com\google\android\exoplayer\R.java" "com\google\gvr\exoplayersupport\R.java" "com\google\gvr\keyboardsupport\R.java"
    "com\google\gvr\permissionsupport\R.java" "com\google\vr\cardboard\R.java" "com\google\vr\keyboard\R.java"
    "com\Josh\Chang\R.java" "com\unity3d\unitygvr\R.java"
    warning: C:\Users\JoshDesktop\AppData\Local\Android\android-sdk\platforms\android-24\android.jar(java/lang/Object.class):
    major version 52 is newer than 51, the highest major version supported by this compiler.
    It is recommended that the compiler be upgraded.
    warning: C:\Users\JoshDesktop\AppData\Local\Android\android-sdk\platforms\android-24\android.jar(java/lang/AutoCloseable.class):
    major version 52 is newer than 51, the highest major version supported by this compiler.

What all of this is saying is that our Java is out of date and we need at least Java SDK 8u52. In my case, I previously had 8u51 installed, and when I installed version 8u52, Unity didn't pick up on the changes. To fix this:
• Go to Edit > Preferences > External Tools and, under Android, select JDK and choose the path to your newest JDK
• For me, on my Windows machine, it was located at C:\Program Files\Java\jdk1.8.0_152

With all of this done, hopefully you should be able to successfully build and run the GvrDemo on your phone + Google Cardboard, if you have one.

Conclusion

Hopefully, this was a useful guide to getting your Android device set up to play the scene. Leave a comment if you run into problems and I'll try to help and update this article with any new information. On a different note, it's truly amazing playing with VR on our own mobile device. Just playing the VR game from Unity was interesting, but words can't describe how much more realistic and interesting it becomes once you strap your phone onto your face! I think at this point we have a good understanding of the basics and of what is and isn't possible with the Google Cardboard. Tomorrow we're going to look at how we can incorporate the VR SDK into our simple FPS game to see how our game fares in VR! Day 34 | 100 Days of VR | Day 36 Home
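As noted above, the first two build problems can likewise be fixed from an editor script instead of the Player Settings UI. A minimal sketch (the class, menu item, and package name are hypothetical; applicationIdentifier and Android.minSdkVersion are the standard UnityEditor properties behind those Inspector fields):

    // Editor-only helper: place this file under an "Editor" folder.
    using UnityEditor;

    public static class AndroidPlayerSetup {
        [MenuItem("Tools/Apply Android Player Settings")]
        static void Apply() {
            // Problem 2: replace the default package name; use <companyname>.<appname>.
            PlayerSettings.applicationIdentifier = "com.example.cardboarddemo"; // hypothetical name
            // Problem 1: Cardboard needs API level 19+, Daydream needs 24+.
            PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel19;
        }
    }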
  8. Now that we have made a conscious decision to work in VR, today I finally had the chance to play around with VR in Unity. Today we're going to explore setting up and using Google's VR SDK. You might think that setting up VR would be an extremely complex process, but after going through it, I can say that starting out is simpler than you would think! Here's our objective for today:
• Setting up support for Google Cardboard in Unity
• Going through Unity's tutorial

Let's get started!

Step 1: Setting up Google Cardboard in Unity

For today, I'll be going through Google's documentation for setting up VR in Unity. The nice thing about the Google VR SDK is that we can re-use most of the prefabs and scripts that are used with Google Cardboard with Google Daydream as well. That's 2 platforms for the price of one. Today I'll be following Google's official documentation on getting started with Unity.

Step 1.1: Install Unity

At this point, I'm going to assume that we all have a recent version of Unity (5.6+). To be able to run our VR app, we're going to need Android Build Support; if you don't have that installed already, re-download Unity and during the installation process choose to include Android Build Support.

Step 1.2: Adding the Google VR SDK

After we have Unity set up correctly with Android Build Support, we need to get Google's VR assets. Download Google's VR SDK here; we're looking to download the .unitypackage. After we have the package downloaded, it's time to add it to a Unity project. For our case, we're going to create a new project to play around with:
• In Unity, create a New Project (File > New Project…)
• Once inside our new project, import everything from the package that we downloaded. In Unity we can import by going to Assets > Import Package > Custom Package.

Step 1.3: Configuring Unity to Run VR

Now that we have imported everything we need, the last thing to do is change some of our settings in Unity so we can run our game.

Change our Build Settings to build and run Android applications. The first thing we need to do is get Unity to run our game project on the Android platform:
• Open the Build Settings by going to File > Build Settings
• Select Android and hit Switch Platform
• Wait for Unity to finish re-packaging our assets for the new platform

Change our Player Settings to support VR. The next thing we need to do is change our Player Settings so that we can support the specific VR SDK that we want; in our case, it's Google Cardboard:
• In Build Settings, next to Switch Platform, we have Player Settings; select it
• In Player Settings, enable Virtual Reality Supported and then add Cardboard to our Virtual Reality SDKs
• Finally, in Minimum API Level, select API Level 19 as the minimum Android version players' devices must have. Google Cardboard requires a minimum of level 19, and the Google Daydream Viewer requires a minimum of level 24.

Once we have everything installed, we can finally get started on taking a first look at working with VR!

Step 2: Looking Through the Unity Tutorial

Now that everything is configured, we can officially start looking through Google's SDK basics. I went through the SDK basics while also going through the GVRDemo scene. In our new project, go to Assets > GoogleVR > Demos > Scenes and open GVRDemo. Google provides prefabs and scripts that take care of the VR features for you. These are all located in Assets > GoogleVR > Prefabs and Scripts.
Here's a breakdown of what they and the scripts attached to them do:
• GvrEditorEmulator prefab: allows us to control our camera like we might control it with our headset. Hold the Alt key to rotate your view around the camera.
• GvrControllerMain prefab: gives us access to the Daydream controller, with which we can implement actions through Google's controller API to interact with the game.
• GvrEventSystem prefab: enables us to use Google's input pointer system; specifically, how our gaze/controller interacts with and selects objects.
• GvrPointerGraphicRaycaster script: like a normal Graphic Raycaster that we would attach to a UI canvas so that we can interact with our UI using our input devices (gaze or controller).
• GvrPointerPhysicsRaycaster script: shoots a raycast directly from the middle of our screen to select something when we decide to click. We should attach this to our main camera, and we must also attach Unity's Event Trigger system to each object we want to interact with when we select it.
• GvrControllerPointer prefab: this is the Daydream controller. It gives us an arm asset to imitate our controller. This prefab must be a sibling of the Main Camera object where we attached our GvrPointerPhysicsRaycaster.
• GvrReticlePointer prefab: this is the Google Cardboard gaze controller. It creates a dot in the middle of our screen which we use to select objects in the game. For this prefab to work, we must make it a child of the Main Camera game object.

There are quite a few other prefabs and scripts, but at a high level, these are the basics we'll need to make a VR game. Let's see this in action with the GvrDemo scene!

Step 2.1: Looking at the Demo Scene

I suggest that you explore the scene and the objects in our hierarchy, but as a high-level summary, here's what we have in our hierarchy that's relevant just to the Google Cardboard (the scene has Daydream assets too):
• GvrEditorEmulator for us to emulate head movement in VR
• GvrEventSystem for Unity to detect our VR inputs when we select an object
• Inside Player > Main Camera, we have our GvrPointerPhysicsRaycaster script, which allows us to use Google's raycasting system for 3D objects
• Inside the Floor Canvas game object, we have the GvrPointerGraphicRaycaster for us to interact with the UI
• Finally, inside Player > Main Camera > GvrReticlePointer, we have our gaze cursor for Google Cardboard that we use to interact with the game world

The main point of this demo is to click on the cube that appears in the game. When we click on the cube, it's randomly moved somewhere else in the game. The interesting part of all of this is how we can trigger the code with our gaze. Let's look at the Cube and Unity's Event Trigger system. The Event Trigger system is a way for Unity to recognize an action taken on the game object that the Event Trigger is registered to. An action is something like:
• OnPointerClick
• OnPointerEnter
• OnPointerExit

In our example, OnPointerClick is triggered whenever we click on an object that has the Event Trigger attached to it. Here's the teleport script:

    // Copyright 2014 Google Inc. All rights reserved.
    //
    // Licensed under the Apache License, Version 2.0 (the "License");
    // you may not use this file except in compliance with the License.
    // You may obtain a copy of the License at
    //
    // http://www.apache.org/licenses/LICENSE-2.0
    //
    // Unless required by applicable law or agreed to in writing, software
    // distributed under the License is distributed on an "AS IS" BASIS,
    // WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    // See the License for the specific language governing permissions and
    // limitations under the License.

    using UnityEngine;
    using System.Collections;

    [RequireComponent(typeof(Collider))]
    public class Teleport : MonoBehaviour {
        private Vector3 startingPosition;

        public Material inactiveMaterial;
        public Material gazedAtMaterial;

        void Start() {
            startingPosition = transform.localPosition;
            SetGazedAt(false);
        }

        public void SetGazedAt(bool gazedAt) {
            if (inactiveMaterial != null && gazedAtMaterial != null) {
                GetComponent<Renderer>().material = gazedAt ? gazedAtMaterial : inactiveMaterial;
                return;
            }
            GetComponent<Renderer>().material.color = gazedAt ? Color.green : Color.red;
        }

        public void Reset() {
            transform.localPosition = startingPosition;
        }

        public void Recenter() {
    #if !UNITY_EDITOR
            GvrCardboardHelpers.Recenter();
    #else
            GvrEditorEmulator emulator = FindObjectOfType<GvrEditorEmulator>();
            if (emulator == null) {
                return;
            }
            emulator.Recenter();
    #endif  // !UNITY_EDITOR
        }

        public void TeleportRandomly() {
            Vector3 direction = Random.onUnitSphere;
            direction.y = Mathf.Clamp(direction.y, 0.5f, 1f);
            float distance = 2 * Random.value + 1.5f;
            transform.localPosition = direction * distance;
        }
    }

We can ignore what most of the code does, but the important things I want to bring attention to are the public functions that are available:
• SetGazedAt()
• Reset()
• Recenter()
• TeleportRandomly()

Where are these called? If you look back at the Event Trigger that's created on the Cube, we set 3 event types:
• Pointer Enter
• Pointer Exit
• Pointer Click

Whenever any of these events occurs, we call one of those public functions. In this example, when we look at our cube, we trigger the Pointer Enter event and call the SetGazedAt() function with the variable gazedAt set to true. When we look away, we trigger the Pointer Exit event and call SetGazedAt() with gazedAt set to false. Finally, if we were to click on the cube, we would trigger the Pointer Click event and call TeleportRandomly() to move our cube to a new location. (A code-based alternative to this Event Trigger setup is sketched after this post.)

Conclusion

It's surprising how un-complex this whole process is so far! I'm sure there are a lot more things to consider once we dive deeper into Unity, but for today, I think the progress we have made is sufficient. Tomorrow, we're going to look at how we can get the demo app to run on a phone that supports a Google Cardboard (which I assume at this point is 99% of you guys here). Day 33 | 100 Days of VR | Day 35 Home
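As referenced at the end of the Event Trigger discussion above, the same three pointer events can also be received in code by implementing the EventSystems handler interfaces instead of wiring an Event Trigger component in the Inspector. A hypothetical sketch against the Teleport script shown above:

    using UnityEngine;
    using UnityEngine.EventSystems;

    // Hypothetical alternative to the Cube's Event Trigger component: the same
    // three pointer events handled via interfaces, forwarding to the public
    // functions on the Teleport script above.
    [RequireComponent(typeof(Teleport))]
    public class CubePointerHandler : MonoBehaviour,
            IPointerEnterHandler, IPointerExitHandler, IPointerClickHandler {

        private Teleport _teleport;

        void Start() {
            _teleport = GetComponent<Teleport>();
        }

        public void OnPointerEnter(PointerEventData eventData) { _teleport.SetGazedAt(true); }
        public void OnPointerExit(PointerEventData eventData) { _teleport.SetGazedAt(false); }
        public void OnPointerClick(PointerEventData eventData) { _teleport.TeleportRandomly(); }
    }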
  9. A South Korean indie game. This video shows real alpha gameplay. LAST HILLS is a game I am developing, inspired by horror movies, especially The Conjuring and Evil Dead. LAST HILLS aims to be a high-quality, realistic, and frightening game with a mood that makes you feel you are inside it, and it will become an even more realistic horror experience through VR. Thank you! If you want to know more about Last Hills, follow us on Facebook: https://www.facebook.com/Lasthills/ Twitter: https://twitter.com/LastHillsGame Official website: https://www.lasthillsgame.com/
  10. This month, the Sydney Game Engine Developers Meetup has organised two speakers to talk about Native VR and Scapy. Join fellow developers for a night of engaging talks on native virtual reality, get some professional insights on Python, and meet other brilliant minds in our industry over pizza and drinks! The growing community has welcomed fantastic people from a vast variety of development backgrounds in recent months. The level of engagement and networking is at an all-time high, and I really do wish to keep this up. See you there! Link to event: https://www.meetup.com/Sydney-Game-Engine-Developers/events/243542793/
  11. Epic Games today announced it has teamed with NVIDIA to deliver enterprise-grade solutions to help application developers create better, more immersive VR experiences. Enterprise businesses have been among the most enthusiastic adopters of VR to improve production workflows, visualize CAD designs, boost safety and training, and create better experiences for customers. This requires sophisticated and reliable VR platforms and tools.

To ease enterprise VR adoption, Epic has integrated NVIDIA Quadro professional GPUs into the test suite for Unreal Engine 4, the company's real-time toolset for creating applications across PC, console, mobile, VR and AR platforms. This ensures NVIDIA technologies integrate seamlessly into developers' workflows, delivering excellent results for everything from CAVEs and multi-projection systems through to enterprise VR and AR solutions.

"With our expanding focus on industries outside of games, we've aligned ourselves ever more closely with NVIDIA to offer an enterprise-grade experience," said Marc Petit, general manager of the Unreal Engine Enterprise business. "NVIDIA Quadro professional GPUs empower artists, designers and content creators who need to work unencumbered with the largest 3D models and datasets, tackle complex visualization challenges, and deliver highly immersive VR experiences. By combining NVIDIA hardware with Unreal Engine, developers are ensured excellent performance and productivity."

One project that has driven this effort is Epic's collaboration with GM and The Mill on "The Human Race," a real-time short film and mixed-reality experience featuring a configurable Chevrolet Camaro ZL1, which was built using NVIDIA Quadro pro graphics. Another company that has benefitted from Epic's collaboration with NVIDIA is Theia Interactive, a pioneer of real-time architectural visualization. Stephen Phillips, CTO of Theia, added, "NVIDIA Quadro provides an incredible amount of computing power for running beautiful VR experiences within Unreal Engine. Having 24GB of VRAM allows us to use hundreds of high-resolution, uncompressed lightmaps for incredible real-time architectural visualizations that rival the quality of the finest offline renderers."

"As the market for VR and AR content expands, professional developers in industries such as automotive, architecture, healthcare and others are using Unreal Engine to create amazing immersive experiences," said Bob Pette, vice president of Professional Visualization at NVIDIA. "Unreal, from version 4.16, is the first real-time toolset to meet NVIDIA Quadro partner standards. Our combined solution provides leaders in these markets the reliability and performance they require for the optimum VR experience." Learn more at http://unrealengine.com/enterprise.
14. Microsoft will deliver the closing keynote at the Develop:VR conference and expo taking place on 9 November at Olympia London. The keynote, entitled Windows Mixed Reality: Developing for the Virtual Continuum and presented by Pete Daukintis, Technical Evangelist, and Mike Taulty, Developer Evangelist, will focus on bringing existing developers' skills and apps to Windows Mixed Reality devices such as HoloLens and the forthcoming immersive headsets. The session will review the current Windows Mixed Reality landscape, followed by an in-depth demonstration of where these devices excel, as well as teaching prospective developers what they need to know regarding the platforms, tools, frameworks, runtimes, languages and resources available to them. Sessions also confirmed today for Develop:VR:

Funding Question Time: A panel consisting of Martin de Ronde, creative director at Force Field VR, Dave Ranyard, founder of Dream Reality Interactive, and Thomas Gere from Realities Centre, chaired by Ella Romanos, founder of Rocket Lolly Games, will discuss their experiences of getting their latest projects funded, the new trends in funding and how to think outside the box to make your next funding round a success.

VR and Moral Panic: New forms of media can sometimes trigger sensationalised fear and worry from the mass media and then the public – a phenomenon that sociologists describe as 'moral panic'. VR is the perfect moral panic case study. Catherine Allen, VR Curator and Producer, will demonstrate how the mere knowledge of a new technology's existence can bring out society's deep-down hopes and fears about humanity's future.
15. NVIDIA Releases VRWorks for UE4.17

    NVIDIA has released the UE4.17 integration of VRWorks on the VRWorks GitHub. VRWorks brings NVIDIA VR technologies to Unreal Engine 4 games through a suite of APIs, libraries, and engine integrations delivered via custom UE4 branches on GitHub. Learn more at https://developer.nvidia.com/nvidia-vrworks-and-ue4.
16. The term "virtual reality" has been around for some time now. Unless you have had your head stuck in the sand, you have likely heard that the technology is heading toward mainstream adoption. As with most innovations in the tech world, consumer virtual reality devices are expected to make waves in 2017, so there is no better time to get familiar with this new trend of virtual discovery in the industry.

What is Virtual Reality? Using computer software and hardware, virtual reality is an emerging technology designed to give users the experience of a Virtual Environment (VE). Originally it was conceived as a digital space that humans enter by donning special computer equipment. With this technology, VR app development companies can create information that users can access easily. VR provides users with a dynamic and immediate means of seeing and experiencing information differently. Basically, the technology seeks to create a simulated, 3-dimensional world around users in which they can easily interact with environments, people, and objects. Typically, VR app development companies present 3D life-sized images around users, supported by audio devices; the user's head or eye movements modify the perspective. Along with the computer, several other devices can be used to create a virtual environment. When it comes to virtual reality, it is essential to note that not all wearable screen technologies are designed to function the same way; they are not created equal to start with. Wearable screens are steadily evolving into three distinct domains:

- Virtual reality (VR)
- Mixed reality (MR)
- Augmented reality (AR)

The term virtual reality was first introduced by computer scientist Jaron Lanier in 1987, and since then it has served as a kind of umbrella term for the reality space. Unfortunately, with the rollout of all the impending products, there is every tendency for virtual reality to become the least popular in the industry.

How VR can be employed: In a bid to replace the real-world environment with a digital version, a VR app development company will always seek to immerse the human senses in the virtual environment. Basically, a user wears special goggles, earphones, and gloves to enter the visual environment. Immersion is one common process employed by app development companies to shut out all cues from the physical world; the experience is engineered to help users lose themselves in the VE. Depending on the mobile app platform, a virtual environment can be developed to different extents, ranging from a smartphone screen, to an external display unit, to a fully Immersive Virtual Environment (IVE). Unlike traditional media such as video games and television, a VR app development company may choose to make the experience more immersive and interactive by tracking the user and rendering the whole scene accordingly. Here are some of the major components of a virtual environment:

- Enabling devices, such as tracked gloves with pushbuttons, that let users effectively specify their interactions with virtual objects.
- A sound system capable of producing simulated sound fields as well as high-quality directional sounds.
- A database construction and maintenance system capable of creating and maintaining a comprehensive and practical model of the virtual world.
- A tracking system that regularly indicates the position and orientation of the user's movements.
- A graphics rendering system that produces an accurate display of the constantly changing images at 20-30 frames per second.
- Visual displays that place the user in the virtual environment and shut out contradictory sensory impressions from the real world.

How Virtual Reality Works: When it comes to virtual reality app development, it is very important to understand how things are done and what effects they have. To get a feel for how VR works, try playing 'Counter-Strike'. You will discover that, along with the interfaced input-output devices, there is another crucial element running on the computer system: the software program. VR app development cannot be complete without writing code for characters and environments. Based on the written code, every character within the VR environment is expected to behave very closely to reality; both characters and environments are driven by the code so as to ensure smooth interaction with the other characters being manipulated by the input devices. Note that the processor, which is responsible for handling the input-output devices, interprets this code. When building games through VR app development, it is worth understanding that, along with a high-performance processor, a number of advanced input and output devices are also required to increase immersion. The same holds for a more immersive virtual reality environment: commands provided by the user are quickly executed by the processor, while the output is rendered to users in such a way that they feel themselves to be part of the environment and its objects. VR app development basically makes use of display-based, projection-based or screen-based methods to superimpose 3D images on the real-life environment. A display methodology that enables users to clearly witness 3D scenarios is employed to achieve this. Along with the sound system, a head-mounted device or a display screen of high quality in terms of color and resolution can be used to support the screen-based virtual environment. Apart from the 3D visualization components, other gear is used for input, such as joysticks, gesture recognition systems, finger trackers, head-tracking sensors, microphones, and keyboards. Users can feel and see how they control objects and environments on screen by pressing keys on the keyboard, moving their head, or moving the joystick. Every input is processed by a high-speed, powerful processor, and the interface to input devices connected to the system, such as the keyboard and mouse, is supported by an Application Programming Interface (API). A minimal sketch of this track-and-render loop follows below.
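To make that loop concrete, here is a minimal, self-contained C++ sketch of the track/render cycle. It is illustrative only: pollTracker is a made-up stand-in that fakes a head pose with sine waves rather than reading real sensor hardware, and the "render" step is just a printout, but the shape of the loop (poll the tracker, update the viewpoint, hold a 20-30 fps cadence) matches the components listed above.

```cpp
// Minimal track/render loop sketch. Assumptions: no real HMD or tracker
// is attached; the pose is simulated and "rendering" is a console report.
#include <chrono>
#include <cmath>
#include <cstdio>
#include <thread>

struct HeadPose { float yaw; float pitch; };   // what a tracking system reports

// Hypothetical stand-in for polling the tracking hardware.
HeadPose pollTracker(float t) {
    return { 0.5f * std::sin(t), 0.25f * std::cos(t) };
}

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::milliseconds(40);  // ~25 fps, inside the 20-30 fps target
    const auto start = clock::now();
    for (int frame = 0; frame < 100; ++frame) {
        float t = std::chrono::duration<float>(clock::now() - start).count();
        HeadPose pose = pollTracker(t);            // 1. read head orientation
        // 2. re-render the scene from the new viewpoint (here: just report it)
        std::printf("frame %3d: yaw %+.2f rad, pitch %+.2f rad\n",
                    frame, pose.yaw, pose.pitch);
        std::this_thread::sleep_for(frameBudget);  // 3. hold the frame cadence
    }
    return 0;
}
```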
17. So basically, this is about the future of VR gaming; mainly, what if the NerveGear were real, with the ability to recreate realistic environments and haptics. Some of you may have watched, read or heard about an anime and manga called Sword Art Online. In it, there is a device called the NerveGear. The catch in the show was that if you died playing SAO, you died in real life, because of the way it was programmed. But in real life, that's not going to happen. Basically, here is an idea for a real-life version of the NerveGear and what its capabilities and games would be. I have been hearing that experiencing all the senses in the NerveGear would be difficult, and the idea is difficult as well. But I am partial to the idea, as I absolutely love video games and famous franchises. So imagine this: a version of Pokemon for the NerveGear, in which the entire region of Kanto is simulated with tens of millions of NPCs and hundreds of gyms, including 8 large shopping-mall-like gyms where you get the gym badges. The game is cel-shaded like the Pokemon games to distinguish it from reality, yet all the senses can be experienced in it. The thing is, you don't want it too realistic if it's Pokemon, because even though some Pokemon look good realistic, some look creepy realistic. So it would be an evolution from Pokemon Go in that sense. The capability, I reckon, is to create a simulation of the Earth with billions of NPCs, possibly even larger; possibly entire galaxies like in No Man's Sky and Elite. Basically I want a machine that can push things to the limit. Do you think quantum computing will help? The thing is, virtual reality must mean that everything you dream of is possible to do, including exploring the galaxy or universe like in No Man's Sky (which may get VR support sooner rather than later), or living in a world different from ours that is realistically or unrealistically populated, like in Ready Player One or Sword Art Online: "A World of Pure Imagination", as Willy Wonka and the Ready Player One trailer would say. I'm sorry for rambling, but it's just my theory (A GAME THEORY) on the future of video games.
18. Joe air-taps Dan's picture to video call him. Dan picks up the call and appears in front of Joe as a life-size hologram, floating in the air. In very sci-fi style, Joe drags the hologram, rests it on a table and pins it. The call wasn't very different from a regular Skype video call, except that it was taking place in Augmented Reality (AR) between Microsoft HoloLens headsets rather than between PCs or smartphones. The Head Mounted Display (HMD) worn by Joe made the audience believe that he was about to demo some upcoming Virtual Reality (VR) tech from Microsoft. The demo ended in huge applause and gave the audience a sneak peek at what video calls of the future will look like. VR and AR are two futuristic technologies that are going to change the way we humans perceive technology. It's natural for app developers to look at these technologies with great hope. While AR technology has been in the works for a long time and is relatively commonplace in mobile apps today, the credit for reviving VR goes to the 2012 Kickstarter project "Oculus Rift: Step Into the Game" by the then-unknown startup Oculus. Facebook later acquired Oculus for $2 billion and inspired (perhaps forced) Google to make inroads into VR technology.

Google vs Facebook: the next platform war. Google, rather than developing a standard PC-connected VR device like the Oculus Rift, decided to leverage the well-established mobile ecosystem it controls, which led us to Google Cardboard and, later, Daydream. Google's setup involved a VR kit consisting of an HMD and a smartphone. Google released three SDKs for developing Cardboard apps on various platforms: Android, Unity, and iOS. The SDKs triggered the first set of VR applications developed for smartphones, and the world hasn't looked back since.

Perspective Reality: Cardboard's successor Daydream, owing to only a handful of Daydream-ready phones and the Daydream View headset costing many times as much as the Cardboard, is far from a success. But it's the only native VR SDK available for a mobile platform, with Apple conspicuous by its absence in this field, bringing Facebook and Google to the verge of a VR supremacy war. In case you were wondering, unlike its predecessor, Daydream doesn't support iOS, at least not yet.

Is it Apple vs Google again? Both Google Tango and Apple ARKit look promising but are yet to reach their full potential. This might give rise to another platform war between the two tech giants. Apple ARKit supports every iPhone 6s and 7 out there and is a clear winner here. Google Tango at this stage supports a couple of handsets by Asus and Lenovo, neither Pixel-s nor Galaxy-ies. Nevertheless, iOS and Android app developers determined to include either of the technologies in their upcoming apps, in pursuit of futureproofing them, have plenty of paths to take, at least when they are thinking AR. Daydream may not look like the overly capable project Google hyped it to be after all, but it's the only feasible platform to develop mobile VR apps on; there is no need to look elsewhere. VR and AR can add value to any app regardless of its category. But how do you choose between the two when developing an app or a game?

Games: It's hard to decide between the two when developing games. AR and VR both tend to blur the lines between the real and virtual worlds. However, VR looks like the missing block in games that draw upon 'reality'. First Person Shooters (FPSs), today, are growing closer to reality with real-life graphics, spine-chilling sound effects and frantic animations. On top of that, FPSs increasingly include AI engines and physics engines to give the gamer a perception of reality. However, all the action takes place on a screen placed at a distance from the viewer, which leads to a substantial loss of quality by the time the images travel to the viewer's eyes, broadening the gap between perception and reality. In VR, the screen inside the HMD sits directly in front of the viewer's eyes, giving users the perception that they are not playing the game but in it. Spider-Man: Homecoming VR Experience is a fun and thrilling first-person game if you have a capable PC and either an Oculus Rift or HTC Vive; or else you can try VR Roller Coaster on Google Cardboard. If your game needs to interact with real-world locations (think Pokémon Go), AR is what you need. Otherwise, VR is the way to go.

Video streaming apps: For the reason described above, a piece of video content shot in 360 degrees is sure to leave your viewers awestruck. They can move around and turn while wearing the HMD and actually see what is happening behind the action. AR doesn't look like a very good option in this category of apps. Apple developers looking at ARKit with great hope: sorry!

Video calling: The Skype call made on HoloLens is, I believe, the best rendition of Augmented Reality to date, not as a gimmick but as a technology that actually makes otherwise boring and dull video calls interesting and useful at the same time. AR is suitable for video calling apps because you need to see the world on either side of the call; VR would cut you off on your side of the call. I am not saying VR is not happening in video calling at all, but the HoloLens demo suddenly makes developing standalone AR video calling apps make so much sense. Imagine interacting with the world around the person you're on a call with, annotating objects of interest and zooming them in and out while the call is still running and they are interacting with yours.
19. There are studios selling applications that simply copy 3D graphics content from another application and re-render it in a new window, especially for CAVE virtual reality experiences. The user opens Revit, CAD or any other 3D application and opens a model; then, when the user selects the rendered window, the VR application copies the 3D model information from the OpenGL window. I got the clue that the VR application replaces the Windows opengl32.dll file. How is this possible? How can we copy the 3D content from the current OpenGL window? Anyone, please help me with how to go further to create an application like a VR CAVE. Thanks
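For what it's worth, the opengl32.dll replacement mentioned above works by interposition: the substitute DLL exports the same entry points as the real OpenGL library, records the arguments as draw calls pass through, then forwards every call to the genuine implementation, so the host application never notices. Below is a hedged, minimal sketch of the same idea on Linux, where LD_PRELOAD takes the place of DLL replacement; the file names (glspy.cpp, libglspy.so) are invented for the example, and a real CAVE capture tool would read back the bound vertex and index buffers rather than just log the call.

```cpp
// glspy.cpp: intercept glDrawElements via symbol interposition.
// Build:  g++ -shared -fPIC -o libglspy.so glspy.cpp -ldl
// Run:    LD_PRELOAD=./libglspy.so ./some_opengl_app
#ifndef _GNU_SOURCE
#define _GNU_SOURCE            // for RTLD_NEXT
#endif
#include <dlfcn.h>
#include <cstdio>

// Local typedefs so this sketch needs no OpenGL headers.
using GLenum  = unsigned int;
using GLsizei = int;

extern "C" void glDrawElements(GLenum mode, GLsizei count,
                               GLenum type, const void* indices) {
    // Resolve the real driver entry point once, on first use.
    using Fn = void (*)(GLenum, GLsizei, GLenum, const void*);
    static Fn real = reinterpret_cast<Fn>(dlsym(RTLD_NEXT, "glDrawElements"));

    // A CAVE-style tool would capture the currently bound vertex and index
    // buffers here, before the host application draws them.
    std::fprintf(stderr, "[glspy] glDrawElements: %d indices\n", count);

    if (real) real(mode, count, type, indices);  // forward to the real GL
}
```

On Windows the equivalent is a proxy opengl32.dll that exports and forwards the full set of GL/WGL entry points (usually via generated stubs), which appears to be exactly what the applications you describe are doing.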
20. Just a few years ago, you could not play a video game using your body movements from a distance. That changed over the span of the last few years, and now you can fight alongside your game avatar with real body movements. You have entered the era of virtual reality. More than for any other industry, VR came as a harbinger of change for the gaming industry. In the recent past, we have seen the launch of an array of VR devices and gadgets, including VR headsets, VR game-playing shoes, etc. Is this the ultimate promise we can expect from VR in games? No; rather, it is just the beginning for VR, which will soon leave behind these stand-alone devices and hardware to become more compatible and affordable. As a game development company, you must know how virtual reality is changing gaming.

More VR gadgets at more affordable prices: When talking about the imminent change that virtual reality games are expected to bring to the gaming world, we must take into account the huge competition among VR device brands and the falling prices and increasing affordability of these devices. Yes, for too long VR devices represented an expensive gaming niche that only wealthy gamers could afford. But with more brands entering the VR scene, prices have continued to drop, making the devices affordable for more people. They are still not mass products within easy reach of everyone, but they are slowly getting there. Until a few years ago you could name all the major players in the market for VR gadgets and headsets, including pioneers such as Oculus Rift, Sony PlayStation, and HTC Vive. Now more than a dozen players have unleashed sophisticated VR gadgets or are preparing to do so. This has given rise to fierce competition in terms of feature offerings as well as price. While virtual reality is defined by its ability to take you to an entirely different world, augmented reality (now more commonly referred to as "mixed" reality) augments the one in which you're living. With Microsoft's HoloLens, a self-contained computer rather than one powered by a separate device, you can watch an actual wall deform in real time and begin to spew out spiders; and as you walk around, that hole in the wall stays locked in place. It is, as far as your brain is concerned, there. After Microsoft's ambitious HoloLens, the VR scene is eagerly awaiting another major launch: Google Daydream, which is expected to arrive with a single-handed controller. Most importantly, some of the newest mobile phones have arrived with VR capability built in; the Axon 7 and Moto Z are good examples of handsets with built-in VR support.

Deeper into simulated reality: Virtual reality gaming was conceived to let us play games in a simulated atmosphere, that is, to play a digital game with real-life interaction and environment. A game environment transporting the gamer to a life-like virtual reality was the quintessential aspect of VR games. It was limited to the most popular game niches involving strategy and action, but as VR games get more popular, other game niches where the environment plays a vital role are going to be transported to virtual reality as well. Google Daydream is expected to make online VR casino games possible. With this trend settling in, we may later see other arena games and e-sports also releasing VR versions, reaping the advantages of the technology to drive more engagement. VR will continue to dig deeper into simulated reality to give many games a life-like environment.

Social and collaborative gaming is going to take over: Social gaming is already a robust and popular trend. Now, with VR games becoming an everyday phenomenon for gamers, VR is bound to hit the social space as well. Already most top game titles allow players to collaborate with their friends and other players online; VR games allowing the same collaborative play will only help VR reach more players. With the huge promise and possibility of collaborative virtual reality games, a whole new breed of games could soon sweep the web. With VR devices and headsets continuing to become more affordable, we can expect a huge upsurge in games in which collaboration and social interaction play a vital role.

Time for interactive gaming: Finally, virtual reality in the gaming world is no longer going to be set aside as a niche, specialised gaming technology. Instead of being a niche technology for the few, it is going to string together many established and upcoming gaming trends. Other technologies that also stretch the sense of reality and broaden the interaction of games with the surrounding world are going to be part of this offering. This means we can expect some devices and games to come equipped with both VR and AR capabilities. This new approach, widely dubbed mixed reality, will push the horizon of VR and AR games further. They are not going to be two separate technologies anymore; together they are going to enrich gaming interactions even more by allowing a mix of virtual and augmented reality environments. In the future, you may not need a headset to play a VR game: a game screen could be created anywhere while you play with gestures instead of touch. You could play a game right on your coffee table, single-handedly, with gestures, while still sipping your coffee. With VR and AR together offering a mixed game reality, that day is not far away.

To conclude, the gaming world is really going to experience a revolution, with so much happening and a lot more on the way with virtual reality. Virtual reality, which until now has mainly been limited to sophisticated, high-end headsets, will soon become an everywhere gaming reality, further blurring the division between the real world and the game environment.
21. Stuart Whyte, director of VR product development for Sony London Studios, will deliver this year's keynote for Develop:VR. Whyte, who has close to 30 years' experience in the video games industry, starting life as an adventure columnist for Amstrad Action magazine, will open Develop:VR with his keynote entitled 'Taking VR to the Next Level – A Case Study in AAA Games Development'. In his keynote, Whyte will discuss the current state of the VR industry and the challenges and opportunities of AAA VR development, and suggest potential solutions to maximise success for any sized studio. As well as the keynote, Tandem Events also announced the following sessions, with more to be revealed:

Improve your brain – the real value of VR/AR gaming (Faviana Vangelius, SVRVIVE)
Pioneers in the Desert – The Reality of Developing for Virtual Reality (Andrew Willans, CCP Games)
Love your Limitations: Defining Art for Mobile VR (Anna Hollinrake, Climax Studios)
Collaborating with Brands to Create Magical VR (Brynley Gibson, Kuju)
Serious VR, Making Real Money (Tanya Laird, Digital Jam)
Drop Deadline – Delivering a Visually-Excellent, 60fps, Narrative Mobile Shooter to a Fixed Deadline with a Small Team (James Horn, Pixel Toys)
Getting Up Close and Virtual with the Automotive Industry: Using VR for the Right Reasons (James Watson, Imagination)
Haptics and VR – Touching Virtual Worlds (Anders Hakfelt, Ultrahaptics)

Develop:VR is a one-day conference and expo focusing on the commercial opportunities that Virtual and Augmented Reality present for today's game developers and highlighting the tools and techniques needed to produce top-selling VR and AR content. Develop:VR takes place on 9 November at Olympia London, and delegate passes can be bought at a Super Early Bird rate, a saving of £100, until 20 September at www.developvr.co.uk.