Showing results for tags 'VR'.
Found 77 results

  1. Hello, my name is Olivier Girardot. I am a music composer and a sound designer. I am working with a friend and great artist, Gero Doll, on a VR project called DreamArtVR. Gero is doing all the design and programming, while I am creating the sound. Gero has by far the hardest part. Plus, he is not a programmer, so he has been struggling a lot to achieve this first level, and there are some imperfections, but I think he did a great job. We would very much appreciate it if you could try the demo and give us your honest opinion. We know there are a lot of things to improve, but we hope we are on the right path to creating something unique and interesting for players. We are also hoping to get some financial help to finish this game and hire a programmer. We are looking forward to reading your impressions of this game. Here is the video teaser: And here is the webpage where you can download the playable demo: https://limbicnation.itch.io/dreamartvr Thank you for your time! Olivier Girardot http://www.ogsoundfx.com http://www.ogmusik.com http://www.limbicnation.com/
  2. TL;DR: I'm a software engineer focused on VR game dev. I'm looking for partners who want to work together as a legitimate game studio making quality 3D games.
About me:
- I live in Los Angeles, CA
- I'm a software engineer and I've built apps from A to Z
- In my previous job, I did native mobile app development for numerous Android VR apps
- I'm experienced with music composition/production: https://soundcloud.com/ryancomposer/march-of-nobility
- I have a blog documenting my journey as a game developer: http://notherverse.com/
- I take care of legal, networking, accounting, marketing, and development responsibilities
- I use Blender for modeling, Substance Painter for textures, and UE4 for the game engine
- I'm experienced with video editing and use Adobe Premiere Pro + After Effects
- I enjoy grinding on weekdays and hitting the bars on weekends
My strongest area is programming, so I'm looking for designers and modelers. You don't need any experience in the VR space, nor any VR hardware. I'm fully open to splitting ownership/profits/royalties based on contribution. You must have a portfolio or something else to illustrate your experience level. Respond to this thread if interested. Thanks! Though I'm already working on a project, I'm open to working on something else that we're all enthusiastic about. Here's my current project, a VR mech game using UE4 (see progress at my blog: http://notherverse.com/):
  3. Title: Heroes & Legends
Style: MMO & RPG
Required Knowledge: Blender, MakeHuman, Krita, GIMP, PixPlant
Stage: Content Creation
Description: I am breaking this down into a piece-by-piece team-building effort. I need artists. I am a software engineer by profession. I have spent a few years working on modeling, character design, and item design. Currently I am working on a human with detailed rigging. There are format rules, and you must understand Unreal units: 1 unit = 1 centimeter. The starting height of the human is 5'6". The character rig is very detailed and will be assembled in Unreal according to specific format guidelines, so you must be capable of understanding the format for skeletal structures I am producing, so that when this is on the Unreal Store we can continue to expand and implement content. I need texture artists and concept artists. Here are some benefits I will be providing: access to pluralsight.com, access to unlimited current books, plus the basic sounds, a basic server, and GitHub repositories. If you have imagination and want to contribute to something astounding for the modern-day PC, then please contact me and let's start this. In the long term, on completion of the project, the company will be moving to a focus on military combat simulations similar to Jane's years back. So you must be willing to stick with the team in the long term once we move our creations (world generation, GIS, character creation) to strategic combat simulators. This is just a stepping stone. The game will feature music scores from Versus and Ivan Torrent, paid for up front, and sounds from Audio Block. If you're interested in what Versus produces, go here: https://www.youtube.com/user/VersusMusicOfficial To give you a feel for where I am in the character development process and the flow, please view the attached images. This game is for the more modern PC, Windows and Linux.
The detail of the skeleton is going to reflect capabilities for really extreme motion graphics. I am hoping the level of detail and character modeling will exceed most current fantasy games. Please look at the right of the image, where I have expanded the skeletal structure so you can view the layout of skeletons in their parent relationship and how I am using a format description. If you think you can do this, then please reply and send me a PM. A little warning: if you can't handle adult themes similar to The Witcher, then this is probably not for you. There is going to be alcohol, drugs, and sex in this game. So be prepared for high-definition work.
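Since the post fixes the scale at 1 Unreal unit = 1 centimeter, imperial character heights convert directly. A tiny illustrative Python helper (the function name is mine, not from the post):

```python
def feet_inches_to_uu(feet, inches=0):
    """Convert an imperial height to Unreal units, assuming 1 uu = 1 cm."""
    return (feet * 12 + inches) * 2.54  # feet/inches -> inches -> centimeters -> uu

base_human = feet_inches_to_uu(5, 6)  # the 5'6" base human is ~167.6 uu tall
```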
  4. Today we are pleased to announce the release of Leadwerks Game Engine 4.5. Version 4.5 introduces support for VR headsets including the HTC Vive, Oculus Rift, and all OSVR-based hardware, allowing developers to create both room-scale and seated VR experiences. The Leadwerks virtual reality command set is robust yet incredibly simple, allowing you to easily convert your existing 3D games into VR titles. To help get you started, the source code for our Asteroids3D game has been updated for VR and is now freely available in the Leadwerks Games Showcase. Leadwerks Game Engine is uniquely well-suited for VR because of its fast performance, ease of use, and the availability of C++ programming for demanding VR games. Several optimizations for VR have been made, including combining the rendering of both eyes into a single culling step. The stability and accuracy of Newton Game Dynamics means we can have in-depth physics interactions in VR. A new VR game template has been added to provide common VR features, including teleportation locomotion and the ability to pick up and interact with objects in the environment.

Visual Studio 2017
We've also upgraded Leadwerks Professional Edition to build with Visual Studio 2017 so you can take advantage of the very latest Visual Studio features. Instructions for upgrading C++ projects from version 4.4 to 4.5 are available here.

Other Improvements
- Added fog settings in the editor and in the map file format.
- New joint scripts and enhancements.
- Updated to Steamworks 1.41.

You can pick up Leadwerks Game Engine with a discount during the Steam Winter Sale.

About Leadwerks Software
Leadwerks Software was founded in 2006 to make game development easy and fun. The company launched Leadwerks Game Engine on Steam in January 2014 and has experienced steady growth, now with over 20,000 paid users. Leadwerks Game Launcher was released as an early access title in September 2015, allowing developers to publish games to Steam Workshop with no submission fee.
  6. I'm building the VR project template for Leadwerks 4.5. Although you can enable VR in any project, this template is specifically designed to provide some of the most common room-scale VR features:
- Teleportation movement, which prevents motion sickness.
- Picking up and throwing objects. (It's actually really fun!)
To start with, I am creating the art assets for the teleport effect. This is basically what I want: your controller shoots a beam which ends in an indicator when it hits an upward-facing slope. Typically this beam will be somewhat arced. Why the curve? It allows you to climb up to areas above you. As always, I am starting with the game assets. I don't believe in using programmer art because it hurts your understanding of what you are trying to create, it's uninspiring, and you will end up writing your code twice once you get the final artwork and realize all the mistakes you made. I started with textures. I know I want a circular indicator on the floor, a misty spinning effect rising off it, and a beam. I'm going to make all my textures grayscale so that I can control the color with the entity color value and dynamically change it in the game. Here are the textures I created in about ten minutes in Paint Shop Pro. The first texture is clamped along the X and Y axes, and the second one is clamped along the Y axis. I am using uncompressed textures for all of these because they have a lot of soft gradients. I created my materials with the following settings, again leaving everything white. In 3ds Max I created my indicator model. It's just a plane with a cylinder on top, with the end caps removed. When I import it into Leadwerks and apply my materials, the model looks like this. I'll show you why I am using uncompressed textures: you can see in this shot that the edge of the ring has some ugly artifacts when texture compression is used. Here's a closeup.
Not something I want to see in VR. Now I am going to create an instance of the model in the editor and adjust the color. I want a bright, glowy blue. I am setting the color to RGB 128,255,255 and cranking the intensity way up to 2.0, which effectively sets the entity color to 256,510,510. This color is multiplied by the texture color at each pixel and then clamped to 0-255 (the maximum color range of the monitor). That means the brightest spots on the material will reach a full 255,255,255 white and look really intense, while darker parts will be tinted blue. Notice the object isn't just a flat color, but has a range of color from blue to white. To get this effect I had to increase the intensity above 1.0 to create colors brighter than RGB 255,255,255, and I had to have some red in the color. If I had set the color to RGB 0,255,255, the red channel would never increase and I would have a flat color. Not so good. If I had set the color to RGB 128,255,255 but left the intensity at 1.0, I would also have a solid color. Finally, I added a script to the model and saved it as a prefab. The script just rotates the model slowly around its Y axis, which I think will look pretty good. I'm performing the rotation in the Draw() function so it doesn't get called if the object is hidden or offscreen, and I don't think anyone will notice if the rotation doesn't update when they look away:

function Script:Draw()
    self.entity:Turn(0, 0.1 * Time:GetSpeed(), 0)
end

That's it for now. The next step will be to create my teleportation mechanic in VR.
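The color arithmetic described in this entry (entity color scaled by intensity, multiplied per pixel by the texture color, then clamped to the display range) can be sketched in Python. This is an illustrative model, not Leadwerks' actual shader code, and the texel values are made-up examples:

```python
def tint(tex_rgb, entity_rgb, intensity=1.0):
    """Multiply a texture color by a normalized entity color scaled by
    intensity, then clamp each channel to the displayable 0-255 range."""
    return tuple(min(255, round(t * (e / 255.0) * intensity))
                 for t, e in zip(tex_rgb, entity_rgb))

# Entity color 128,255,255 at intensity 2.0 (roughly 256,510,510 after scaling):
bright = tint((240, 240, 240), (128, 255, 255), 2.0)  # bright texel saturates toward white
mid    = tint((100, 100, 100), (128, 255, 255), 2.0)  # darker texel keeps a blue tint
flat   = tint((200, 200, 200), (0, 255, 255), 2.0)    # zero red channel: flat cyan, no white highlights
```

Because the intensity pushes bright texels past 255, highlights clip to pure white while midtones stay tinted, which matches the blue-to-white gradient the entry describes.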
  7. Developing a crane simulator

    Hello everyone! I have decided to make a crane simulator (first-person, with the view from the cabin). Mostly, I aim for realism, as I want this simulator to be used by real crane operators to upgrade their skills. That's why I need an engine that makes it easy to work with physics (wind, rain, cargo weight, etc.). I have to mention that I don't really care about graphics quality. Maybe, after completing the whole project, I would need a VR version of it, but that has the least priority; for now I am planning just a PC version. Here are the main points of my project:
    - Physics has the highest priority
    - Tools suitable for a beginner developer
    - Ability to make a VR version of the simulator without rewriting the whole project
    My experience in programming is mostly in computer science, so I am familiar with C++ and Python, but only at the level of coding some cool algorithms. That's why I don't really depend on a specific language. What engine and developer tools would you recommend? Something easy to begin with for a low-skilled developer like me, but suitable for my project. Thanks, Mike
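One way to gauge the physics requirements in the post above before picking an engine is to prototype the load dynamics directly. Below is a minimal, illustrative Python sketch (not from the original post; all constants are made-up) of a suspended cargo modeled as a damped pendulum with a horizontal wind force, using semi-implicit Euler integration:

```python
import math

def step(theta, omega, dt, length=10.0, mass=500.0, wind=50.0, damping=0.5):
    """Advance the swing of a hanging load by one time step.
    theta: angle from vertical (rad); omega: angular velocity (rad/s);
    length: cable length (m); wind: horizontal force on the load (N)."""
    g = 9.81
    # Angular acceleration from gravity, wind, and a simple damping term.
    alpha = (-g / length) * math.sin(theta) \
            + (wind / (mass * length)) * math.cos(theta) \
            - damping * omega
    omega += alpha * dt   # semi-implicit Euler: update velocity first...
    theta += omega * dt   # ...then position, using the new velocity
    return theta, omega

# Starting at rest, a steady wind swings the load toward the equilibrium
# angle where tan(theta) = wind / (mass * g).
theta, omega = 0.0, 0.0
for _ in range(100_000):  # 100 s of simulated time at dt = 0.001
    theta, omega = step(theta, omega, 0.001)
```

A real crane model would add cable elasticity, trolley motion, and gusting wind, but even this sketch shows why the integrator and time step matter for a training-grade simulator.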
  8. Dreamart VR is a direct interpretation of a dreamworld. The player resonates with the subconscious thoughts of the virtual character. The game consists of multiple scenes. Each scene is directly or indirectly correlated with the previous one, unfolding the potential of a deep, non-linear VR experience. You can download the experience here: https://limbicnation.itch.io/dreamartvr E.N.J.O.Y
  9. Hello all! I'm currently in the third year of my 3D Animation & Games Development course, and I am in the process of doing some basic primary research for my dissertation project, which is to create a high-quality 3D environment for use in video games and potentially VR. I have a questionnaire (targeting other artists in the field), and I would really appreciate it if you took some time to have a look and fill it out: https://docs.google.com/forms/d/e/1FAIpQLScW12nFI8-fMlAUMNSQvOInxhnIfXpG91iRCm25TVlZufrvbQ/viewform?usp=sf_link Thank you in advance! Mike.
  10. Writing the story for Spellbound

    Spellbound is intended to be a story-driven game. I feel that's the only thing which can make the game interesting on its own. The story of Spellbound has gone through a lot of evolutionary changes throughout the development process. When I initially conceived of the game, I just had a game concept: "Throw fireballs at zombies in VR, using your hands". As a game premise, that's mildly interesting, but it would quickly lose its novelty and appeal. How do I make it interesting? I needed a story. Initially, my writing approach was to ask hard questions about the world: Why are there zombies? Where did they come from? Why is the wizard in a zombie-infested graveyard? What's going through the wizard's mind? What was his life like? What was his past? So, I tried to find answers which made sense, given that you're just some red-cloaked dude in a wizard hat, slinging fireballs at zombies. The first version of the game and story was embarrassingly bad. The synopsis of the story: "You were a wizard whose wife had died, and you were searching for a way to bring her back to life because you missed her. So, you cast a spell promising to bring her back via resurrection, but instead, it just reanimated her and turned her into a zombie. The spell worked so well that it also brought all of the corpses in the nearby graveyard to life! Your undead wife flees to the graveyard, so you have to defeat infinite waves of undead zombies. After a while, you face a big boss monster who was behind it all!" As far as stories go, that was pretty pathetic, but also short. I'm a half-decent writer with imagination; I know I can do better if I just spend some time working something out. But I needed to ship something playable to people, quickly. I thought that the main map would be my main gameplay, but it wasn't completed yet and ready for public consumption (it didn't satisfy my quality standards). So, I created an early "prelude" level.
I also needed a main menu in VR, and since this needs to be a seamless experience between the game world and the game menu, the menu itself can't be a static 2D screen like you'd have in traditional 2D games -- the menu itself had to be a level which you interact with. I had been ruminating on the story in the back of my mind for a while at this point, and I decided that I eventually wanted to have five wizards, each from a different school and theme of magic, each with unique storylines. My game universe was growing in complexity. But I couldn't focus on developing the story; I needed to ship as soon as possible to get something playable out there! I had chosen the "Red Wizard" as the first school of magic and theme to focus on. I didn't know what the story would really be, but I had written a rough outline which served as a map of where I wanted to go with the plot. I would come back to the story much later and flesh it out, but for now, I just needed to create the prelude story, introduce players to the game universe, and introduce a character or two. I wrote the prelude story in a day, polished the dialogue, and kept it somewhat vague, but also left a cliffhanger as a lead-in for the main story. Then I shipped it. Currently, you can still only play the prelude and experience that story, and it's short at best, but it shows the storytelling model I'm using for VR:
1. I introduce an illustrated storybook and a narrator reads the first six pages. This serves as an establishing shot / context, and also establishes the narrator.
2. I fade to black, load the game world, fade in, and the story resumes from the first-person perspective. The wizard talks to himself as a way to guide the player on what to do (a bit weird), and the narrator adds story as well, sort of like a dungeon master would.
3.
At the end of the VR experience, we fade to black, return to the library menu, and resume reading 1-2 illustrated pages as sort of an "epilogue", which can serve as a seamless lead-in for the next story. This month, I decided that I was a bit too aimless with my development and I needed to get more focused on shipping the next set of content. Okay, where do I begin? I don't have a level made, no story, barely any functioning spells, no crafting system, etc. What have I been wasting my time on?? Oh right, an AI system with machine learning. I realized that the pragmatic thing to do is stop everything else and focus on fleshing out the story for the red wizard. Once I have the story complete, I'll have a much better idea of the scope of the project, what scenes need to be built, what's important and what's not, and I can start focusing on actually building my game around the story. This seems like an obviously good idea in hindsight. The story is like my game design document, and if the scope is too big, I can change the story until it's achievable. So... I just have to write the story. The problem is, I only had a really rough outline of what I think the story should be about. Despite the outline, I actually don't know what the story is. Okay, so how do I figure that out? I just have to start writing. But I can't just start writing blindly! I need to spend some time crafting the world, the characters, the history, the lore, etc.! My approach to writing my story is to write out the very best first draft that I can, as completely as I can. The point is not to have a story ready for production, but just to figure out what the story is. What story am I trying to tell? Why is it interesting? What captures the reader's attention and holds it? What can the audience get out of the story? What makes the story emotional? What creates a sense of wonder and amazement? What are the high points and low points of the story? Who are the protagonists?
Who are the antagonists? Who are the supporting characters? What is every character's motive? Every character needs a flaw to be interesting, so what are the character flaws? How do those flaws get revealed? How does each flaw play into the story? How does the story begin? What's the problem the characters are trying to solve? What's the struggle? How do the characters overcome the problem? How does a character need to grow in order to overcome the problem? How does the problem get resolved? How do the characters feel about the resolution(s)? How does the audience feel about the resolution? How do we set ourselves up for introducing the next episode? Oh, and by the way, all of this has to be done in VR, so we have to assume that the protagonist has total agency over decisions made, and the story has to account for that. It's a bit of an overwhelming puzzle to work out. It's extremely important to note that since my game is going to be story-driven, where the story either makes or breaks the final result, I cannot afford to half-heartedly write a mediocre story. I have to write the greatest story I'm capable of writing. My game depends on it. The future of my one-man company depends on it. My income depends on it. The story is the backbone. It's my secret sauce. My secret weapon. It's going to be what makes it a "must have" for every VR gamer's library. And it can't just be a story shoved into a VR game; it has to be a story built from the ground up, specifically for VR, to make use of the unique storytelling capabilities VR offers. So, I cannot just write out a first draft, call it good, and move forward with production. If it takes two weeks or two months to get the story perfect, then so be it. So, I'm thinking that I'm a bit of a novice when it comes to story writing. I have never published a novel. Never wrote a screenplay. Never wrote a story for a game. At best, I've written a few short stories for a community college class.
But, I have good story ideas, dammit! That's my stubbornness and ego peeking through, insisting that despite my lack of experience, I'm more qualified than anyone else to be the one who writes the story. How do I account for my lack of experience, having never been "officially" published? I say, "It doesn't matter, I don't care, fuck it, I will just have to write 20 drafts to be on par with a professional." I think that's the right intuition though: write 20 drafts of the same story. The first few drafts are going to be exploratory. You don't know what the story is until you've written it. You don't know who the characters are yet. You don't know their motives. The first version of the story is just a congealing of the oatmeal, where you bring it all together and sort of figure out what the real story is. This is where you answer all of the questions I listed above. You might need to write several versions of the story. Think of each version as sort of like a parallel universe, where each version can explore different possibilities in plot development. Eventually, you'll find that you're drawn to certain plot highlights and themes more strongly than others, and those become your story. At this point, you have written your story about 3-5 times. You're familiar with it, but not intimately. Now, the story becomes more like sheet music to you (the author), and it's a bit of an unfamiliar song. You can kind of play the notes and create a semblance of what the song sounds like, but it's rough and spotty. You know what notes you need to hit and when, so the only way to properly hit those notes is to practice, practice, practice. This means you're going to be rewriting your story, over and over again, each time getting more and more familiar with the plot.
There isn't a fixed number of times you need to rewrite the story, but you'll know when you've written the final version: it'll flow like beautiful music off the paper, wrapping the reader in a warm hug before fleeting away. The reader will be embraced in a feeling of warmth and happiness for a moment, and then left wanting more, more, more. You've now got a page-turner. A novel people can't put down. A movie which demands your attention. A game people can't stop playing. What happens next?! ...Turn the page to find out! I was recently encouraged by a blog article I read on the writing process of William Shakespeare. Most people think that his writing was pure genius, written from divine inspiration, and that it just flowed to him easily via unnatural talent. Historical records of his writings show that actually... he wrote many, many revisions of his plays over the years. Even Shakespeare wasn't some savant who wrote perfect first drafts, and he's considered the best writer in the history of the English language. But I realized that I can't just start writing successively better iterations of the same story. There's SO much more to the story world than what people read on the pages. You know how some fantasy books have, on the first page, a map of the world, with kingdoms, city names, mountain ranges, rivers, oceans, and all of that laid out? There is a whole story universe in which the story events are set! Each kingdom may have different politics. Different cultural customs. Different building aesthetics. Different values. Those background differences will and should make an impact on the story as it's being told! Is slavery legal in one kingdom but not another? How does the climate affect clothing and customs? How does a traveler from one kingdom deal with the differences in culture in another? Is it a source of character conflict? What are the motives of each kingdom and its political leadership?
What is the history which shaped the current state of the world? How does the past factor into any current conflicts? There are a LOT more investigatory questions to ask, but you get the idea. I realized that this narrative background is very important to establish! It is literally the foundation upon which your story rests. This background scaffolding may never actually manifest in your story directly, but it is the world which contains your narrative events. If you don't build the world, your story doesn't rest on anything solid and it will be very wishy-washy. So, before I started earnestly writing my actual story, I spent a lot of time writing about the world and its history. When you read my story, you are only experiencing 10% of the universe/work. The other 90% was scaffolding which was put into place, and then stripped away when it was no longer needed. People will just see the finished product and think, "Oh wow, this looks easy. I bet they just started writing from pure inspiration!", but that illusion is far from the truth of the underlying writing process. I spent nearly a week just writing background material. What are all the races? What are they like? What are their values? What institutions exist in the world? What is their history? What is the common sentiment in the kingdoms? What landmarks exist? Why are they important? What creatures exist? What's their lore and background? Etc., etc. You know what? I'm glad I did this. It created a nice framework for me to work within. I, the writer, know everything about the Academy of Magic, who's really running it, where it's located, and its deep history, but the reader gets to discover little tidbits about this institution and gradually put it together like a puzzle.
At the end, the reader may not know everything there was to know about the Academy of Magic, but maybe there will be more content later which brings those interesting details to the surface? Just think about it: How much did you know about Hogwarts after the first Harry Potter book? How much did you really know about Luke Skywalker after only watching Episode IV: A new hope? And after you experienced all of the content and had a better understanding of the world, and then watched it again, how much more sense did the actions of the characters make when you understood the background context? Anyways, I'd like to share with you a few select pieces of narrative content I've worked on recently. Keep in mind, all of this is first draft material, so there's a high likelihood that the 20th version will be very different: ~~STORY BOOK OPENS~~ Page 1: [Narrator]: “The legend of Rupert the Red… goes something like this” [Narrator]: “Over three thousand years ago, there was a grand battle between magicians of ages past. They nearly ruined the world, but instead, they set civilization back by thousands of years.” *Picture of wizards at war, volcanoes exploding, land tearing up, red sky* Page 2: [Narrator]: “The kings of old, never forgot the calamity. They unanimously decreed that henceforth…” [Kings voice]: “all magic must be banned. Those caught practicing sorcery, shall be put to death!” *Picture of kings sitting around a round table, one king is standing and leaning forward with a raised fist, addressing the other kings* Page 3: [Narrator]: And kingdoms across the lands, knew peace... With the exception of magicians. [Angry crowd]: “Burn the witches! Burn them all!” [Narrator]: “But while magicians and sorcerers can be hunted and killed, magic itself can never be extinguished. What the kings of old didn’t quite understand, is that magic itself is a gift bestowed upon mortals by the gods themselves. 
Oh, how they tried to kill magic though.” *Picture of an angry mob with torches and pitchforks, surrounding posts with silhouettes of people tied to them, as a massive fire burns them* Page 4: [Narrator]: The gift of magic was a sliver of the gods themselves, given to mortals to fight against darkness. When darkness came again, the kingdoms were defenseless and fell like wheat to the scythe. [People]: *anguished screams of terror* [Monsters]: *roaring, gnashing and slashing* *Picture of men, women and children being chased and killed by demon spawn. Sky is red, filled with smoke. The face of a grinning devil can be faintly seen in the clouds* Page 5: [Narrator]: A few sorcerers who had evaded the murderous clutches of men stood united against darkness and sealed it away at heavy cost. [Magician Group]: Chanting in unison *Picture: 5 men and women, holding hands in a circle, with red, blue, white, black and green magical flame pillars, and connected lines of magical color in a star pentagram shape. In the center stands an old man (Sassafras).* Page 6: [Narrator]: The kingdoms were safe again, but the kings… they blamed the magicians for their destruction. *Picture of a group of soldiers nailing wanted posters to lamp posts* (Hammering sounds) Page 7: [Narrator]: A young boy, with the reddest hair you’d ever see, was born to a pair of humble farmers living on the edge of the Black Forest. [Baby]: Crying sounds *Picture of a crying baby being held in the arms of a mother, with a red shock of hair on its head* Page 8: [Narrator]: His father named him “Rupert”. The boy grew up, as all young boys do, and trouble followed naturally, as it does with all young boys. *Squealing pig noises and boyish laughing sounds* *Picture of a young freckle-faced farm boy with a pot on his head, chasing a terrified pig with a stick* Page 9: [Narrator]: But, as fate would have it, the natural troubles of boyhood soon turned into supernatural troubles which only followed Rupert. 
*burning house & inferno sounds, screams* [Narrator]: Rupert was a magician. The villagers were afraid and angry. [Villagers]: “Rupert is cursed! He’s a witch! Burn him!” Page 10: [Narrator]: Rupert ran, and he ran, and he ran, deep into the black forest. The village hunters eventually gave up. (Picture of Rupert hiding under a stump while a dog search party with torches looks for him in the distance) *barking sounds in the distance* Page 11: [Narrator]: Rupert wandered through the forest for days, getting hungrier and hungrier. He stumbled on an old, broken tower of mossy stone, and made it his home. He lived on bark and berries. *Picture of a young boy trying to eat bark in a forest, with teeth almost breaking against it* Page 12: [Narrator]: He lived for years, completely alone, terrified of the supernatural troubles which seemed to follow him everywhere. [Narrator]: Last night, Rupert discovered a book as old as time: The lost book of Sassafras. He was about to change the course of history -- FOREVER. *Picture of Rupert sleeping soundly on his back, with drool coming out of his mouth. A black crow with red eyes watches.* Snoring noises, followed by “Caw, caw! Caw!” from the crow. ~~FADE TO BLACK FROM STORYBOOK MODE, FADE INTO GAME VIEW~~ Note: Cawlin has somewhat of a German accent. [First morning, wake up] Rupert is sleeping in his bed after his late night journey into the undead-infested crypts. He has been sleeping restfully for 11 hours and it is now nearly noon. An impatient crow stands at the foot of his bed. RR: "ZZZzzzz...ZZZzzz...huuuurffffgll, guuurffflllghh..." (deep snoring) Cawlin: "Cawww... Cawww... Cawkadoodlydoo! Wake up, you!" RR: "ZZZz---huh? Who said that?! Who's there?!" Rupert awakens slowly; the VR camera opens eyelids slowly, blinking awake. The player is looking down the foot of the bed at the crow. Cawlin: "Caww.." RR: "Oh… it’s just a stupid bird." The bird cocks its head to the side in curiosity. Cawlin: "Caww?" 
RR: "Oh, just listen to me. I'm already going mad -- first it starts with talking to the birds, then it's rocks and then it's trees." Cawlin: "Caw!" RR: "Say now, how did you manage to get in here? I didn't leave a window or door open last night, did I?" Cawlin: "Caw… Caw..." We wait for the player to get out of bed. They can either click the bed or walk out of the bed zone. Once they move out, we quickly fade to black and fade back in, to the wizard standing at the bedside. RR: "If I'm going to be a raving madman talking to bird brains, you must ... have a name... I shall call you..." Cawlin: "Caw... Cawlin." RR: "...Cawlin." Cawlin: "Caw! It's about time you got up, it’s well past noon! And just who might yewwwww be??" RR: "What?! A talking bird?! Now, I've certainly gone mad!" Cawlin: "Yes, yes, you’re a certified loon and I’m a crow.” (rolls eyes) Cawlin: “Now that we’ve gotten that out of the way, who are you?" RR: "Well...I'm Rupert!” Cawlin: "RRRrrrrupert… what is it that you’re doing in these woods?" RR: "This is my home! I live here." Cawlin: "Ho… how unusual... a huuuuman living in the black forest..." RR: "Unusual? ...Why?" Cawlin: “Humans haven’t ventured into the black forest for centuries. Those that do… never come out alive. There’s something… peck-uliar about you Rupert… What ees it?” *Rupert feels afraid for a moment because his secret about being magical might be given up* RR: “I… I don’t know what you’re talking about.” Cawlin: “No, there’s definitely something about you…. I can… smell eet… ah, there eet ees again! You’re… magical!” RR: “...Magical? I don’t believe in magic...” Cawlin: “You fool! Here you are, speaking with a talking bird, and you don’t believe in magic? I watched you last night as you rrrRRrroasted the walking dead with fi-yar.” RR: “Wait, you were there? You saw that?! It was real?!” Cawlin: “Of course I was... I had been waiting for you... all night! 
Quite the pyrrrrrotechnic display, if I might say.” RR: “I still can’t quite believe what I saw. I almost thought it was just a bad dream -- I just -- haven’t been sleeping well lately.” Cawlin: “Yes, yes, it was all real. No matter! … Eet has come to my attention… that you have acquired a certain… book.” (pronounced almost like “buch”) RR: “Yeah, it was a really weird book… I heard it speak! A strange voice called out to me.” (Cawlin jumps up and down in excitement, flapping his wings) Cawlin: “Ah… do you know what you’ve found? Theees ees sooo exciting! You’ve finally found eet!” RR: “Ehh… what?” Cawlin: “The buch! The long lost book of Sassafraaaaaas! …. Eets verrry special to me. I must see it!” RR: “What’s so special about this book?” Cawlin: “Oh, eet ees only the most powerful buch of magic in the heestory of the world! It has been lost for thousands of years, but lost eet ees no more! You have eet! Eet is very special.” Cawlin: "Thees book, you know, it doesn't just get found by anyone. It... choooooses... Yes, that's the right word.. The book chooses ... who it uses. Many wizards think they use books, but never does it occur to them that the book uses them! Sassafras was its last chosen wizard, and that was thousands of years ago! And last night, it seems to have chosen… RRRrrrrrrupert. Now, ...Why did it choose Rupert?!" RR: "I don't know! I barely know anything about magic.” Cawlin: “The book must have its own reasons… muahahahaha” RR: "So, what now?" Cawlin: “We must read the magic buch, of course! Let’s go find eet!” Cawlin jumps onto the left shoulder of Rupert. There is no further dialogue until the player goes downstairs. A large book sits prominently on a table next to the door. It is sparkling and glowing, softly illuminating the darkness with red light. Cawlin: “Oh… there eet ees! ...thees ees so wonderful. 
I can feel eet… so close… yet so far.” (said in a deeper ominous voice) Cawlin flies from the wizard's shoulder to go over and look at the book on the table. This helps direct the player's attention. RR: “oooh...kay…” (said in the tone of, “who is this bird?”) Cawlin: “Open eet! Let’s see what secrets eet contains!” We wait for the wizard to use the book. When he uses it for the first time, the book opens and a bunch of green energy swirls from the book to the wizard. Upon the pages of the book are nothing but symbols and gibberish. RR: “What was that?!” Cawlin: “I don’t know. Magic maybe? Who cares, read the book!” Cawlin: “Well? What does eet say? What do you see?” RR: “It’s just a bunch of symbols and gibberish. I can’t read any of this!” Cawlin: “What?! Oh no...I hadn’t counted on thees. Why did eet have to be him? ... Why?” RR: “What? What do you mean?” Cawlin: “You… you don’t actually know magic. Not yet, at least.” RR: “I don’t? How is that possible? I was just throwing fireballs last night.” Cawlin: “Ahem… yes… you’re welcome for thee assistance.” RR: “Uh… what?” Cawlin: “That fire essence you used last night… I put eet there for you. Eet was just a temporary conduit for your latent magics… You don’t *actually* know how to use magic yet...” RR: "Okay, so what? How do I read this book?" Cawlin: "I don’t know. I’m just a bird, I can’t read!" RR: “So… then this book is useless to both of us.” Cawlin: “Maybe you can find a clue which could help us?” Cawlin flies back onto the left shoulder of the wizard. When the player walks away from the spellbook, it disintegrates in a puff of green particles. RR: “What happened to the book?! Where did it go?” Cawlin: “Oh… amazing! …Eet’s bound to your magical spirit. Eet ees always with you!” RR: “I don’t understand.” Cawlin: “The buch! You can call eet back at any time, and you will never lose eet! 
Try it now… Just focus on a hand, imagine the book in it, press your fingers inward…” We wait for the player to press the book button on the motion controller. When they do, we spawn the book in that hand in a shower of green magical glitter. Cawlin: “...and poof! There eet is! What an extraordinary book!” The book is turned to the first page, and as we look at it, some of the symbols transform into letters and words. RR: “Well -- I suppose, but again, what use is a book I can’t lose if I can’t read it?” Cawlin: “Well, it’s a magic book, and magic itself is composed of symbols or something like that -- don’t ask me, I’m just a stupid bird -- but I’m sure there’s some way you can figure out how to read those symbols? Yes? Let’s open eet and see what clues we can find!” The wizard opens the book, and on the very first page is a small set of instructions on its use, written in a poetic style: It’s an empty book It stores the spells a wizard learns It has a few left over runes from Sassafras Cawlin: “Oh, dear! The years just haven’t been kind to the pages of parchment. Even magic itself can’t protect its pages from the sands of time forever… Oh, no… oh, woe… it seems, knowledge… it has all been lost. Whatever will I do now?” RR: “Uh… you make less and less sense by the minute. You seem to know more than you’re letting on, so tell me bird, what do you know about magic and this book?” Cawlin: “Ehe. Well. ahem… Magic is just a tool used by mortals -- I mean, men… and eet can be used for evil or good. It just depends on the contents of the heart of the magician. Good magicians naturally choose good magics, while evil magicians will choose… so called “evil” magics.” (Cawlin says “good” with disgust, and “evil” with affection) RR: “So what? How does that help us?” Cawlin: “One thing you must understand about magic is that eet is composed of magical words and symbols. Without the proper words of a spell, there simply is no magic! 
So, men with the talent for magic would often work very hard to find the proper symbols for magical spells. Sometimes, these… experiments, would go… very wrong! And they’d explode. Or turn into toads. Or become green for a day or two. Either way, playing with unknown magic is… dangerous.” Cawlin: “Once a good sequence of magical words has been found, the magicians would write them down in their spell books. Then, they could say the magic words at any time, and… POOF! The spell would just happen!” RR: “Just like that? It doesn’t sound so bad!” Cawlin: “Well, it’s not quite so easy… There are lots of symbols to choose from, and just as important as the symbol itself, is the color of the symbol! Without the right rrrrecipe, you might be using the right words but never actually working the magic.” RR: “So… magic words, magic orders, magic colors… why does it have to be so complicated?!” Cawlin: *chuckles* “heee heee hee, you’re barely even a novice. Of course it seems difficult for you now, but in the hands of a master magician, magic can be wielded to shape worlds...and… make fooooood. Like… delicious corn! Let us start there -- you haven’t had breakfast yet, have you?” RR: “I was just going to step out of the house to nibble on some delicious tree bark for breakfast…” Cawlin: “You -- with your talent for magic -- have been eating bark this whole time?! Unbelievable! It’s time to change that. Fortunately for you, and my oh, so generous mood this morning, I happen to have found a few symbols of magic.” RR: “What? You’ve been holding out on me. Why didn’t you say so sooner!” Cawlin: “Well, they won’t do you much good unless you know how to scribe them into a proper spell.” RR: “Where do I begin?” Cawlin: “First, we must go forage the forest for ingredients with magical properties. The first thing we’d like to collect is a red pepper. Let’s go find some.” Rupert and Cawlin go wandering through the forest until they find a red pepper growing on a bush. Cawlin: “There! 
Right over there! A red pepper!” Rupert picks the red pepper. RR: “Okay, I’ve got the red pepper. Now what?” Cawlin: “The red pepper has the essence of red magic! That’s why it burns your mouth when you eat it. We must extract this magical essence and use it to write your first spell. Let’s go back home.” Rupert and Cawlin return to the mossy tower. Cawlin: “Everything has a bit of magic in eet. It is the job of the alchemist to extract this magic and brew bottles of magical extract. Many mortals don’t rrrrealize what they’re actually doing -- they treat these magical extracts as ‘medicines’, but it’s actually magic at work. A brewed potion has potency, depending on the skill of the alchemist and the ingredients used.” RR: “I’ve never brewed a potion. Where do I begin?” Cawlin: “Well, you don’t really have a prrrrroper alchemist work bench, so we’ll just have to use the most rrrrrrudimentary tools available to extract the magical essence from the red pepper. You must crush the red pepper between some rocks, and you’ll get a little bit of red magic essence. Try it now.” Rupert places the red pepper on a slab of rock and smashes it with a rock. A few seconds later, a small vial of red liquid emerges. Cawlin: “You did it! A vial of red magic!” RR: “How do I use this?” Cawlin: “If you drank it, it would burn your mouth and upset your stomach, but we’re going to use it as ink to write magic symbols. Let’s go to your test chamber… Oh... you don’t have one. Well, that table will have to do then...” When Rupert approaches the table: Cawlin: “Fortunately, I happen to know two magical symbols -- ‘Li’ and ‘Tu’. We can write them down on a magical parchment, in any order and with any ink, and if the symbols match a spell, you’ll be able to save it in your magic book and cast it any time.” Cawlin: “To begin, grab a parchment and a quill!” Rupert performs a “use” action on parchment paper. The spell crafting UI pops up on the parchment. 
Cawlin: “You’re barely even a novice, so you can only discover spells with two magical symbols. Later, you can cast much more complicated spells. Let’s begin with novice level magic.” Cawlin: “You don’t have a lot of parchment to work with, so you’ll need to find a spell quickly. To begin, select a symbol slot with your quill…” Rupert places his quill on a slot icon and a dialogue window pops up. Cawlin: “You only have a red magic essence, so choose that as your ink. Then, pick a symbol to write in this slot.” Rupert chooses a symbol (either “Tu” or “Li”) and writes it into the slot. Cawlin: “See? Even a novice can do this! Next symbol!” Rupert repeats the same process for the second symbol. RR: “Now, I’ve got two red symbols written down. Now what?” Cawlin: “Now, you try to cast these words! It’s already in your hand, so just give it a throw and see what happens…. I will just fly over here… and stay well out of the way...” Rupert throws the current magic spell. It either creates a magic spell (if correct), fizzles out, or creates a magical disaster. (Let’s assume it fizzles out) RR: “What? Nothing happened!” Cawlin: “Your spell fizzled. Consider yourself lucky! That combination of symbols and ink was not a spell; let’s try again.” Rupert uses the parchment again. Cawlin: “This parchment is magical! As you can see, you got the right symbols and right color, but in the wrong order. Now, we can try a different sequence.” Rupert keeps trying out different symbols, until he writes out “Tu-Li” in red ink. When he gets this sequence: Cawlin: “You did it! You created your first spell! This is so exciting… I remember now! Tu-Li is fire, but your TuLi is very weak because you used a red ink with low magical potency. However, this spell is now saved in your spell book!” RR: “So, I can fling these little fire darts at any time now?” Cawlin: “Yes… you’ve begun the journey of a magician! 
You can find more symbols to discover other spells, and brew more potent potions to create stronger spells.” RR: “Wait a minute… my essence of red magic is gone! Did you steal it from me?!” Cawlin: “Relax yourself, Rupert! Whether you fail or discover a spell, the used ink is consumed. Magicians are always scavenging for ingredients to brew -- you magicians are scavengers, just like me!” RR: “Now what?” Cawlin: “Well, I must go. I smell a dead raccoon down by the lake, and I’m absolutely starving. As for you? I saw an abandoned ruin this morning, but it was too dark and scary for me. Maybe your fire could shed some light on the situation? Or perhaps, you can find other ingredients?” RR: “You’re leaving me?!” Cawlin: “I’m getting rather...peckish. I’ll be back... Muahahaha!” Cawlin flies away and the wizard is left alone. There’s not much to do, other than hunt for ingredients or check out the abandoned ruin. At this point, we spawn clovers, blueberries, red peppers, orchids, and black lotus flowers. These are collectible ingredients which can be ground up and turned into vials. We also unlock the ancient ruins and make them accessible. Within the ruins is a new magic symbol which can be learned and a mortar and pestle. The player can summon a small flame to light their way through the darkness. There is a section of the ruin which is sealed off with a heavy door and some other strange symbols of magic. When the player emerges from the ancient ruin, the day has turned to evening. RR: “Wow, it’s evening already?” RR: “It’s getting late, I’d better get home before the forest monsters come out!” When it’s dark, we start playing large monster noises in the distant forest, mixed with snorting noises (like a sniffing pig), and something large crashing through undergrowth. RR: “There’s something out there… it’s hunting me!” Rupert returns to his wizard house. He’s tired and ready for bed. RR: “Whew, safely home at last. 
I need to get some sleep.” We wait for Rupert to go to sleep OR until it is 2AM in game time. Either way, we fade to black and we begin to hear snoring noises.
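The spell-discovery rule Cawlin walks Rupert through -- exact symbols, in exact ink colors, in exact order -- can be sketched in code. This is a minimal C# sketch of that design, not actual game code; every name here (SpellBook, InkColor, TryScribe, the "Tu:Red|Li:Red" recipe key) is a hypothetical illustration:

```csharp
using System.Collections.Generic;

// Hypothetical sketch: a spell is an ordered sequence of (symbol, ink color)
// pairs, and only an exact match against a known recipe resolves to a spell.
public enum InkColor { Red, Blue, White, Black, Green }

public class SpellBook
{
    // Known recipes, keyed by the exact symbol/color sequence.
    private readonly Dictionary<string, string> recipes = new Dictionary<string, string>
    {
        // "Tu-Li" written entirely in red ink is the weak fire spell.
        { "Tu:Red|Li:Red", "Fire" },
    };

    // Spells the player has already discovered and saved.
    public HashSet<string> Discovered { get; } = new HashSet<string>();

    private static string Key(IList<(string symbol, InkColor ink)> sequence)
    {
        var parts = new List<string>();
        foreach (var (symbol, ink) in sequence)
            parts.Add($"{symbol}:{ink}");
        return string.Join("|", parts);
    }

    // Returns the spell name on success (and saves it to the book), or null
    // if the parchment "fizzles". Either way the caller consumes the ink,
    // matching Cawlin's rule that failed attempts still use up the essence.
    public string TryScribe(IList<(string symbol, InkColor ink)> sequence)
    {
        if (recipes.TryGetValue(Key(sequence), out var spell))
        {
            Discovered.Add(spell);
            return spell;
        }
        return null;
    }
}
```

An exact-key dictionary lookup keeps wrong orders and wrong colors fizzling for free, which is exactly the trial-and-error loop the dialogue describes.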
  11. Here we are at another milestone in the 100 days of VR challenge! Day 40! Who would have known that I would have made it to this point? We’ve come a long way, we learned a bit about Unity, made a simple game, and now here we are working in VR! Yesterday we finished fixing most of the technical problems involved with porting our game to VR. Well, turns out, I lied, there are a couple more things I’d like to fix today along with working a bit on the UI: for some reason our camera is lower than we set it, enemies are literally running into us when trying to hit us, and we need to get our UI to show up again. Step 1: Changing Our Camera Starting Position When we play the game in Unity, we have an interesting problem with our camera position being changed. We set our camera position to be 1.5 at Y: However, what’s interesting is that when we play the game on the Android platform, our camera position gets set to 0: After debugging around, I found the cause. Our GvrEditorEmulator prefab forces our Main Camera to be set to the position: 0, 0, 0. While technically we don’t need the prefab, it is convenient, so we’ll keep it in. Instead, I’ve found a different solution. While our camera is forced to a set position, we can child our camera to another game object and then change the position of the parent game object. Coincidentally, we’re already doing that with our Player game object. All we have to do is raise our Player Y position up by the height of the camera. Select Player, change the Y position from 1 before to 1.5 (Optional) Go to the Main Camera and change the Y position to 0 Now when we play the game, we’ll have a much better height when playing: Step 2: Stopping Enemies from Going Inside the Player Next problem: the enemies are literally running into us. 
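The same parent-offset trick can be expressed as a small script instead of editor tweaks. This is a hedged sketch, not the tutorial's actual code: the component name PlayerEyeHeight and the eyeHeight field are my own illustrative names, and it assumes the rig is laid out as described above (camera childed under the Player object):

```csharp
using UnityEngine;

// Sketch of the workaround described above: since GvrEditorEmulator resets
// the Main Camera to (0, 0, 0), we move the camera's parent ("Player")
// up to eye height instead of moving the camera itself.
public class PlayerEyeHeight : MonoBehaviour
{
    [SerializeField] private float eyeHeight = 1.5f; // the Y value used in this project

    private void Awake()
    {
        // Raise the parent rig; the camera's local position stays (0, 0, 0).
        Vector3 p = transform.position;
        transform.position = new Vector3(p.x, eyeHeight, p.z);

        // Zero out the camera's local offset so only the parent sets height.
        Camera cam = GetComponentInChildren<Camera>();
        if (cam != null)
            cam.transform.localPosition = Vector3.zero;
    }
}
```

Attach it to the Player object; doing it in Awake means the offset is applied once before the emulator starts driving the camera transform.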
See: There could be many things causing the problem, but I decided to look at the Nav Mesh Agent attached to each of the enemy prefabs, because there are options that control how close the enemy will get to their target: After playing around with the settings, I got the enemies to stop right in front of us: Stopping Distance: 1.25 Auto Braking: Disabled Radius: 1.25 Here are our new settings: With these new settings in, here’s a bit of our gameplay now: Isn’t it nice that they’re beating us from the outside and not the inside? No? Oh well… Step 3: Getting the UI to Show Up Again Now that we have all the technical problems resolved (promise this time!), it’s time for us to go back and look at how we can get the UI to show up for our game. This is going to be an annoying problem to solve, because it only occurs on our Android device and not anywhere else. Which means there will be A… lot… of… building… Luckily for you, I’ve gone through the torture of rebuilding multiple times so the rest of us don’t have to! It turns out that getting the UI to show up isn’t too bad! If we have VR mode enabled in Unity, any UI we have on our Android device will NOT be visible unless the canvas the UI is on is set to World Space. Here’s what it looks like on my phone when I disabled the VR options in Player Settings and just have a normal Android app: According to Unity’s quick guide for VR UI, and for extra credit, this article about UI expectations for the player, UI in VR must be run in World Space. The biggest takeaway I got from these 2 articles is that you should NEVER have the UI pinned to your player’s screen, like how we’ve been doing it. This could cause motion sickness. Specifically, we want to use Diegetic UI, where the UI is attached to a game object in the game world as opposed to an overlay on top of our screen. Attaching the UI to an object, instead of having it float around or be statically placed on the screen, is the better way to go. 
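The three agent settings above can also be applied from code, which is handy when enemies are spawned from prefabs at runtime. A minimal sketch (the EnemyAgentSetup component name is mine; stoppingDistance, autoBraking, and radius are the real NavMeshAgent properties the inspector fields map to):

```csharp
using UnityEngine;
using UnityEngine.AI;

// Applies the agent tweaks found above so enemies stop in front of the
// player instead of walking inside them. The 1.25f values are the ones
// arrived at by experimentation in this project.
[RequireComponent(typeof(NavMeshAgent))]
public class EnemyAgentSetup : MonoBehaviour
{
    private void Awake()
    {
        NavMeshAgent agent = GetComponent<NavMeshAgent>();
        agent.stoppingDistance = 1.25f; // stop this far from the target
        agent.autoBraking = false;      // don't decelerate early before arrival
        agent.radius = 1.25f;           // widen the body so it can't overlap us
    }
}
```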
Step 3.1: Getting the UI to show up Anyways, to fix our problem and have our UI show up when VR is enabled, we must set our Canvas to be displayed in World Space. Select HUD in our hierarchy. In the Canvas component, change Render Mode from Screen Space – Overlay to World Space There are 3 render modes available; here’s the canvas documentation to explain what they are, but for a quick summary: Screen Space – Overlay: The UI is rendered on top of the scene Screen Space – Camera: The UI is placed a certain distance from the Camera. It’s very similar to Overlay, except certain changes to the camera can also change the UI. For example, a Perspective camera will render the UI differently from an Orthographic camera World Space: The UI Canvas acts like a game object that just hangs somewhere in the game world for us to see. This is also the only option we can use for our VR UI Here’s what our game looks like now with World Space: Now we need to make some adjustments to our UI. Step 3.2: Figuring out where to put the UI The biggest question at this point is: where should we put our UI elements? While there isn’t a clear answer, I think our best option might be to attach the UI to our weapon. On the Google Cardboard, this would be the equivalent of having it right in our face, but if we were to switch to the Daydream viewer, we would be able to move it independently of our gaze. With the decision made, let’s go and see how we can attach our health and time UI to our gun! Step 3.3: Putting Our UI Into World Space We already have an existing HUD canvas game object in our game. We’re going to repurpose that; however, because we have more than just our UI on the canvas (the Victory and Game Over panels), I’m going to duplicate our HUD. On HUD in our game hierarchy, hit Ctrl + D to duplicate it. 
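If you'd rather enforce the render mode in code than rely on remembering the inspector setting, a tiny guard script works as a sanity check (the ForceWorldSpaceCanvas name is mine; Canvas.renderMode and RenderMode.WorldSpace are the real Unity API):

```csharp
using UnityEngine;

// Safety net for the VR build: flips the canvas to World Space at startup
// if it was accidentally left in a Screen Space mode, since Screen Space
// canvases are not rendered when VR mode is enabled.
[RequireComponent(typeof(Canvas))]
public class ForceWorldSpaceCanvas : MonoBehaviour
{
    private void Awake()
    {
        Canvas canvas = GetComponent<Canvas>();
        if (canvas.renderMode != RenderMode.WorldSpace)
        {
            Debug.LogWarning($"{name}: canvas was not World Space; fixing for VR.");
            canvas.renderMode = RenderMode.WorldSpace;
        }
    }
}
```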
Rename the duplicated HUD (1) to GunUICanvas Delete the Victory, Game Over, and Ammo UI child objects Make the GunUICanvas a child of MachineGun_01 When we’re done, here’s what your hierarchy should look like. Next up, we’re going to change the settings of our GunUICanvas so that it sits right in front of our gun on the right side: Select GunUICanvas In the Canvas component, the Render Mode should already be World Space; if not, change it In the Rect Transform component, I’ve played around with the settings and changed our position to (-0.15, 0.22, -0.14), our Width to 480, and our Height to 80. Set Scale to (0.001, 0.001, 0.001); we want our UI to be small enough to fit on our screen (Optional) Remove the Screen Manager script Here’s what we should have: Next, I’m going to change the Anchor Presets for our Score UI game object so it sits at the bottom middle of the canvas. Select Score In the Rect Transform component, open the Anchor Presets, hold Alt + Shift, and select the bottom center preset Now, with the width changes of our Canvas and our Score at the bottom, we should have something like this for our game. If we play the game in our VR device, I don’t think there will be any discomfort. The UI is positioned on the gun, and since we’re focused on the cursor in the center, it’s not in our focus. Step 3.4: Connecting the UI Elements to the Rest of our Scripts Now that we have our GunUICanvas looking nice, the last thing we need to do is re-connect all the UI elements to the scripts that use them, so our UI gets updated as we play. We need to update our Time Text and Health Slider. Do you remember which scripts used these UI elements? No? Don’t worry, I do! In GameManager, in the Score Manager script, drag our Score UI into the Score slot In Player, in the Player Health script, drag our Health Bar slider into the Health Bar slot Once we add our new UI components, our code will update the UI correctly! Conclusion And that completes Day 40! 
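The GunUICanvas placement above can be captured in a script, so the trial-and-error values survive prefab changes. This is a sketch under the same assumptions as the editor steps (canvas parented to the gun); GunUICanvasSetup is my own illustrative name, and the position/size/scale numbers are the ones found by experimentation, specific to this gun model:

```csharp
using UnityEngine;

// Reapplies the world-space placement of the gun-mounted UI canvas.
// Values are relative to the parent gun transform (MachineGun_01).
public class GunUICanvasSetup : MonoBehaviour
{
    private void Awake()
    {
        RectTransform rt = GetComponent<RectTransform>();
        rt.localPosition = new Vector3(-0.15f, 0.22f, -0.14f); // offset from the gun
        rt.sizeDelta = new Vector2(480f, 80f);                 // width x height in canvas units
        rt.localScale = Vector3.one * 0.001f;                  // shrink canvas units to world scale
    }
}
```

The tiny 0.001 scale is the usual trick for world-space canvases: it lets you keep comfortable pixel-ish sizes (480 x 80) on the RectTransform while the canvas occupies only a fraction of a meter in the world.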
Today we looked at fixing our last technical problems with the camera height and getting the enemies to attack us from a more appropriate distance. However, the main topic of today was adding our UI back in. It turns out that with Unity VR enabled, the only Canvas render mode we can use is World Space. Once we made that change, we could see our UI again. We attached the new UI to our gun, and I think that's a good solution for providing relevant UI information to the player without causing nausea. Tomorrow, we’re going to continue working with the UI. Specifically, we’re going to add the game end states back into the game, so we can keep playing it without starting a new app instance. Day 39 | 100 Days of VR | Day 41
  12. Can my PC handle a VR upgrade?

    I recently got a 600 euro gift card for Amazon. And I like messing around with VR. I played some games, and I program hobby VR games in my spare time on mobile. I saw that my 5 year old CPU meets the specs for the Oculus Rift. However, my current graphics card does not. I see that in Europe, I can get an "Oculus Touch bundle" for 400 euros. And an Nvidia 1060 (3GB) for another 200 euros. (Coincidentally, this adds up to exactly my gift certificate.) The two possible problems: I only have 4GB of RAM (I don't think this will be a big deal, maybe just some longer loading times) I am not sure my power supply can support a 1060 (I am not sure what the different wattages that are printed on it mean) Here is a photo of my power supply's spec: https://ibb.co/eaOOOR It was pretty basic even 6 years ago. Is it good enough to run a 1060? (I can only find a link for the 6GB model: https://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-1060/specifications) I am not interested in going through the trouble of replacing my power supply, and I do not want to spend more than 600 euros on this entire endeavour. And another question: Assuming I want to spend this money on VR, is there a better platform to spend this sum on? I was going to wait for the Vive Focus (I don't like cables), but it looks like it's going to be a dud (only in China, no real ecosystem). Facebook's claim of releasing a wireless Rift for $200 seems dubious to me. And since that price sounds like they are selling at a loss, there is no chance it will reach Europe for that price through official channels.
  13. Unity has put up an official release of EditorXR (formerly EditorVR). https://labs.unity.com/article/editorxr-and-scenefusion-update If you haven't tried it lately, you should. This release is actually very good.
  14. itSeez3D Announces Avatar SDK plugin for Unity

    itSeez3D, a leading developer of mobile 3D scanning software, announced today a new SDK for its automatic 3D avatar generation technology, Avatar SDK for Unity. The Avatar SDK for Unity is a robust plug-n-play toolset which enables developers and creatives to integrate realistic user-generated 3D avatars into their Unity-based applications. SDK users can allow players to create their own avatars in the application or integrate the SDK into their own production processes for character design and animation. “Virtual avatars have recently become increasingly popular, especially in sports games and social VR apps. With the advance of VR and AR, the demand to get humans into the digital world is only increasing”, said Victor Erukhimov, itSeez3D CEO. “Our new Avatar SDK for Unity makes it super-easy to bring the avatar technology into any Unity-based game or VR/AR experience. With the Avatar SDK for Unity now every developer can bring face scanning technology into their games and allow players to create their own personalized in-game avatars, making the gameplay much more exciting and immersive.” Key features of the Avatar SDK for Unity: Automatic generation of a color 3D face model from a single selfie photo in 5-10 seconds (!). Works best with selfies, but can be used with any portrait photo. 
Shape and texture of the head model are unique for each person, synthesized with a deep learning algorithm crafted by computer vision experts Head models support runtime blendshape facial animations (45 different expressions) Generated 3D heads include eyes, mouth, and teeth Algorithms synthesize 3D meshes in mid-poly resolution, ~12k vertices and ~24k triangles Six predefined hairstyles with a hair-recoloring feature (many more available on request) Avatar generation API can be used at design time and at run time, which means you can allow users to create their own avatars in your game Cloud version is cross-platform, and the offline version currently works on PCs with 64-bit Windows (support for more platforms is coming soon) Well-documented samples showcasing the functionality. Availability The Avatar SDK for Unity is offered in two modes - “Cloud” and “Offline”. The “Cloud” version is available at http://avatarsdk.com/ and the “Offline” version is available by request at support@itseez3d.com. ### About itSeez3D At itSeez3D, we are working on computer vision technology that turns mobile devices into powerful 3D scanners. itSeez3D has developed the world's first mobile 3D scanning application that allows users to create high-resolution photorealistic 3D models of people's faces, bodies and objects. The application is available for iOS and Windows OS mobile devices equipped with 3D cameras. In 2016 the company introduced the Avatar SDK that creates a realistic 3D model of a face from a single selfie photo. To learn more about itSeez3D scanning software and 3D avatar creation technology, please visit www.itseez3d.com and www.avatarsdk.com.
  15. itSeez3D Announces Avatar SDK plugin for Unity

    View full story
  16. Welcome back to Day 39! Yesterday we started to look at fixing problems caused by the limitations of mobile VR (and did a lot of raycasting); today we're going to make some more changes. Specifically, today's goals are:

- Change our Event Trigger logic to deal with what happens if we're holding down on the screen
- Fix a problem with our player being pushed around
- Fix why our Knight turns black
- Make our enemies slower so the game is easier

Let's get to it!

Step 1: Changing Our Event Trigger Logic for Continuous Fire

Right now, we're trying to solve the problem where we only damage enemies when we tap on them. If we were to hold down, we would continue to shoot, but the enemies wouldn't take damage.

Yesterday we discovered that there were two ways we could have implemented our game:

- Use our existing code, where we shoot a raycast and, depending on what we hit, run some function.
- Use the Event Trigger system along with Google's changes.

I've played around quite a bit with the Event Trigger system and made a solution, but it's not the best; in fact, I might have preferred just keeping what we have. But that's okay, we're just learning! There are two problems that we must solve:

- What happens when we're holding down on the screen over an enemy
- What happens when we're holding down and then move to point at another enemy

After playing around for a while, the PointerClick solution we have will no longer work. Instead, I've started playing with the PointerEnter and PointerExit events.
I'm going to add the changes to EnemyHealth:

```csharp
using System;
using UnityEngine;
using Random = UnityEngine.Random;

public class EnemyHealth : MonoBehaviour
{
    public float Health = 100;
    public AudioClip[] HitSfxClips;
    public float HitSoundDelay = 0.1f;

    private SpawnManager _spawnManager;
    private Animator _animator;
    private AudioSource _audioSource;
    private float _hitTime;
    private Boolean _isEnter;

    void Start()
    {
        _spawnManager = GameObject.FindGameObjectWithTag("SpawnManager").GetComponent<SpawnManager>();
        _animator = GetComponent<Animator>();
        _hitTime = 0f;
        _isEnter = false;
        SetupSound();
    }

    void Update()
    {
        _hitTime += Time.deltaTime;
        if (Input.GetButton("Fire1") && _isEnter)
        {
            TakeDamage(1);
        }
    }

    private void TakeDamage(float damage)
    {
        if (Health <= 0)
        {
            return;
        }

        if (_hitTime > HitSoundDelay)
        {
            Health -= damage;
            PlayRandomHit();
            _hitTime = 0;
        }
        if (Health <= 0)
        {
            Death();
        }
    }

    private void SetupSound()
    {
        _audioSource = gameObject.AddComponent<AudioSource>();
        _audioSource.volume = 0.2f;
    }

    private void PlayRandomHit()
    {
        int index = Random.Range(0, HitSfxClips.Length);
        _audioSource.clip = HitSfxClips[index];
        _audioSource.Play();
    }

    private void Death()
    {
        _animator.SetTrigger("Death");
        _spawnManager.EnemyDefeated();
        foreach (Collider collider in GetComponentsInChildren<Collider>())
        {
            collider.enabled = false;
        }
    }

    public void HealthEnter()
    {
        _isEnter = true;
        print("enter");
    }

    public void HealthExit()
    {
        _isEnter = false;
        print("exit");
    }
}
```

Walking Through the Code

In our EnemyHealth, we create two new functions:

- HealthEnter()
- HealthExit()

These functions are going to be called from the PointerEnter and PointerExit events of the Event Trigger that we set up. In these functions, we set a new variable we introduced, called _isEnter, so we know when the enemy is being selected. Inside Update() we check whether we're currently hovering over the enemy and pressing down on the screen. If we are, we call our already existing TakeDamage() function.
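The same PointerEnter/PointerExit wiring we set up by hand in the Inspector can also be done from code, which is handy when enemies are spawned at runtime. Here is a minimal sketch using Unity's standard EventTrigger API; the helper script name (EnemyTriggerSetup) is mine, not from the tutorial:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical helper: wires HealthEnter/HealthExit to the
// PointerEnter/PointerExit events from code instead of the Inspector.
public class EnemyTriggerSetup : MonoBehaviour
{
    void Start()
    {
        EnemyHealth health = GetComponent<EnemyHealth>();
        EventTrigger trigger = gameObject.AddComponent<EventTrigger>();

        // PointerEnter -> HealthEnter()
        EventTrigger.Entry enter = new EventTrigger.Entry();
        enter.eventID = EventTriggerType.PointerEnter;
        enter.callback.AddListener((data) => health.HealthEnter());
        trigger.triggers.Add(enter);

        // PointerExit -> HealthExit()
        EventTrigger.Entry exit = new EventTrigger.Entry();
        exit.eventID = EventTriggerType.PointerExit;
        exit.callback.AddListener((data) => health.HealthExit());
        trigger.triggers.Add(exit);
    }
}
```

Either way, the result is the same: the Gvr event system drives the enter/exit callbacks, and Update() handles the continuous damage.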
I'm not a fan of this method, because it requires us to constantly run Update() in all of our enemy health scripts, as opposed to just inside our PlayerShootingController script; but for just playing around, this is okay. I also changed the hit sound effect to play at most every 0.1 seconds, just like our shooting delay.

Step 1.1: Updating Our Event Trigger

Now that we have our shooting script in, the next and final thing we need to do is create the Event Triggers that use these functions. Get rid of the PointerClick event that we previously set up. Instead, we're going to create two new types of Event Triggers: PointerEnter and PointerExit. Here's what we're going to do:

- Attach HealthEnter() to PointerEnter
- Attach HealthExit() to PointerExit

Now we can play our game like we intended to from the very beginning. Make sure to make these changes to the Bandit and Zombie too!

Step 2: Preventing Our Player from Falling

Currently, if we play the game, when an enemy gets close to our player character, we fall over. We should have addressed this in the past when we set constraints inside our Rigidbody component, but it appears that we did not. Let's go back and fix this:

- In the hierarchy, select Player
- Under the Rigidbody component, set the constraints. Specifically, we want to freeze our position and rotation so that the enemies won't push our character around. Having the game move us could cause nausea for our players.

Step 3: Fixing Our Knight's Color

If we play our game on a mobile device, we'll notice one big problem: our knights are all black. If we pay attention to the log, we'll see that there are problems with the shaders the asset is using. Unfortunately, I don't know enough about this problem to resolve it properly. We have two choices:

- Ignore the problem and just have all-black knights
- Change the materials that use these shaders to use the Standard shader

In our case, we're going to explore the second option.
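If you would rather do the shader swap from code than click through each material, a small startup script can do the same thing. This is only a sketch under my own assumptions (the script name is hypothetical, and Shader.Find("Standard") works at runtime only when the Standard shader is included in the build, which it is once any material in the project references it):

```csharp
using UnityEngine;

// Hypothetical fallback: at startup, swap every material on this object's
// renderers to the built-in Standard shader. This mirrors the manual
// change we make to the materials in the editor.
public class StandardShaderFallback : MonoBehaviour
{
    void Start()
    {
        Shader standard = Shader.Find("Standard");
        foreach (Renderer renderer in GetComponentsInChildren<Renderer>())
        {
            // Note: accessing .materials instantiates per-renderer copies,
            // so the original shared assets are left untouched.
            foreach (Material material in renderer.materials)
            {
                material.shader = standard;
            }
        }
    }
}
```

The manual editor change below is still the simpler route for a small project; the script is just an alternative if many prefabs are affected.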
In Assets/Knight/models/Materials we have three materials in use: clothColor, knight1Color, and weaponsColor, and all of them use one of the problem shaders. Let's select them and change their shader to Standard. If we play the game in Unity now, the knights lose the coloring they originally had, but at least we keep the details of the models.

Step 4: Making the Game Easier

Currently, we get swarmed by enemies. That would have been fine if we could move around; unfortunately, we can't do that anymore, so we need to make some adjustments. We're going to do two things:

- Change how long it takes for an enemy to spawn
- Slow down the rate at which our enemies move

Step 4.1: Changing the Spawn Rate on the Spawn Manager

Currently, we spawn the next enemy every 2 seconds. Let's change that to 5:

- Select the SpawnManager game object (child of GameManager)
- Change Time Between Enemies from 2 to 5

Step 4.2: Changing Enemy Walking Speed

As we might recall, to control enemy speed we must look at the Nav Mesh Agent component in our enemy prefabs. From fastest to slowest, the order is Bandit, Knight, Zombie. I'm going to change the speeds a bit:

- Bandit to 2
- Knight to 1.5
- Zombie to 1

Conclusion

Now we're done! We have taken care of a lot of the technical problems that we encountered. Tomorrow, we're going to continue finishing the rest of the game by figuring out how to add UI to a virtual reality environment. Until then, I'll see you all in Day 40!

Day 38 | 100 Days of VR | Day 40 Home
  17. I wanted to start by sharing an amazing making-of video of medical-environment visual effects created with 3ds Max and thinkingParticles, as can be seen here: cebas vimeopro. Please let me know: 1) Is the 3ds Max environment compatible with the main game engines, Unreal and Unity? 2) Does anyone know of games that use medical molecular environments? If you wish to get to know the story, see here: Random42 Feature Exclusive Story. Also, follow facebook.com/ScientificVisuals for an upcoming tutorial on how to make a blood repair/clotting visual effect by Callum Welsh.
  18. Welcome to Day 38! Today, we're going to talk about the limitations of mobile VR and make some changes in our game to fix things. We've already started to fix some things, specifically adding event triggers to our enemies, but there are still many more things to solve! Here's a quick list of things I want to tackle from what we encountered two days ago.

From a technical limitation:

- We can't move
- We only have one input, which is clicking

Some actual technical problems:

- The enemies are all black
- We don't have any of our UIs anymore

We're going to address these problems over the next couple of days. Today, we're going to focus on the technical limitations of mobile VR; today's priorities are:

- Discussing how to change our game design to accommodate our new limitations
- Implementing our new designs

Edit, important note: after playing around with the Cardboard in Unity today and looking at an article about Google Cardboard's inputs, it seems that we don't have to use the Google VR SDK. Unity already has most of the internal integration necessary to make a VR app. Everything we had already works; the reason I initially thought there was a problem is because of how we did raycasting. Specifically, our raycasting code targeted where our mouse/finger was touching, not the middle of the screen! More on this later.

Step 1: Changing the Game to Fit Our Mobile Limitations

As mentioned before, with the Google Cardboard we have three limitations:

- We can't move our character's position
- We only have tapping as an input to interact with the game
- Our cursor will always be in the middle of the screen

Even with the Daydream Viewer, we still have the first two limitations. However, with the new Daydream standalone device coming out, we'll have world-space tracking, finally allowing us to track the player's movements without requiring external devices like the Vive does! Anyways, back on topic.
Considering these three limitations, here are my thoughts on what needs to change in our game:

- Because we can't move, we should place our character in a more central location that the enemies can reach
- Because we can no longer run away, we should make the enemies weaker so that we don't get swarmed
- Because we only have one input, we can shoot but we can't reload, so we should get rid of the reload system

Essentially, we're going to create a shooter with our player in the center and enemies coming from all around us.

Step 2: Implementing Our New Designs

Now that we have everything planned, let's get started on the actual implementation!

Step 2.1: Placing the Character in the Middle

Let's place the character in the middle of where our spawn points are set. After playing around with it, I think the best spot would be at position (100, 1, 95):

- Select Player in our hierarchy
- In the Transform component, set the Position to X: 100, Y: 1, Z: 95

Step 2.2: Making the Enemies Weaker

Next up, let's make the enemies weaker. In the Enemy Health script component attached to our Knight, Bandit, and Zombie prefabs, let's change the health values. From largest to smallest, the health order is Zombie > Knight > Bandit. Let's set the health to be:

- Zombie: 4 HP
- Knight: 2 HP
- Bandit: 1 HP

Here's how we change the health:

- In Assets > Prefabs, select a prefab; in this case, let's choose Zombie
- In the Inspector, select the Enemy Health (Script) component and change Health to 4
- Make the same kind of change to the other two prefabs

Step 2.3: Removing Our Ammo System

Now it's time to go back to our Player Shooting Controller (Script) component that we disabled yesterday. I want to keep the animation and sound effects that we had when shooting our gun; however, I'm going to get rid of the ammo and the need to reload.
Here are my changes:

```csharp
using UnityEngine;
using System.Collections;

public class PlayerShootingController : MonoBehaviour
{
    public float Range = 100;
    public float ShootingDelay = 0.1f;
    public AudioClip ShotSfxClips;
    public Transform GunEndPoint;
    //public float MaxAmmo = 10f;

    private Camera _camera;
    private ParticleSystem _particle;
    private LayerMask _shootableMask;
    private float _timer;
    private AudioSource _audioSource;
    private Animator _animator;
    private bool _isShooting;
    //private bool _isReloading;
    //private LineRenderer _lineRenderer;
    //private float _currentAmmo;
    //private ScreenManager _screenManager;

    void Start()
    {
        _camera = Camera.main;
        _particle = GetComponentInChildren<ParticleSystem>();
        Cursor.lockState = CursorLockMode.Locked;
        _shootableMask = LayerMask.GetMask("Shootable");
        _timer = 0;
        SetupSound();
        _animator = GetComponent<Animator>();
        _isShooting = false;
        //_isReloading = false;
        //_lineRenderer = GetComponent<LineRenderer>();
        //_currentAmmo = MaxAmmo + 10;
        //_screenManager = GameObject.FindWithTag("ScreenManager").GetComponent<ScreenManager>();
    }

    void Update()
    {
        _timer += Time.deltaTime;

        // Create a vector at the center of our camera's viewport
        //Vector3 lineOrigin = _camera.ViewportToWorldPoint(new Vector3(0.5f, 0.5f, 0.0f));
        // Draw a line in the Scene View from the point lineOrigin in the direction of
        // fpsCam.transform.forward * weaponRange, using the color green
        //Debug.DrawRay(lineOrigin, _camera.transform.forward * Range, Color.green);

        if (Input.GetButton("Fire1") && _timer >= ShootingDelay /*&& !_isReloading && _currentAmmo > 0*/)
        {
            Shoot();
            if (!_isShooting)
            {
                TriggerShootingAnimation();
            }
        }
        else if (!Input.GetButton("Fire1") /*|| _currentAmmo <= 0*/)
        {
            StopShooting();
            if (_isShooting)
            {
                TriggerShootingAnimation();
            }
        }

        /*if (Input.GetKeyDown(KeyCode.R))
        {
            StartReloading();
        }*/
    }

    private void StartReloading()
    {
        _animator.SetTrigger("DoReload");
        StopShooting();
        _isShooting = false;
        //_isReloading = true;
    }

    private void TriggerShootingAnimation()
    {
        _isShooting = !_isShooting;
        _animator.SetTrigger("Shoot");
        //print("trigger shoot animation");
    }

    private void StopShooting()
    {
        _audioSource.Stop();
        _particle.Stop();
    }

    public void Shoot()
    {
        //print("shoot called");
        _timer = 0;
        Ray ray = _camera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f)); //_camera.ScreenPointToRay(Input.mousePosition);
        RaycastHit hit = new RaycastHit();
        _audioSource.Play();
        _particle.Play();
        //_currentAmmo--;
        //_screenManager.UpdateAmmoText(_currentAmmo, MaxAmmo);
        //_lineRenderer.SetPosition(0, GunEndPoint.position);
        //StartCoroutine(FireLine());

        if (Physics.Raycast(ray, out hit, Range, _shootableMask))
        {
            print("hit " + hit.collider.gameObject);
            //_lineRenderer.SetPosition(1, hit.point);
            //EnemyHealth health = hit.collider.GetComponent<EnemyHealth>();
            EnemyMovement enemyMovement = hit.collider.GetComponent<EnemyMovement>();
            if (enemyMovement != null)
            {
                enemyMovement.KnockBack();
            }
            /*if (health != null)
            {
                health.TakeDamage(1);
            }*/
        }
        /*else
        {
            _lineRenderer.SetPosition(1, ray.GetPoint(Range));
        }*/
    }

    // called when the reload animation finishes
    /*public void ReloadFinish()
    {
        _isReloading = false;
        _currentAmmo = MaxAmmo;
        _screenManager.UpdateAmmoText(_currentAmmo, MaxAmmo);
    }*/

    private void SetupSound()
    {
        _audioSource = gameObject.AddComponent<AudioSource>();
        _audioSource.volume = 0.2f;
        _audioSource.clip = ShotSfxClips;
    }

    public void GameOver()
    {
        _animator.SetTrigger("GameOver");
        StopShooting();
        print("game over called");
    }
}
```

I've kept what I commented out; here's the clean version of our script.
```csharp
using UnityEngine;
using System.Collections;

public class PlayerShootingController : MonoBehaviour
{
    public float Range = 100;
    public float ShootingDelay = 0.1f;
    public AudioClip ShotSfxClips;
    public Transform GunEndPoint;

    private Camera _camera;
    private ParticleSystem _particle;
    private LayerMask _shootableMask;
    private float _timer;
    private AudioSource _audioSource;
    private Animator _animator;
    private bool _isShooting;

    void Start()
    {
        _camera = Camera.main;
        _particle = GetComponentInChildren<ParticleSystem>();
        Cursor.lockState = CursorLockMode.Locked;
        _shootableMask = LayerMask.GetMask("Shootable");
        _timer = 0;
        SetupSound();
        _animator = GetComponent<Animator>();
        _isShooting = false;
    }

    void Update()
    {
        _timer += Time.deltaTime;

        if (Input.GetButton("Fire1") && _timer >= ShootingDelay)
        {
            Shoot();
            if (!_isShooting)
            {
                TriggerShootingAnimation();
            }
        }
        else if (!Input.GetButton("Fire1"))
        {
            StopShooting();
            if (_isShooting)
            {
                TriggerShootingAnimation();
            }
        }
    }

    private void TriggerShootingAnimation()
    {
        _isShooting = !_isShooting;
        _animator.SetTrigger("Shoot");
    }

    private void StopShooting()
    {
        _audioSource.Stop();
        _particle.Stop();
    }

    public void Shoot()
    {
        _timer = 0;
        Ray ray = _camera.ViewportPointToRay(new Vector3(0.5f, 0.5f, 0f));
        RaycastHit hit = new RaycastHit();
        _audioSource.Play();
        _particle.Play();

        if (Physics.Raycast(ray, out hit, Range, _shootableMask))
        {
            print("hit " + hit.collider.gameObject);
            EnemyMovement enemyMovement = hit.collider.GetComponent<EnemyMovement>();
            if (enemyMovement != null)
            {
                enemyMovement.KnockBack();
            }
        }
    }

    private void SetupSound()
    {
        _audioSource = gameObject.AddComponent<AudioSource>();
        _audioSource.volume = 0.2f;
        _audioSource.clip = ShotSfxClips;
    }

    public void GameOver()
    {
        _animator.SetTrigger("GameOver");
        StopShooting();
        print("game over called");
    }
}
```

Looking Through the Changes

We removed a lot of the code that was part of the reloading system.
We basically removed any mention of ammo and reloading; however, I kept everything involved with the shooting animation, shooting sound effects, and shooting rate. Only two real changes were made:

- I changed the input we use to shoot from GetMouseButton to GetButton("Fire1"). I believe these behave the same here, but I'm making the change anyway; either option returns true while we're touching the screen on our mobile device.
- I also changed the Ray in our raycasting system. Before, we cast a ray from where our mouse was located, which we had previously locked to the center of the screen. After we got rid of the code that locked the cursor to the middle, we needed a new way to target the middle. Instead of firing the raycast from the mouse position, we now fire it from the middle of our camera's viewport, which fixes our problem on a mobile device.

Go ahead and play the game now. We should have a playable game. There are two things that happen when we shoot:

- We shoot a raycast, and if it hits an enemy, they'll be pushed back
- The enemy's Event Trigger detects that we clicked down on the enemy, so they take some damage

At this point, we have a problem: if we hold down on the screen, we'll keep pushing the enemy back, but they'll only be hit once! That's because we only have an Event Trigger that deals with an OnClick event, not one that fires while the user is holding over the enemy. We're going to fix this problem tomorrow; I've done a lot of investigation work with raycasts today and want to take a break!

Step 2.4: Changing the ScreenManager Script

One more thing before we leave: the Unity compiler will complain about a missing reference in our ScreenManager, specifically the MaxAmmo variable that we got rid of.
Let's just get rid of it:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class ScreenManager : MonoBehaviour
{
    public Text AmmoText;

    void Start()
    {
        PlayerShootingController shootingController = Camera.main.GetComponentInChildren<PlayerShootingController>();
        //UpdateAmmoText(shootingController.MaxAmmo, shootingController.MaxAmmo);
    }

    public void UpdateAmmoText(float currentAmmo, float maxAmmo)
    {
        AmmoText.text = currentAmmo + "/" + maxAmmo;
    }
}
```

And we're good to go! Technically speaking, we won't be using this script anymore either.

Conclusion

And another day's worth of work has ended! I learned a lot about VR today, such as: we don't need ANYTHING that the Google VR SDK provides! Unity as a game engine already provides us with everything we need to make a VR experience; Google's SDK is more of a utility kit that helps make implementation easier. The TLDR of what I learned today is that we don't have to be fixed on using the SDK's raycasting script; we don't need it, and we can continue to use what we already have. However, for the sake of learning, I'm going to continue re-implementing our simple FPS with the Google Cardboard assets! We'll continue tomorrow on Day 39! See you then!

Day 37 | 100 Days of VR | Day 39 Home
  19. Writer's note: Sorry for the lack of updates, I'm not dead yet! I went on a short vacation and got horribly sick. I'm hoping to continue as I was going before. Welcome to Day 37! Today, things are finally going to get serious in working with VR! Currently, there are a lot of problems with the app from when we launch it; some are just limitations, and others are actual problems. However, before we go in and fix everything that we encountered yesterday, today we're going to add the Google VR SDK into our game. Today we're going to:

- Set up wireless debugging so we can both debug and receive console print statements from our game
- Remove the old scripts that we'll replace with VR
- Add the Google VR SDK into our game
- Set up a simple Event Trigger

Today, we're going to start working in VR, so let's not waste time and get to it!

Step 1: Setting Up Wireless Debugging/Consoles

Before we do anything, one of the most important things we should do is set up remote debugging, or at the very least the ability to have our console statements sent to Unity untethered. Currently, in development mode, we only get console logs from our Android device if our phone is connected to our computer. This wire would become too limiting if we must do something like spin around in a circle. To fix this, we're going to set up wireless debugging, where our phone can send data remotely to Unity. We're going to follow Unity's documentation on attaching a MonoDevelop debugger to an Android device. The instructions are straightforward, so I'll just leave the link to them. In our current state, because we have no way of restarting the game in-game, we must rebuild and run every single time we want to see the console wirelessly. However, when we re-add the ability to restart the game, wireless debugging will be more useful.
Step 2: Removing Old Scripts that Need VR Replacements

It's finally time to start working in Unity! Before we do anything, let's think about what the Google VR SDK gives us and what we must get rid of in our current system that conflicts with it. The main things the Google VR SDK provides are:

- The ability to move the camera with our head
- Its own raycasting system

What we need to remove from our game is:

- The ability to move our character
- The ability to move our camera
- The ability to shoot
- The crosshair UI

Luckily for us, this process is going to be fast and easy. First, let's remove our ability to move:

- In our game hierarchy, select the Player game object
- Select the little cog in the top right-hand corner of the Player Controller (Script) and select Remove Component

Next, let's get rid of the camera following our mouse:

- Select Player > Main Camera
- Remove our Mouse Camera Controller (Script) component

After that, let's get rid of the shooting script. We're going to come back later and re-purpose this script, but that's going to be for a different day:

- Select Player > Main Camera > MachineGun_00
- Disable the Player Shooting Controller (Script); we're still going to need this

Finally, let's get rid of the crosshair. As you recall, when we add the VR SDK, we get a gaze prefab that already adds a cursor for us:

- Select HUD > Crosshair and delete it from our hierarchy

When we're done, we'll have a completely unplayable game! Yay….

Step 3: Adding the Google VR SDK

Recalling the Google Cardboard demo, for our game we'll need to add:

- GvrEditorEmulator – to simulate head movement
- GvrEventSystem – to use Google's event system for dealing with raycasting
- GvrReticlePointer – for our gaze cursor
- GvrPointerPhysicsRaycaster – the raycaster that Google VR uses to hit other objects

The setup for this will also be very straightforward.
- Drag GvrEditorEmulator from Assets > GoogleVR > Prefabs > GvrEditorEmulator into the hierarchy
- Drag GvrEventSystem from Assets > GoogleVR > Prefabs > EventSystem into the hierarchy
- Drag GvrReticlePointer from Assets > GoogleVR > Prefabs > Cardboard to be a child of Main Camera
- Select GvrPointerPhysicsRaycaster.cs from Assets > GoogleVR > Scripts and attach it to our Main Camera

Now with these prefabs and scripts in, we can rotate and look around our game by holding Alt. We can also shoot raycasts with our VR raycaster; however, right now we don't have an Event Trigger set up on our enemies that will detect them getting hit. Let's do that!

Step 4: Setting Up an Event Trigger

Before we end today, I want to make a simple Event Trigger that allows us to defeat an enemy. Luckily for us, we already have the function available! Specifically, inside our Enemy Health script, we have a function that we call to damage an enemy. Let's set this up. For now, we're only going to change our Knight enemy. Here's what we're going to do:

- Select our Knight prefab in Assets > Prefab > Knight
- Add an Event Trigger component to our prefab
- Click Add New Event Type to select what type of event we want to listen for
- Select PointerClick
- Click + to add the object whose scripts we want to access
- Drag our Knight prefab into the empty Object slot
- Select the function to call: EnemyHealth > TakeDamage(float)
- Set the float value we pass in to 1

When we play our game now, when our gaze focuses on an enemy and we click, we'll shoot him! There are a lot of things that we're missing, like the push back, but we can start focusing on the rest of that tomorrow! Now let's do the same to the rest of our prefabs: Bandit and Zombie!

Conclusion

There we have it! Our first dive into doing some work with VR.
It turns out that right now there's a lot less code that needs to be written; instead, a lot of the work is just putting prefabs and scripts in the correct locations so our game will work. Either way, we now have a game that is playable. Tomorrow, we're going to discuss what changes we should make for a better VR experience, or at the very least, one as good as it was before we tried to VR-ify it! Phew, it's been a long day. I'll see you all tomorrow on Day 38!

Day 36 | 100 Days of VR | Day 38 Home
  20. For the past few years in a row, mobile games have dominated the app stores in terms of revenue, download numbers, and engagement. No other app niche has shown such a high level of activity and such soaring numbers as mobile games. Naturally, mobile games have also been the most happening niche in terms of new technologies and innovations. From Augmented Reality and Virtual Reality games to wearable gaming, new technologies are continuing to shape the gaming experience. Mobile game marketing has also come of age and is now more mature than ever before. The era of so-called 'freemium' games, gated features, and in-app monetisation tactics now looks commonplace, and every game marketer is on the lookout for new ways to market and generate revenue. Considering all these evolving attributes, 2018 is set to be a happening year for mobile games. Let us introduce some of the key trends that will shape mobile game development in 2018.

1. VR and AR Gaming

When Pokémon GO was released in 2016 and literally took the world by storm with its never-before-seen gaming experience, many experts didn't think twice about marking that year as the rise of VR and AR games. But that was just the beginning of the kind of mobile games that will continue to dominate the gaming scene for mobile and wearable users in the years to come. The success of Pokémon GO became a motivating factor for game developers worldwide, and such reality-defining games continued to appear, making the scene even more competitive. Though 2017 did not see any new era-defining AR or VR games like Pokémon GO, both technologies have been consolidated and have become mainstream as the gaming technologies of the future.

2. Mobile Games for the Elderly

Certainly, we no longer consider mobile games to be a child's plaything. They are equally for elderly people, grown-up youths, mature ladies, and people of all ages.
For the past several years we have seen hordes of game apps come out targeted at the elderly population or people outside of the common game-playing demographics. In 2017, there were hundreds of games for the elderly, working men and women, and all other age groups. With many games enjoying the favour of a devoted game-playing audience, this trend is here to stay.

3. Wearable Gaming

If we are not terribly wrong, wearable devices can easily be predicted to be the next mass-mobilising device platform after smartphones. Naturally, mobile gaming is also expected to shift its load to wearable gaming apps. Even if mobile games remain popular because of their large-screen gaming experience, the quick-to-play gaming experience of smartwatches will continue to grow in popularity. Moreover, by offering an extended gaming experience from the mobile device to smart wearables, they will help people stay engaged with a game irrespective of the device.

4. Social Gaming

Social gaming is already a hot trend, as most mobile games are keen to promote the gaming experience by allowing players to invite their friends onboard for a gameplay session. From a game of pool to the most complex strategy games, most games these days allow players to invite their friends on social media. Moreover, quick social registration for mobile games is already helping games gain access to more people through the social contacts of the player. By incentivising social gaming in various ways, a game app can further push players to engage their friends on social media.

5. Game Consoles Getting Outdated

Game consoles like the PlayStation and Xbox are still there, and it is no coincidence that they are actually getting better with the rise of mobile gaming.
In spite of getting better and richer to compete with mobile games, gaming consoles, with their expensive price tags and steeper learning curve, will attract fewer and fewer people from the game-playing audience. Mobile gaming, with its sophisticated high-end experience and extreme ease of use, will continue to hold a charm that no other gaming device really can. With the unprecedented rise of mobile gaming in recent times, game consoles are simply becoming a less competitive option. 6. Custom game features We are already aware of the power of customisation for engaging an audience. Custom features make a player feel important, and that is something almost everyone likes. It is no different with mobile games: letting players choose the features they enjoy gives them more reasons to stick with a particular game. Custom features that let players shape their own gaming experience have been very popular with mobile games this year. 7. Multichannel access The average smartphone user has access to several gaming devices, which is why seamless gameplay across multiple devices and platforms has become very important. Developers also find that building a cross-platform game boosts the chances of being discovered across app stores. Since engaging players continuously, irrespective of the device in use, is one of the most important considerations for a game's marketing success, allowing uninterrupted play of the game across devices is crucial. 8. A renewed interest in retro games There has been a renewed interest in old-style mobile games, at least in their look and feel. Dubbed retro games, this new breed of titles brings back the look and feel of an earlier generation of mobile games. 
This approach, which lets young players sample the gaming experience of a different generation, became quite popular throughout the year. In summation In 2017 we saw several definitive game trends unfold, opening new opportunities for marketers and developers. Among these trends, the ones above seem to have a further-reaching echo than the rest.
  21. Oculus CTO John Carmack has posted the latest Public VR Critique: Daedalus, running on the Gear VR. Daedalus is a platformer and exploration game set in a surrealist world. Carmack appears to really enjoy the game, mentioning that at "OC4, one of the VR apps presented for my review session was Daedalus...I played through the first couple training segments on stage, giving some of my common feedback, but what was noteworthy was that I didn't want to quit - I had to make myself stop to get on to the rest of the slate lined up." In the critique, Carmack covers Daedalus' gameplay, artistic feel, and the pros and cons of the art style in mobile VR. He also reviews performance on the Gear VR. Check out the full critique here.
  23. Welcome back to Day 36! Yesterday we set up our mobile device to be able to play a VR game, and there's nothing quite like that experience, especially if you're the one that "made it". If you've made it this far in the journey but haven't tried using the Cardboard or another VR device, I highly recommend trying it once! Now… with my pitch out of the way, let's talk about what we're going to do today! We finally started working in VR; today, we're going to try to convert our simple FPS into a VR experience. This will be a multi-day process. Today, I want to: Do some asset clean-up for our app so we don't have to spend forever building Set up our Google VR tools Play our game Getting the Project GitHub Repository Before we get started: I realize that not everyone (or anyone) followed me step by step to get to Day 36. Our simple FPS game has reached a complete prototype, so I've decided to make a GitHub repository of our game before we start adding the VR components in (specifically, after doing some manual clean-up work in Step 1). Now anyone can start following along to build our VR application. You can find my Simple VR First Person Shooter GitHub repository here. Step 1: (FAILED/SKIP) Clean Up Unused Assets If you haven't tried switching to the Android platform in our simple FPS game, you might notice that it… takes… FOREVER. I've tried building the app (and I waited 10+ minutes). Thank goodness for the Unity previewer; I would not know what to do if I had to build my app onto my phone every single time! Luckily, I found that Google/Unity does have a solution for this: Google has Instant Preview, which loads the game onto our phone while it also plays in the editor. Unfortunately, this appears to be only for the Daydream Viewer, so I'm going to hold off for now. However, let's see what we can do to optimize this! When I say optimize, I really mean get rid of our unused assets! 
Looking at what we have, we have 1 GB worth of files! That's not good! IMPORTANT NOTE Unfortunately, this didn't exactly work. I tried to export all our dependencies and then import them into a new project, and there were some problems. It turns out that things like Layers and Tags do not get preserved, so if we wanted everything to work, we had to manually add everything back in. Instead, I used the files I exported into a new project as a reference and then manually removed assets from a copy of our project (that's what I get for not having source control!). Also, from comparing before-and-after build times, I believe that removing unused assets DOES NOT improve our build and run time, so the only useful things that came out of Step 1 were that I: Cleared some clutter so we can find files more easily now Made the project smaller so people can download it from GitHub faster, so not a complete loss! Step 1.1: Exporting our Assets Let's get rid of our old unused files! How? Well, a quick search turns up how to get rid of unused assets in Unity. All we need to do is: Select our scenes, which in this case is just Main Right-click and click "Select Dependencies" Export our assets by going to Assets > Export Package… and save your package somewhere. Warning: this will not export Editor scripts and plugins, which in our current situation is not a problem. Now at this point, we have 2 choices: Delete everything and re-import the assets that we exported, or… Create a new project and import our exported assets I'm going to go with the 2nd choice and make a new project. Step 1.2: Importing our Exported Assets We're going to create a new project and import everything we just exported. To do that: Select File > New Project… Call your project whatever you want, but I'm going to call mine First Person Shooter VR And now we'll have a fresh new Unity project. Now we need to re-import everything we exported. 
Go to Assets > Import Package > Custom Package and find the .unitypackage that we just created Afterwards, just choose to re-import everything Step 2: Set Up Our VR Tools and Settings The next thing we need to do after we have our new project is import the Google VR SDK and configure Unity. I'm going to be lazy and just refer us to Day 35: Setting Up Google Cardboard In Unity. Do exactly the same thing there to download the SDK and set up our Unity configuration. Note: in the repo, I already included Google VR SDK 1.100.1 and the necessary changes to the Player Settings. I assume the Player Settings are project-based and not computer-based, but if they're not, follow the instructions in Day 35. Step 3: Playing Our Game At this point, we should be almost ready to play our game! So far we have: Imported the Google VR SDK Switched to the Android platform Configured our Player Settings appropriately to run a Google Cardboard app The last thing we need to do that's specific to our game project: if we try to build in Build Settings…, we run into an error about incompatibilities between our Color Space and the current settings. To fix this, we just need to change our Color Space from Linear to Gamma. To do that: Go to File > Build Settings > Player Settings > Other Settings > Color Space Change Linear to Gamma With this setting in place, we're ready to build our game. To do that, I recommend running a development build. Contrary to what the name suggests, a development build DOES NOT make our build faster; instead, it gives us access to useful debugging options such as the Profiler and Script Debugging. Now once you're ready, let's build our game! Make sure your phone is connected to your computer. Go to File > Build Settings… Enable Development Build Click Build And Run You can save the APK anywhere. Now enjoy the long wait~! Conclusion That's it for today! 
Today we cleaned up a bit of our project and then set up the game so that we can run our app directly from our phone. The build is long and horrendous, which is unfortunate. There are a couple of solutions available, but I'm going to look at them some other day. We can also play the game directly from Unity. If we were to play the game right now, we'd encounter problems. Some are design limitations: We can't move We can't reload anymore And some are actual technical problems: The enemies are rendered all black We don't have any of our UI anymore I'll have to investigate these things and solve them one at a time, but I can finally say: we're here! We're finally working in VR! That's what we're going to tackle over the next couple of days. It's going to be a fun one! Day 35 | 100 Days of VR | Day 37 Home
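As an aside, the Development Build + Build And Run clicks described above can also be scripted as a Unity editor menu item via the `BuildPipeline` API. This is only a sketch, not part of the original tutorial: the scene path `Assets/Main.unity` and output path `Builds/SimpleFPSVR.apk` are hypothetical placeholders; substitute your project's actual paths.

```csharp
// Editor-only build helper (place under an Editor/ folder).
// Sketch only: scene and output paths below are assumptions.
using UnityEditor;

public static class VRBuildHelper
{
    [MenuItem("Build/Android Development Build")]
    public static void BuildAndroidDev()
    {
        BuildPlayerOptions options = new BuildPlayerOptions
        {
            scenes = new[] { "Assets/Main.unity" },      // hypothetical scene path
            locationPathName = "Builds/SimpleFPSVR.apk", // hypothetical output APK path
            target = BuildTarget.Android,
            // Development + AutoRunPlayer mirrors ticking "Development Build"
            // and clicking "Build And Run" in the Build Settings window.
            options = BuildOptions.Development | BuildOptions.AutoRunPlayer
        };
        BuildPipeline.BuildPlayer(options);
    }
}
```

With this in the project, the same long build can be kicked off from the Build menu without re-opening Build Settings each time.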
  24. Yesterday we looked at how we can work with VR, went through the basic demo, and understood how everything worked. Today, we're going to look at how to install our game directly onto the phone. To do everything today, we need: A phone that supports Android API Level 19 (KitKat) A USB to Micro-USB (or Type-C for newer devices) cable (Optional) A Google Cardboard Today we're going to: Install the Android SDK so we can build and run our app Install a USB driver so our computer can detect our phone Set up our phone to be in developer mode Build and run the demo app on our phone With all that being said, let's get started! Today we'll be following Unity's Android SDK setup guide. Step 1: Install the Necessary Android Software Since we're building our VR app for Android, we need the required software to compile, build, and run our app on our phone. Download and install the latest Java JDK to run Android Studio Download and install Android Studio You might have to restart your computer for it to recognize the new Java JDK that you installed. When we're done downloading and installing Android Studio (which will take a LONG time), we want to open the SDK Manager. In a new project, we can find the SDK Manager under Configure. Now we'll get this: Under SDK Platforms, select the platform we want to support; in this case it's Android 4.4 for Cardboard and Android 7.0 for Daydream. However, I believe if you install the newest version, that'll work for both. Under SDK Tools, install: Android SDK Platform-Tools Android SDK Tools Google USB Driver if you have a Nexus device With all of this, we should now have everything we need to build our game onto our Android device. Step 2: Install a USB Driver to Detect our Phone The next part (and probably the part I hate the most) is installing a USB driver that allows our computer to detect our phone. 
Go to Google’s documentation to find the appropriate OEM USB driver for your phone and install it. With any luck, your computer should successfully recognize your phone when you plug it in. If not, I refer you to Google, as there are too many possibilities for what could have gone wrong. Step 3: Change Your Phone to Developer Mode Now that our computer can connect to our mobile device, the final thing we need to do is put our phone in developer mode so Unity (or Android) can create the app and install it on our phone. The instructions to enable Developer Mode vary depending on your phone; a quick Google search should give you what you need. However, the most common approach these days is to: Go to Settings > About phone > Build Number Tap the build number 7 times to enable Developer Mode Now under Settings, you should find Developer options. Go into Settings > Developer options and turn on USB Debugging Hopefully, with this step completed, we can finally move on to our configuration in Unity! Step 4: Configuring Unity to Build and Run our Android App Now that our phone is ready, it's time to finally build our game in Unity. Make sure that your phone is connected to your computer In Unity go to File > Build & Run to create an APK file (our app) and install it on our phone That's it. In a perfect world, we're done. Enjoy our VR game! Unfortunately, there are always problems that we might encounter: Your API is at the wrong level You're missing a Bundle Identifier Failed to compile resources: major version 52 is newer than 51, the highest major version supported by this compiler The 1st and 2nd problems can be resolved easily. The first occurs because we need to set the minimum Android version whose devices have the software we need to run our VR application. 
In Player Settings under Other Settings…, in Minimum API Level select API Level 19 for Google Cardboard support and API Level 24 for Google Daydream. If you choose API Level 24, just make sure that your phone can run Daydream! For the second problem: every Android app has a unique identifier that Google uses to identify the app. The error we're getting is Unity telling us that we're using the default one and should change it. In Player Settings under Other Settings…, in Package Name change the string to something else. Just make sure you follow the convention of <companyname>.<appname>. In our case, it doesn't matter what it is; we can put anything we want. Now for the third and final problem. This one is more interesting. Most likely your error is something like this: Failed to compile resources with the following parameters: -bootclasspath "C:/Users/JoshDesktop/AppData/Local/Android/android-sdk\platforms\android-24\android.jar" -d "C:\Users\JoshDesktop\git\Cardboard\Temp\StagingArea\bin\classes" -source 1.6 -target 1.6 -encoding UTF-8 "com\google\android\exoplayer\R.java" "com\google\gvr\exoplayersupport\R.java" "com\google\gvr\keyboardsupport\R.java" "com\google\gvr\permissionsupport\R.java" "com\google\vr\cardboard\R.java" "com\google\vr\keyboard\R.java" "com\Josh\Chang\R.java" "com\unity3d\unitygvr\R.java" warning: C:\Users\JoshDesktop\AppData\Local\Android\android-sdk\platforms\android-24\android.jar(java/lang/Object.class): major version 52 is newer than 51, the highest major version supported by this compiler. It is recommended that the compiler be upgraded. warning: C:\Users\JoshDesktop\AppData\Local\Android\android-sdk\platforms\android-24\android.jar(java/lang/AutoCloseable.class): major version 52 is newer than 51, the highest major version supported by this compiler. What all of this is saying is that our Java is out of date and we need at least JDK 8u52. 
In my case, I previously had 8u51 installed, and when I installed version 8u52, Unity didn't pick up on the change. To fix this: Go to Edit > Preferences > External Tools and, under Android, select JDK and choose the path to your newest JDK install. For me, on my Windows machine, it was located at C:\Program Files\Java\jdk1.8.0_152 With all of this done, hopefully you should be able to successfully build and run GvrDemo on your phone + Google Cardboard if you have one. Conclusion Hopefully, this was a useful guide to getting your Android device set up to play the scene. Leave a comment if you run into problems and I'll try to help and update this article with any new information. On a different note, it's truly amazing playing with VR on our own mobile device. Playing the VR game from Unity was interesting, but words can't describe how much more realistic and interesting it becomes once you strap your phone onto your face! I think at this point we have a good understanding of the basics and of what is and isn't possible with the Google Cardboard. Tomorrow we're going to look at how we can incorporate the VR SDK into our simple FPS game to see how our game fares in VR! Day 34 | 100 Days of VR | Day 36 Home
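As an aside, the first two fixes above (Minimum API Level and the Package Name) can also be applied from a Unity editor script instead of clicking through Player Settings. A minimal sketch, not part of the original guide; the identifier com.example.simplefpsvr is a placeholder, so substitute your own:

```csharp
// Editor-only sketch (place under an Editor/ folder): applies the two
// easy Player Settings fixes described above in one click.
using UnityEditor;

public static class AndroidSettingsFixer
{
    [MenuItem("Tools/Apply Android VR Settings")]
    public static void Apply()
    {
        // Fix 1: minimum API level (19 for Cardboard; use AndroidApiLevel24 for Daydream).
        PlayerSettings.Android.minSdkVersion = AndroidSdkVersions.AndroidApiLevel19;

        // Fix 2: replace the default bundle identifier.
        // "com.example.simplefpsvr" is a hypothetical placeholder.
        PlayerSettings.applicationIdentifier = "com.example.simplefpsvr";
    }
}
```

This is handy when you recreate the project (as we did in Step 1.2) and have to reapply the same settings.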
  25. Now that we have made a conscious decision to work in VR, today I finally had the chance to play around with VR in Unity. Today we're going to explore setting up and using Google's VR SDK. You might think that setting up VR would be an extremely complex process, but after going through it, I can say that starting out is simpler than you would think! Here's our objective for today: Setting up support for Google Cardboard in Unity Going through Unity's tutorial Let's get started! Step 1: Setting up Google Cardboard in Unity For today, I'll be going through Google's documentation for setting up VR in Unity. The nice thing about the Google VR SDK is that we can re-use most of the prefabs and scripts that are used with Google Cardboard with Google Daydream as well. That's 2 different platforms for the price of one. Today I'll be following Google's official documentation on getting started with Unity. Step 1.1: Install Unity At this point, I'm going to assume that we all have a recent version of Unity (5.6+). To be able to run our VR app, we're going to need Android Build Support; if you don't have that installed already, re-download Unity and, during the installation process, choose to include Android Build Support. Step 1.2: Adding the Google VR SDK After we have Unity set up correctly with Android Build Support, we need to get Google's VR assets. Download Google's VR SDK here. We're looking to download the .unitypackage. After we have the package downloaded, it's time to add it to a Unity project. For our case, we're going to create a new project to play around with. In Unity create a New Project (File > New Project…) Once inside our new project, import everything from the package that we downloaded. In Unity we can import by going to Assets > Import Package > Custom Package. 
Step 1.3: Configuring Unity to Run VR Now that we have imported everything we need, the last thing to do is change some settings in Unity so we can run our game. Change Our Build Settings to Build and Run Android Applications The first thing we need to do is get Unity to run our game project on the Android platform. Open the Build Settings by going to File > Build Settings. Select Android and hit Switch Platform Wait for Unity to finish re-packaging our assets for the new platform Change Our Player Settings to Support VR The next thing we need to do is change our Player Settings so that we can support the specific VR SDK that we want. In our case, that's Google Cardboard. In Build Settings, next to Switch Platform, we have Player Settings; select it. In Player Settings, enable Virtual Reality Supported and then add Cardboard to our Virtual Reality SDKs Finally, in Minimum API Level, select API Level 19 as the minimum Android version players' devices must have. Google Cardboard requires a minimum of level 19 and the Google Daydream Viewer requires a minimum of level 24. Once we have everything installed, we can finally take a first look at working with VR! Step 2: Looking Through the Unity Tutorial Now that everything is configured, we can officially start looking through Google's SDK Basics. I went through the SDK basics while also going through the GVRDemo scene. In our new project go to Assets > GoogleVR > Demos > Scenes and open GVRDemo Google provides prefabs and scripts that take care of the VR features for you. These are all located in Assets > GoogleVR > Prefabs and Scripts. Here's a breakdown of what they and the scripts attached to them do: GvrEditorEmulator prefab – Allows us to control our camera like we might with a headset. Hold down the Alt key to rotate the view around the camera. 
GvrControllerMain prefab – Gives us access to the Daydream controller, with which we can implement actions via Google's controller API to interact with the game GvrEventSystem prefab – Enables us to use Google's input pointer system; specifically, how our gaze/controller interacts with and selects objects GvrPointerGraphicRaycaster script – This script is like the normal Graphic Raycaster that we would attach to a UI canvas so that we can interact with our UI using our input devices (gaze or controller) GvrPointerPhysicsRaycaster script – This script shoots a raycast from the middle of our screen to select something when we decide to click. We should attach it to our main camera. We must also attach Unity's Event Trigger to each object we want to interact with when we select it. GvrControllerPointer prefab – This is the Daydream controller. It gives us an arm asset to imitate our controller. This prefab must be a sibling of the Main Camera object where we attached our GvrPointerPhysicsRaycaster GvrReticlePointer prefab – This is Google Cardboard's gaze controller. It creates a dot in the middle of our screen which we use to select objects in the game. For this prefab to work, we must make it a child of the Main Camera game object. There are quite a few other prefabs and scripts, but at a high level, these are the basics we'll need to make a VR game. Let's see this in action with the GvrDemo scene! 
Step 2.1: Looking at the demo scene When we open up GvrDemo, here's what we see: I suggest that you explore the scene and the objects in our hierarchy, but as a high-level summary, here's what we have in the hierarchy that's relevant to just the Google Cardboard (the scene has Daydream assets too): GvrEditorEmulator for us to emulate head movement in VR GvrEventSystem for Unity to detect our VR inputs when we select an object Inside Player > Main Camera, we have our GvrPointerPhysicsRaycaster script, which allows us to use Google's raycasting system for 3D objects Inside the Floor Canvas game object, we have the GvrPointerGraphicRaycaster for us to interact with the UI Finally, inside Player > Main Camera > GvrReticlePointer, we have our gaze cursor for Google Cardboard that we use to interact with the game world The main point of this demo is to click on the cube that appears in the game. When we click on the cube, it's randomly moved somewhere else. The interesting part of all of this is how we can trigger the code with our gaze. Let's look at the Cube and Unity's Event Trigger system. The Event Trigger system is a way for Unity to recognize an action taken on the game object that the Event Trigger is registered on. An action is something like: OnPointerClick OnPointerEnter OnPointerExit In our example, OnPointerClick is triggered whenever we click on an object that has the Event Trigger attached to it. Here's the teleport script:

// Copyright 2014 Google Inc. All rights reserved.
//
// Licensed under the Apache License, Version 2.0 (the "License");
// you may not use this file except in compliance with the License.
// You may obtain a copy of the License at
//
// http://www.apache.org/licenses/LICENSE-2.0
//
// Unless required by applicable law or agreed to in writing, software
// distributed under the License is distributed on an "AS IS" BASIS,
// WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
// See the License for the specific language governing permissions and
// limitations under the License.

using UnityEngine;
using System.Collections;

[RequireComponent(typeof(Collider))]
public class Teleport : MonoBehaviour {
  private Vector3 startingPosition;

  public Material inactiveMaterial;
  public Material gazedAtMaterial;

  void Start() {
    startingPosition = transform.localPosition;
    SetGazedAt(false);
  }

  public void SetGazedAt(bool gazedAt) {
    if (inactiveMaterial != null && gazedAtMaterial != null) {
      GetComponent<Renderer>().material = gazedAt ? gazedAtMaterial : inactiveMaterial;
      return;
    }
    GetComponent<Renderer>().material.color = gazedAt ? Color.green : Color.red;
  }

  public void Reset() {
    transform.localPosition = startingPosition;
  }

  public void Recenter() {
#if !UNITY_EDITOR
    GvrCardboardHelpers.Recenter();
#else
    GvrEditorEmulator emulator = FindObjectOfType<GvrEditorEmulator>();
    if (emulator == null) {
      return;
    }
    emulator.Recenter();
#endif  // !UNITY_EDITOR
  }

  public void TeleportRandomly() {
    Vector3 direction = Random.onUnitSphere;
    direction.y = Mathf.Clamp(direction.y, 0.5f, 1f);
    float distance = 2 * Random.value + 1.5f;
    transform.localPosition = direction * distance;
  }
}

We can ignore what the code does, but the important thing I want to bring attention to is the public functions that are available: SetGazedAt() Reset() Recenter() TeleportRandomly() Where are these called? Well, if you look back at the Event Trigger that's set up on the Cube, we set 3 event types: Pointer Enter Pointer Exit Pointer Click Then whenever any of these events occurs, we call one of our public functions. In this example, when we look at our cube, we trigger the Pointer Enter event and call the SetGazedAt() function with the variable gazedAt set to true. When we look away, we trigger the Pointer Exit event and call the SetGazedAt() function with gazedAt set to false. 
Finally, if we were to click on the cube, we would trigger the Pointer Click event and call TeleportRandomly() to move our cube to a new location. Conclusion It's surprising how uncomplicated this whole process is so far! I'm sure there are a lot more things to consider once we dive deeper into Unity; however, for today, I think the progress we have made is sufficient. Tomorrow, we're going to look at how we can get the demo app to run on a phone that supports a Google Cardboard (which I assume at this point is 99% of you guys here). Day 33 | 100 Days of VR | Day 35 Home
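As an aside, the three Event Trigger entries that the demo sets up in the Inspector can also be wired from code. A minimal sketch (this component and its helper are my own illustration, not part of Google's demo); it registers the same Pointer Enter/Exit/Click callbacks against the Teleport script's public functions:

```csharp
// Sketch: wiring the same three pointer events from code instead of the
// Inspector. Attach this to the Cube alongside the Teleport component.
using UnityEngine;
using UnityEngine.EventSystems;

[RequireComponent(typeof(Teleport))]
public class CubeEventWiring : MonoBehaviour {
    void Start() {
        Teleport teleport = GetComponent<Teleport>();
        EventTrigger trigger = gameObject.AddComponent<EventTrigger>();

        // Same mapping as the Inspector setup described above.
        AddEntry(trigger, EventTriggerType.PointerEnter, () => teleport.SetGazedAt(true));
        AddEntry(trigger, EventTriggerType.PointerExit,  () => teleport.SetGazedAt(false));
        AddEntry(trigger, EventTriggerType.PointerClick, teleport.TeleportRandomly);
    }

    // Helper: registers one callback for one event type on the trigger.
    static void AddEntry(EventTrigger trigger, EventTriggerType type, System.Action action) {
        EventTrigger.Entry entry = new EventTrigger.Entry { eventID = type };
        entry.callback.AddListener((BaseEventData data) => action());
        trigger.triggers.Add(entry);
    }
}
```

Doing it in the Inspector is fine for a demo; wiring in code becomes useful once many objects need the same event setup.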