Everything posted by 8thWallDev

  1. 8th Wall’s latest integration is with Babylon.js, a WebGL-based graphics engine that provides top-notch rendering for the web. Follow the video or the steps below to create an AR experience that runs in your mobile browser!

     1. Go to 8thwall.com and log into the 8th Wall Console (or sign up for free).

     2. Make sure your “Web Developer” workspace is selected.

     3. At the top right of the screen, click “Device Authorization” to authorize your phone to view experiences under development. Scan the QR code with your phone, and you’ll see the message: “Developer Mode On.”

     4. On the Dashboard, click “Create a new web app.” Name your web app and click “Create.”

     5. Copy 📋 your app key from the Dashboard.

     6. Go to the Quickstart page. Make sure you have NPM installed, and download or clone our 8th Wall Web public GitHub repo.

     7. In GitHub, download the entire contents of the repo as a ZIP file. Once the download is complete, click to expand the ZIP file.

     8. Add your app key to the project. Open a text editor, and underneath “gettingstarted” you’ll see an “xrbabylonjs” directory. Go into that and open up the index.html file. Find the following line and replace the X’s with the app key you copied earlier. Save 💾

        <script async src="//apps.8thwall.com/xrweb?appKey=XXXXXXXX"></script>

     9. Open a terminal window if you are on a Mac, or, if you’re using Windows, open a standard command prompt (not PowerShell). Open the folder where the project and serve script are located. Inside you’ll see a “serve” directory. In the terminal window, type “cd” (change directory) and either type the full path to the serve/ directory, or simply drag the serve folder over and it’ll enter the path for you. Hit Enter.

        [tony@TonyMBP ~/Downloads/web-master]$ cd serve/
        [tony@TonyMBP ~/Downloads/web-master/serve]$

     10. Run “npm install” and wait for the command to complete.
     This will install all of the node modules required to run the serve script on your computer. Once this is done, you’ll still be in the “serve” directory.

        # npm install

     11. Go up one directory, back into web-master. On a Mac, type “pwd” to verify your current directory. On Windows, type “cd” and hit Enter to display your current directory:

        [tony@TonyMBP ~/Downloads/web-master/serve]$ cd ..
        [tony@TonyMBP ~/Downloads/web-master]$ pwd
        /Users/tony/Downloads/web-master
        [tony@TonyMBP ~/Downloads/web-master]$

     12. Run the “serve” script, which sets up a local web server on your computer. The command is slightly different on Mac vs. PC.

        On a Mac:

        ./serve/bin/serve -n -d gettingstarted/xrbabylonjs

        Hit Enter to run the script. Once things have initialized, you’ll see a QR code you can scan to connect to the demo. Above the QR code you’ll also see the actual URL the QR code will take you to.

        On Windows, make sure you are starting from the web-master directory. Type:

        serve\bin\serve -n -d gettingstarted\xrbabylonjs

        Hit Enter. Scan the QR code on your screen. This will connect you to the local web server running on your computer.

     13. Grant camera permissions to see a basic Babylon.js demo. We will update this demo and load in a Flight Helmet 3D model provided by the Babylon.js team.

     14. Go back to your index.html, and add a new script tag for a library that handles loading 3D models:

        <script src="https://preview.babylonjs.com/loaders/babylonjs.loaders.js"></script>

        Close out the script tag and save your file 💾

     15. Edit index.js. Towards the top you’ll see an initXrScene() function which adds a few primitives to the scene: a sphere, cone, plane and box. Select all of those sections and delete them. Leave the directional light and the line that sets the initial camera position.
        const initXrScene = ({ scene, camera }) => {
          const directionalLight = new BABYLON.DirectionalLight(
            "DirectionalLight", new BABYLON.Vector3(0, -1, 1), scene)
          directionalLight.intensity = 1.0

          // < DELETE EVERYTHING BETWEEN THESE LINES >

          camera.position = new BABYLON.Vector3(0, 3, -5)
        }

     16. Below, in the runRenderLoop() function, delete the box.rotation.y line or comment it out, leaving only the scene.render() call inside this function:

        engine.runRenderLoop(() => {
          // Render scene
          scene.render()
        })

     17. Copy 📋 and paste the BABYLON.SceneLoader.ImportMesh() code below into the initXrScene() function. As a result you should have the following:

        const initXrScene = ({ scene, camera }) => {
          const directionalLight = new BABYLON.DirectionalLight(
            "DirectionalLight", new BABYLON.Vector3(0, -1, 1), scene)
          directionalLight.intensity = 1.0

          BABYLON.SceneLoader.ImportMesh(
            "", "https://models.babylonjs.com/", "flightHelmet.glb", scene,
            function (meshes) {
              meshes[0].scaling = new BABYLON.Vector3(.1, .1, .1)
            })

          camera.position = new BABYLON.Vector3(0, 3, -5)
        }

        This code imports a 3D model into your scene and scales it down to 1/10th of its original size.

     That’s it! Save your file 💾 and bring your terminal or command prompt window back to the front. Scan the QR code again, or reload the page in your browser if it’s still open. This time when the Web AR experience loads, it downloads the Flight Helmet model and you can view it in AR!

     Is there another integration you’d like to see 8th Wall support? Hit us up on our public Slack channel or tweet us @the8thwall.

     Babylon.js + 8th Wall Integration: The Full Tutorial was originally published in 8th Wall on Medium, where people are continuing the conversation by highlighting and responding to this story.
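     As an aside to step 17 above: the hard-coded 0.1 scaling happens to suit the Flight Helmet, but other models can be wildly different sizes. A rough sketch (fitModelScale is our own hypothetical helper, not a Babylon.js or 8th Wall API) shows how a uniform scale factor could instead be derived from a model's bounding-box extents:

```javascript
// Hypothetical helper (not a Babylon.js or 8th Wall API): compute a
// uniform scale factor that fits a model whose axis-aligned bounding box
// spans `extents` (x/y/z sizes in scene units) into a cube of side
// `targetSize`.
function fitModelScale(extents, targetSize) {
  const largest = Math.max(extents.x, extents.y, extents.z)
  if (largest === 0) return 1 // degenerate bounding box: leave scale alone
  return targetSize / largest
}

// Example: a model roughly 5 units on its longest axis, fit into 0.5 units.
const s = fitModelScale({ x: 2, y: 5, z: 1.5 }, 0.5)
console.log(s) // 0.1, the same factor the tutorial hard-codes
```

     Inside the ImportMesh success callback, the extents could be derived from meshes[0].getBoundingInfo() and the resulting factor applied to meshes[0].scaling in place of the fixed 0.1.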
  2. 8thWallDev

    Release 11: Image Targets & More

    We’re not going to lie, we’re pretty excited about our latest release. It includes some important new features that we’ve been working really hard on for the last few months. Since we first announced 8th Wall Web, Image Targets have been the #1 requested feature from our community (8th Wall Devs Slack Channel, @The8thWall on Twitter). We heard you loud and clear. But we didn’t want to roll out a new feature to you without it being perfect. So, we developed and tested… and tested… until we got it just right.

    Introducing: Image Targets

    Our latest update for 8th Wall Web allows AR experiences to recognize and interact with 2D image files, bringing static content to life. Not only can your designated image target trigger an AR experience, but your content can track directly to it. Simply log into your 8th Wall Web Developer dashboard and click the new Image Target button on the web app you would like to activate. Then upload your images to the 8th Wall Cloud. Here, you can edit, test and manage your image target library for each web AR experience you create.

    Unlike other web-based image recognition technology, the image detection and tracking for 8th Wall Web is performed entirely on-device, in the mobile browser. This means better tracking performance and security for the user, as no camera data is sent up to the cloud. Secondly, image tracking can work in tandem with our SLAM engine, giving you the flexibility to design experiences that interact with image targets as well as the physical space around you. We can’t wait to see what you come up with!

    Introducing: Embeds

    Now you can embed an AR experience on any website. Just click on the 🔗 icon next to your web app, connect a URL, and then click on the Embed button.
    This will generate a preview of your customizable AR View button, along with the embed code. These new features are all immediately available on your 8th Wall Web Developer account:

    + Embeds: Customizable HTML snippets for your webpage that route people directly to your web AR experience
    + Shortlinks: Provide you with 11-character URLs for your 8th Wall Web apps
    + 8.Codes: Small, stylized QR codes that work directly in your mobile camera or barcode scanner

    Now Supporting: Babylon.js

    We’re thrilled to announce the latest web AR integration that we support: Babylon.js by Microsoft, giving you even more freedom to develop using the JavaScript framework you are most comfortable with. Babylon.js is a simple and powerful JavaScript framework for building 3D games and experiences with HTML5, WebGL, WebVR and Web Audio. To get started with 8th Wall and Babylon.js, download a sample project from our public GitHub page. In addition to Babylon.js, 8th Wall Web supports Amazon Sumerian, Three.js and A-Frame. Is there another integration you’d like to see 8th Wall support? Hit us up on our public Slack channel or tweet us @the8thwall.

    Click Here to view the full list of updates.

    🙌 A big thank you to our developer community for consistently providing us with helpful insights and critiques on our software. We’re continuing to develop and iterate based on the valuable feedback that you provide us.
  3. Miller Lite Brings Packaging to Life With Web AR Campaign, No App Required

     Trigger built a Web AR experience using technology from 8th Wall Web and Geenee.

     If you were lucky enough to order a Miller Lite over St. Patrick’s Day weekend, you might’ve noticed that something was different about the packaging this year. 🍀 Mixed reality agency Trigger used 8th Wall Web and Amazon Sumerian technology, along with Geenee, to create a mobile Web AR experience that brought Miller Lite’s festive bearded man mascot to life.

     Users could visit a page on the beer brand’s website, scan the bottle in front of them, and the green-clad man would appear to hop out of the bottle as a live, 3D character. The illustration on the beer bottle served as an image target, which activated the web-based AR experience when scanned. Users got to cheers the bearded man and listen to him play an Irish folk tune on his flute. 🍻 The flash campaign only lasted through St. Patty’s Day weekend.

     Miller Lite Brings Packaging to Life With Web AR Campaign was originally published in 8th Wall on Medium.
  4. If you’ve logged into the 8th Wall console lately, you’ve probably noticed things look a little different. We’re excited to announce an updated design for 8th Wall XR, as well as a brand new console for 8th Wall Web. And, we’re pretty jazzed about our latest product, AR Camera — a slimmed down version of 8th Wall Web which features an easy drag-and-drop interface.

     AR Camera (NEW!) ✨ Features
     + Free
     + Instant web AR and prototyping tool
     + Drag-and-drop interface (no coding required)
     + Supports multiple 3D models and animations
     + Adjust light intensity, distance, scale and animation speed within console
     + Custom banner and ability to link out
     + Includes camera interface with ability to capture and save photos of 3D models
     + Get started here

     Prototype and publish to the web instantly with AR Camera!

     8th Wall Web 💡 New Features
     + Revamped Web Developer Console
     + XR Extras provides a convenient solution for: load screens and requesting camera permissions; redirecting users from unsupported devices or browsers with “You’re almost there!” messaging and prompts; runtime error handling; drawing a full screen camera feed in low-level frameworks like Three.js
     + Added public lighting and hit test interfaces to XrController
     + Other minor API additions

     8th Wall Web 🔧 Enhancements and Fixes
     + Improved app startup speed
     + Fixed a framework issue where errors were not propagated on startup
     + Fixed an issue that could occur with WebGL during initialization
     + Use window.screen interface for device orientation if available
     + Fixed a Three.js issue that could occur when the canvas is resized

     8th Wall XR 💡 New Features
     + Revamped XR Developer Console

     8th Wall XR 🔧 Enhancements and Fixes
     + Better support for Android camera permissions in Unity 2018.3

     Click Here to view the full list of updates.

     Update Now

     These features and fixes will update automatically for 8th Wall Web.
     For 8th Wall XR, simply open your Unity project and navigate to Assets / XR / 8thWall XR. Click “Check for Updates” to upgrade to Release 10.

     Is there a new feature that you’d like to see incorporated into our next release? Leave a comment below or post in our Slack channel.

     Release 10: New Year, New Product, New Interface was originally published in 8th Wall on Medium.
  5. Sony Pictures presents “Spider-Man™: Into the Spider-Verse” Web AR Experience powered by 8th Wall and Amazon Sumerian technologies and produced by Trigger.

     If you’ve ever wanted to snap photos with Spider-Man, now you can! 8th Wall is thrilled to have partnered with Trigger — The Mixed Reality Agency — and AWS Amazon Sumerian to produce the mobile web augmented reality (AR) experience for Sony Pictures’ Spider-Man™: Into the Spider-Verse. The Spider-Verse Web AR Experience is designed to allow users to easily jump directly into Spider-Man’s AR world on any smartphone, without having to download an app. There, they can interact with the crime-fighting superhero, take pictures with him, and instantly share with friends.

     AR for the mobile web is changing the way users connect with their favorite brands. Now, users can immediately pull a realistic digital character into their own environment and interact with it, creating a more authentic connection.

     “The Spider-Verse Web AR Experience demonstrates how established brands can dramatically enrich their customer experiences and better engage with their fans,” said Erik Murphy-Chutorian, CEO at 8th Wall. “Augmented reality allows consumers to dive deeper into the worlds of their favorite products and characters. Sony Pictures has deepened and enriched the Spider-Man experience with the innovative work they’ve produced with Trigger and 8th Wall, and powered by AWS. It’s the perfect example of how AR for the web is the best new medium for brands to make their content come to life while increasing the accessibility and interactivity of their stories.”

     Spider-Man™: Into the Spider-Verse hits theaters nationwide on December 14, 2018. Be part of the experience by visiting www.intothespiderverse-ar.com.
     Share your Spidey pics with us by tweeting @the8thwall 🕷

     Sony Pictures Takes Augmented Reality Users “Into the Spider-Verse” With Mobile Web AR Experience was originally published in 8th Wall on Medium.
  6. 8th Wall Web demo using Samsung Internet browser at SDC 2018.

     We had a blast demoing 8th Wall Web on the Samsung Internet browser at the Samsung Developer Conference last week. Digital robot JINI greeted attendees in our latest immersive web experience, JINI Cam, where users posed and snapped photos of him in augmented reality.

     JINI Cam works on all mobile browsers, and we wanted to show off our AR magic using super-speedy Samsung Internet for Android. “8th Wall Web running in the Samsung Internet browser demonstrates what a modern mobile web browser is capable of today in the hands of highly skilled AR creators,” said Laszlo Gombos, the Senior Director of Web Platform at Samsung. “8th Wall enables AR for the web today running on today’s mobile browsers.”

     If you haven’t had a chance to try out the Samsung Internet browser yet, you can download it here. You can play with our interactive AR camera, JINI Cam, by opening this link on your tablet or smartphone. Make sure to share your favorite JINI pics with us by tweeting @the8thwall 📸

     Interactive AR on the Samsung Internet Browser was originally published in 8th Wall on Medium.
  7. 8thWallDev

    8th Wall XR Release 9

    It’s alive! 🤖 Release 9 is now active. Here’s what you can expect:

    New Features
    + All-new GPU-based pipeline delivers up to 4x faster tracking performance with improved tracking stability
    + High-resolution camera texture support (1080p)
    + Custom ARSessionDelegate feature allows ARKit-enabled devices to use the latest ARKit features
    + Building against Xcode 10 with support for iPhone XR, XS, and XS Max
    + ARKit 2.0 support
    + ARCore 1.5 support
    + Added support for ARCore camera auto-focus
    + Added Pause(), Resume() and IsPaused() functions to the XRController API

    Enhancements and Fixes
    + Fixed a landscape-mode rendering issue on iPhone 5c

    Click Here to view the full list of updates.

    Update Now

    To take advantage of the Release 9 features and fixes, simply open your Unity project and navigate to Assets / XR / 8thWall XR, then click “Check for Updates” to upgrade to Release 9.

    New to 8th Wall? Learn how to create an account and get started with AR development here.

    8th Wall XR Release 9 was originally published in 8th Wall on Medium.
  8. We recently launched 8th Wall Web, the first solution of its kind for augmented reality (AR) in the mobile browser. As the 8th Wall Director of Engineering, I wanted to share with you why I’m so proud of the team, and why I’m really excited about what we accomplished.

     8th Wall Web makes interactive AR for the web possible through three main components: 8th Wall’s SLAM engine, a high throughput camera application framework, and bindings with common web 3D engine frameworks. 8th Wall Web is built entirely on standard web technologies such as the JS MediaDevices API, Sensor API, WebGL and WebAssembly.

     AR for everyone

     At 8th Wall, we believe augmented reality is for everyone. We’re on a mission to make immersive AR content accessible to all, regardless of what device you’re using. We launched 8th Wall XR to grow the audience of AR apps. 8th Wall XR taps the full power of native AR frameworks when available, and expands beyond the 100 device models that ARKit and ARCore support today. AR apps powered by 8th Wall reach users on over 3,400 different types of phones and tablets. Our data shows this increases an app’s reach tenfold on Android.

     AR is an incredible new capability that will alter the way we interact with devices, with enormous room for growth. By optimizing viral flows, top-performing AR apps have the potential to reach an audience size 10,000 times larger than today. One reason for today’s limited virality is AR inoculation: if only 1 out of 10 users can experience AR apps, every fan of an app needs to share it with ten times as many friends in order for the app to go viral. We developed 8th Wall XR to lower the barrier to virality and help creators maximize the reach and success of their AR apps. We do this by making AR work for everyone.

     Another reason for limited virality in AR is app-installation friction. Many users will click on web links shared by their friends, but only a few will install an app.
     In a given month, most smartphone users will install zero apps, according to a 2017 report by comScore MobiLens. The web is a friction-free way for users to share and engage with content. 8th Wall’s browser-based AR demo, JINI, was used by seven times as many people on its launch day as our first promotional native app. By bringing 8th Wall XR to the web for everyone, we believe we can help creators maximize their reach to new audiences with AR.

     AR for the web

     8th Wall SLAM is a 6DoF markerless tracking system with feature point extraction, sensor fusion, robust estimation, triangulation, mapping, bundle adjustment and relocalization. 8th Wall SLAM is loaded as a browser script, just like any other web software. To create content for AR apps, we leverage 3D frameworks like WebGL, three.js, A-Frame, or Amazon Sumerian. Depending on the rendering system, 8th Wall Camera apps can run hosted in an external JavaScript animation run loop, in parallel with another JavaScript run loop, or in their own JavaScript run loop.

     Connecting the SLAM and app layers is the 8th Wall JavaScript Camera apps framework, which orchestrates camera capture, computer vision processing, rendering, and display. We designed it as a platform that can accommodate many types of AI-driven camera apps. For example, the 8th Wall Camera framework could drive apps that use face processing, hand detection, target image tracking, or object recognition.

     What we achieved

     Before we started building 8th Wall Web, we thought it would be impossible to achieve performant SLAM tracking using web-native technologies; but we also believed it would be the best way to help AR reach more users. What followed was the most daunting, technically challenging, and exciting experience of my career.
     We started with our custom-built, highly-optimized mobile SLAM system. Over the course of the 8th Wall Web project, we exceeded what we thought was possible, further improving its performance in JavaScript by over 5 times, so that even on some phones released four years ago we are able to achieve a 30 FPS runtime.

     JavaScript benchmark performance of the 8th Wall SLAM JavaScript engine on iPhone 6, 6s, 7, and X, by the date they were released.

     I’m incredibly proud of the 8th Wall team and what we achieved, and I wanted to share some of our experiences that made it possible.

     How we did it

     Build, measure, test, benchmark, repeat

     The 8th Wall XR for Unity package is produced from code written in C++, Objective-C, Java and C#. It is built to run on multiple ARM and x86 architectures across Android and iOS, as well as within the Unity Editor on OSX and Windows. To manage this complexity, we created a highly customized set of crosstools and Starlark extensions for the Bazel build system. To adapt our Unity build for the web, we added further Bazel functionality to transpile C++ code to asm.js and WebAssembly, and to produce transpiled, minified, uglified JavaScript targets with NPM and C++ dependencies. We also created custom Bazel rules for cross-platform JavaScript tests. These allowed us to run our existing C++ unit tests and micro benchmarks in Node, Safari and Chrome, both on desktop and on mobile phones, directly from the command line.

     8th Wall SLAM is instrumented from the ground up with custom, efficient profiling code that measures the speed of all subsystems. This leverages C++ scoping rules to generate call traces on every frame processed by our computer vision engine. These are then distilled into compact statistics that can provide feedback on the performance of fine-grained sections of code across runs and devices.
     To optimize for the web, we created a JavaScript benchmark suite that served as an end-to-end performance and quality integration test. A key challenge for the benchmark suite was loading large datasets quickly to promote rapid iteration and evaluation. To achieve rapid benchmarking, we built a custom JavaScript file system to overcome browser cache, database, and memory limitations. In addition to evaluating changes at development time, we developed infrastructure to run our benchmark suite every hour on test devices and to log results to a server. This allowed us to quickly identify the source of performance or quality regressions.

     Optimize, optimize, optimize

     Once we had visibility into the performance and quality of our code, we began searching for areas where optimization was likely to have an impact. Some of these were amenable to classic optimization of algorithms and data structures. For example, we developed a new custom sorting algorithm for choosing the subset of feature points to process for mapping. Some optimizations were known and reported in the computer vision literature, e.g. customizing cost function derivatives for bundle adjustment. Other optimizations were novel. For example, we achieved a 10x retrieval speedup for locality-sensitive hashes compared to standard libraries by optimizing for our specific operation domain.

     We also optimized sections of C++ code to perform well specifically in transpiled JavaScript. For example, we found that JavaScript execution was particularly slow for sections of C++ code that read from one buffer while writing to another. Rewriting core algorithms to minimize this load-store pattern helped some sections of code significantly. Two versions of the same code.
     The second runs 15% slower in compiled C++, but runs 35% faster when transpiled to JavaScript.

     Another finding was that calling C++ methods from inner loops caused many temporary variable allocations, triggering frequent JS garbage collection events. Inlining a few critical methods in C++ led to a substantial reduction in JavaScript memory management overhead.

     Rearchitect for the web

     In native C++ code, our SLAM engine uses C++11 threads, atomics, conditions and memory barriers to utilize the multiple CPU cores available on modern mobile devices to speed up processing. In JavaScript, only one thread is available to the web application. This constraint forced us to fundamentally rethink how the overall flow of a camera application should be architected. We moved as much processing as possible from the CPU to WebGL in order to maximize parallel computation within the constraints of JavaScript. One example was feature point detection. This came with an unexpected benefit: when feature point extraction runs on the CPU, it must follow certain algorithmic constraints that make it run very efficiently. On the GPU, different performance characteristics allowed us to improve the robustness and quality of feature detection in ways that would have been prohibitive before.

     In the browser, all computer vision, game logic, and rendering must be done within a single animation frame. We found browsers were more effective at making parallel use of the GPU in certain parts of this cycle. This forced us to modularize code into blocks that could be easily rearranged to run in different portions of a single animation frame to maximize efficiency. It was important to us that 8th Wall Web runs well on as many devices as possible.
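     The load-store pattern mentioned earlier can be made concrete with a small illustrative example of our own (not the code from the original post): both functions compute the same 3-tap average, but the second carries neighboring values in locals, so each iteration performs one load and one store instead of three loads and a store interleaved.

```javascript
// Version 1: reads `src` three times while writing `dst` every iteration,
// the interleaved load-store pattern described in the article.
function smoothTwoBuffers(src, dst) {
  for (let i = 1; i < src.length - 1; i++) {
    dst[i] = (src[i - 1] + src[i] + src[i + 1]) / 3
  }
}

// Version 2: same result, but neighbors rotate through locals, so each
// iteration does a single load from `src` and a single store to `dst`.
function smoothLocals(src, dst) {
  let a = src[0], b = src[1]
  for (let i = 1; i < src.length - 1; i++) {
    const c = src[i + 1]       // one load per iteration
    dst[i] = (a + b + c) / 3   // one store per iteration
    a = b
    b = c
  }
}

const src = Float64Array.from([0, 3, 6, 9])
const out = new Float64Array(src.length)
smoothLocals(src, out)
console.log(Array.from(out)) // [0, 3, 6, 0]
```

     Which version actually wins depends on the engine and platform; the point of the article's comparison is precisely that native and transpiled performance can diverge, so hot sections have to be measured in both forms.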
     To keep performance high on older device classes, we built algorithms that detect performance hiccups and dynamically restructure the processing pipeline to spread CPU load across multiple animation frames.

     What’s next

     These are only a few of the changes that allowed us to bring AR technology to the web. Over the course of this project, we improved the speed of our SLAM engine by 5 times while also improving tracking quality, but we’re not done yet. We’re hard at work integrating these same great updates into 8th Wall XR for native apps, as we continue to improve the way people experience AR across devices. We are committed to our goal of bringing AR to everyone. We look forward to pushing the boundaries of what’s possible, and to helping creators bring their unique voices and amazing experiences to wider audiences than ever.

     How We Engineered AR for the Mobile Browser with 8th Wall Web was originally published in 8th Wall on Medium.
  9. JINI, the alien robot hologram, is here to grant you three wishes.

     It’s official! 8th Wall’s AR technology can now run on any mobile web browser, and we couldn’t be prouder. Check it out for yourself!

     We’re extending AR from the app marketplace to the web. With 8th Wall Web, immersive AR content that could previously only be experienced by downloading an app can now run directly on any existing website. That means that if you have a smartphone with a camera and a web browser, you’ve got AR. This is kind of a big deal. Brands, for the first time, will be able to offer seamless interactive AR experiences on their websites. With 8th Wall Web, industries including ecommerce, advertising, gaming, education and entertainment can now reach billions of people within their existing mobile web browsers.

     The tech that brought JINI to life: built entirely using standards-compliant JavaScript and WebGL, 8th Wall Web is a complete implementation of 8th Wall’s Simultaneous Localization and Mapping (SLAM) engine, hyper-optimized for real-time AR on mobile browsers.

     We’re thrilled to be partnering with some innovative brands who will be the first to bring AR to the mobile web, and we can’t wait to show you our progress. If you’re already using 8th Wall XR, our cross-platform AR developer tool for mobile apps, expect to see some major optimizations to our SLAM engine with Release 9, coming this fall.

     Try out the JINI demo for yourself and let us know what you think! Tweet us @the8thwall or leave a comment below.

     Introducing 8th Wall Web: Augmented Reality That Lives in Your Browser was originally published in 8th Wall on Medium.
  10. 8thWallDev

    8th Wall XR Release 8

    Release 8 is now live and active! 🎉 Here’s what you can expect:

    New Features
    + Support for ARCore 1.3 with Vertical Surfaces and Image Targets
    + Support for XR Remote in non-AR scenes
    + Added ability to use 8th Wall SLAM on ARKit & ARCore devices
    + Added ability to set initial camera facing direction
    + Added ability to increase camera field of view in non-overlay scenes

    Enhancements and Fixes
    + Improved 8th Wall SLAM accuracy with enhanced Bundle Adjustment and Point Triangulation
    + Fixed Surface Controller mesh orientation for deformable meshes
    + Improved behavior of all XRSurfaceController option combinations
    + Miscellaneous bug fixes and improvements

    Click Here to view the full list of updates.

    Update Now

    To take advantage of the Release 8 features and fixes, simply open your Unity project and navigate to Assets / XR / 8thWall XR, then click “Check for Updates” to upgrade to Release 8.

    New to 8th Wall? Learn how to create an account and get started with AR development here.

    8th Wall XR Release 8 was originally published in 8th Wall on Medium.
  11. Whether you’re a seasoned mobile app developer looking to dabble in AR, or a designer exploring the world of 3D, this tutorial is for you.

     1. Sign up for 8th Wall. Go to 8thwall.com to create your free account.

     2. Download your tools. Xcode (or Android Studio), Unity®, 8th Wall XR for Unity®, and the Demo Unity Project can be found here.

     3. Add 8th Wall XR to Unity®. Uncompress xr-unity-master.zip or find the cloned xr-unity repository. Within that folder, open the Demo Unity Project:

        open projects/8thWallXR-Demo/Assets/Scenes/Main.unity

        After Unity loads, select the Demo Unity Project to open it. You may receive a warning when opening the project; just click Continue. Download 8th Wall XR for Unity and add it to your Unity project by opening the .unityplugin file. A progress bar will appear as it's loaded. Once finished, a window will display the contents of the XR package. Leave all of the boxes checked and click 'Import.'

     4. Generate your app key. Inside your 8th Wall Dashboard, click on Get an App Key, then +Create a New App Key. This will be the bundle identifier for the app you will be building. Add your App Key to the Unity Project:
        + On the Applications page, copy the App Key for your application
        + In Unity, go to XR → App Key Settings
        + Paste your key into the App Key field
        + Verify that the bundle identifier of your app matches com.your-company-name.XRDemoApp
        + Go to File → Save Project to save these settings

        IMPORTANT: Before moving to the next step, please navigate away from the XRAppSettings panel. Simply select any asset or GameObject in your scene (e.g. click on Main Camera). There is a race condition in Unity related to AssetDatabase.SaveAssets() that can cause Unity to crash.

     5. Build and run your AR app. In Unity, go to File → Build Settings and click “Add Open Scenes.” Under Platform, select either iOS or Android, then click “Build and Run.” And, voila!
You’ve just built your very own Augmented Reality app! 🙌🏾 Follow the full tutorial here to learn how to manipulate features and controllers.

How to build an Augmented Reality app in less than 15 minutes was originally published in 8th Wall on Medium, where people are continuing the conversation by highlighting and responding to this story. View the full article
  12. Written by Erik Murphy-Chutorian, Founder and CEO of 8th Wall. This story was previously published on The Next Web.

This year will be the year that we see the first applications designed for an AR-powered, mobile camera-first world. I’ve heard many discussions about how to accelerate AR development and adoption, but will everyone’s panacea be the cloud? While I believe cloud services will be essential to AR’s success and will play a big role in its evolution, contrary to popular belief, the cloud isn’t the gating factor. In my view, the roadblocks to AR adoption are design, reach, and teaching people how to rely on what AR can do.

Let’s take a step back and look at how interfaces have evolved. Computers have gained the ability to see and hear, and these virtual senses will usher in a new era of natural user interfaces. Touchscreens will soon be on the way out, in the same way that the keyboard and mouse are now. I see the writing on the wall: we snap photos instead of texting, ask Alexa for our news and weather, and e-shop by seeing how products fit in our homes. As environmental understanding becomes more sophisticated on our phones, I believe user interfaces will start to interact with that environment.

However, the cell phone is an accidental user interface for AR, and as such, we have to bridge the form factor and experience with something that is familiar to how people already use their phones. Not everyone will use the term “AR” to describe what is going on, but I trust that 2018 will be the year consumers experience AR, and it’s going to happen on their mobile phones. The best part of mobile AR? There’s nothing to strap on your face: just hold up a phone and open an app.

Before we give up our touchscreens, much more will need to happen, and contrary to previous discussion in the industry, a lack of specialized cloud services is not what’s holding back the transition to these camera-centric AR apps.
The big hurdle to overcome is getting acclimated to designing with AR as the first medium, as opposed to a secondary or add-on channel or feature. AR-first design will be key to creating successful everyday apps for new, natural types of user interaction.

How did we get to mobile AR?

The hype around mobile AR began last September when Apple launched ARKit, now one of a handful of new software libraries that allow mobile developers to add augmented reality features to standard phone apps. These libraries offer virtual sensors that provide information about the environment and precisely how a phone is moving through it. For mobile developers, this is an opportunity to be first to design and build new, intuitive user experiences that can disrupt how we interact with our phones. In the same way that desktop websites were redesigned for a mobile-first world, we will soon see camera-enabled physical interactions become the norm for many types of apps, including ecommerce, communication, enterprise, and gaming.

Where are the killer apps?

People use their phones for email, news, communication, entertainment, shopping, navigation, gaming, and photography. Mobile AR isn’t going to change that. More likely, many of the killer AR apps will be the very same apps we already use today, after they have been redesigned for AR. Companies that are slow to embrace this technology will be ripe for disruption. It’s happening already. Snapchat was first into the AR space and redefined how a younger generation communicates. Facebook and Google followed suit, and now Amazon, IKEA, Wayfair, and others are dipping their toes into the pool of AR. Niantic recently acquired an AR startup too; can we hope to see the physical world merge with the Wizarding World? What startups will innovate where the incumbents are slow to change? Will 2018 bring us a successor to maps, email, or photos?
The AR Cloud is not the missing piece

Modern apps rely on internet connectivity, big data, and location to round out their functionality. AR apps are no different in this respect, and in the same way we use Waze and Yelp to provide local, crowdsourced information about our environment, we will continue to do so when these apps are rebuilt for AR-first design.

In today’s tech-speak, the “AR Cloud” is a set of backend services built to support AR features like persistence and multiplayer. These services consist of distributed databases, search engines, and algorithms for computer vision and machine learning. Most are well-scoped engineering projects, and their success will depend on their speed of delivery and quality of execution. Technology behemoths and AR startups are competing to build these cloud services, with some of the heavily invested players going so far as to say “Apple’s ARKit is almost useless without the AR Cloud.” In reality, it is quite the opposite: a single, well-designed AR mobile app can succeed immediately, but AR Cloud solutions can’t gain traction until enough top mobile apps are designed for AR. Ensuring that happens quickly is critical to their success.

AR is limited today by a lack of design principles

We need to think about how to design for mobile AR, and specifically how to design mobile apps for this new camera-first world. How do we break away from swipes, 2D menus, and the like, now that we can precisely track and annotate real objects in the world? AR technology has created an entirely new set of options for how we can interact with our phones, and from these we need to design AR interactions.

To better understand how we should think about AR-first design, my team and I recently conducted an AR user study to understand people’s experiences with the first crop of mobile AR apps. This resulted in the following AR interaction guidelines, which are by no means an exhaustive list:

- Prefer curvilinear selection for pointing and grabbing. By using a gentle arc instead of a straight line, people can select distant objects without their cursor jumping as it nears the horizon.
- Keep AR touch interactions simple. Limit gestures to simple one-hand operations, since one hand is dedicated to holding and moving the phone. Avoid dwell clicking (hovering on a selected object for a period of time), as this selection mechanism is slow and generally leads to unintended actions.
- Initialize virtual objects immediately. People expect AR apps to work seamlessly, and the surface calibration step found in many ARKit apps breaks the flow of the application.
- Ensure reliability. Virtual objects should appear in consistent locations, and being able to accurately select, move, and tether objects is important if these interactions are provided.
- Balance fixed on-screen UI with in-context AR UI. Users shouldn’t need to “hunt” for UI elements in their environment.

Before we can capitalize on cloud features for AR, we need to determine how to implement these and other guidelines as a uniform set of user interactions that are natural and fluid.

Looking forward to the year of mobile AR

I feel strongly that 2018 will be the year we see the first applications designed for an AR-powered, camera-first world. The first developers to build them will have a strong first-mover advantage on the next generation of applications, communication platforms, and games. I believe cloud services will be essential to this success and will play a big role in its evolution, but my view is that other challenges remain around design, reach, and teaching people how to rely on this new technology. In the true tradition of mobile technology, it won’t take long before new startups and tech behemoths defy what everyone once thought was or was not possible in this space. It’s not an AR Cloud or a killer new use case that will make AR successful.
My take is that AR-first design, where we prioritize the AR experience over traditional 2D interfaces, will be the key to unlocking mobile AR. The first developers to build these apps will have a strong first-mover advantage on the next generation of applications, communication platforms, and games. Design is the only thing holding mobile AR back was originally published in 8th Wall on Medium, where people are continuing the conversation by highlighting and responding to this story. View the full article
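The “curvilinear selection” guideline in the article above is easier to appreciate with numbers. Here is a minimal, illustrative sketch — not 8th Wall’s actual implementation; the camera height, the bend constant, and the parabolic arc shape are all assumptions — comparing where a straight ray and a gently bent arc land on a flat ground plane as the phone pitches toward the horizon:

```python
import math

H = 1.4  # assumed camera height above the ground, in meters

def straight_ray_hit(pitch):
    """Distance where a straight ray, pitched `pitch` radians below the
    horizon, hits the ground plane. Diverges as pitch approaches zero."""
    return H / math.tan(pitch)

def arc_hit(pitch, bend=0.05):
    """Cast along a parabola y = H - tan(pitch)*x - bend*x**2 instead of
    a straight line. The constant downward bend guarantees a finite hit
    distance even when the phone is aimed at the horizon."""
    s = math.tan(pitch)
    # Solve bend*x**2 + s*x - H = 0 for the positive root.
    return (-s + math.sqrt(s * s + 4 * bend * H)) / (2 * bend)

for deg in (30, 10, 2):
    p = math.radians(deg)
    print(f"{deg:2d} deg: ray {straight_ray_hit(p):8.1f} m, arc {arc_hit(p):6.1f} m")
```

Near the horizon the straight ray’s hit point races off toward infinity, so a small wobble of the phone moves the cursor by tens of meters, while the arc’s hit point stays within a few meters of the user — exactly the cursor-jump problem the guideline describes.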