Epic Games Releases Early Access for VR Mode on Mac

Epic Games has released early access support for developing VR applications on Mac platforms through the Unreal Engine GitHub repository. You can get access by going to the private VR on Mac branch (login required). Review the readme for instructions on how to get started.

At this year’s WWDC, Apple announced VR support for the upcoming version of macOS, High Sierra (macOS 10.13). Epic Games took the stage with ILM to demonstrate the new functionality using Unreal Engine’s VR Mode on the new iMac Pro.

Full support for VR development on macOS will arrive in upcoming Unreal Engine 4 releases. Mac VR support, together with general Metal 2 support and wide-ranging Mac optimizations, will ship in the Unreal Engine 4.18 binary tools (via the Epic launcher), starting with previews in September and the full release in early October.

Read about SteamVR for macOS, including Valve’s beta release, here.




  • Similar Content

    • By RJSkywalker
      Hello, I'm trying to design a maze using a mix of procedural and manual generation. I have the maze already generated and would like to place other objects in the maze. The issue is the maze object is created on BeginPlay and so I'm unable to view it in the Editor itself while dragging the object to the Outliner. Any suggestions?
      I'm thinking of doing something in the Construction Script or the object constructor, but I'm not sure if that would be the way to go.
      I'm still getting familiar with the Engine code base and only have a little experience in Maya or Blender since I'm a programmer.
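One way to make procedurally generated geometry visible in the editor is to run the generation from `OnConstruction` (the C++ counterpart of the Construction Script) instead of `BeginPlay`, driven by a deterministic seed so the editor preview matches the runtime result. Below is a minimal, engine-agnostic C++ sketch of that idea; `GenerateMaze` and the binary-tree carving scheme are illustrative, not Unreal API:

```cpp
#include <cstdint>
#include <random>
#include <vector>

// Each cell stores which walls have been carved: bit 0 = east, bit 1 = north.
using MazeGrid = std::vector<uint8_t>;

// Deterministic binary-tree maze: for every cell, carve east or north.
// Because generation is seeded, calling this from OnConstruction (editor time)
// and again at runtime produces the same layout.
MazeGrid GenerateMaze(int width, int height, uint32_t seed) {
    std::mt19937 rng(seed);
    MazeGrid grid(static_cast<size_t>(width) * height, 0);
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            const bool canEast  = x < width - 1;
            const bool canNorth = y < height - 1;
            if (!canEast && !canNorth) continue;  // corner cell: nothing to carve
            const bool east = canEast && (!canNorth || (rng() & 1u));
            grid[static_cast<size_t>(y) * width + x] |= east ? 1u : 2u;
        }
    }
    return grid;
}
```

In Unreal terms, you would call something like this from an actor's `OnConstruction` and spawn wall instances from the grid, keeping the seed as an `EditAnywhere` property so the maze updates when you edit it in the Outliner.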
    • By Yosef BenSadon
      Hi, I was considering this startup http://adshir.com/ for investment, and I would like a little feedback on what the developer community thinks about the technology.
      So far what they have is a demo that runs in real time on a tablet at over 60 FPS; it runs locally on the integrated GPU of an i7. They have a 20,000-triangle dinosaur that looks impressive, better than anything I have seen on a mobile device, with reflections and shadows looking very close to how they would look in the real world. They achieved this thanks to a new algorithm for a rendering technique called path tracing/ray tracing, which is very demanding and so far has been used mostly for static images.
      From what I have checked, there is no real option for real-time ray tracing (60 FPS on consumer devices). Imagination Technologies was supposed to release a chip that supports real-time ray tracing, but I did not find that they have a product on the market, or even that the technology is finished, as the last demo I found ran on a PC. The other option is OTOY with their Brigade engine, which is still not released and, if I understand correctly, is more a cloud solution than a hardware solution.
      Would there be sizable interest in the developer community in such a product as a plug-in for existing game engines? How important is ray tracing to the future of high-end real-time graphics?
    • By Terry Jin
      Hi everyone! 

      I am from an indie studio that has received funding for our concept and is ready to create the next generation 2D Pokemon-inspired MMORPG called Phantasy World. This ad is for a volunteer position but hopefully will transition into something more. Our vision is to create a game that draws inspiration from the series but is dramatically different in both aesthetics and gameplay as the work would be our own.
       
      We are hoping that you can help us make this a reality and are looking for game developers familiar with the Unreal Engine who would be happy to work on a 2D top-down game. Sprite artists are also welcome, as we are in desperate need of talented artists! Join our Discord and let's have a chat! https://discord.gg/hfDxwDX

      Here are some of our in-game sprites for playable characters moving around the game world! Hope to see you soon!
       


    • By khawk
      Notes from the session.
      Rahul Prasad, Product Manager on Daydream - Daydream SDK, Platform, VR Mode, Daydream-ready.
      Why is mobile VR development hard?
      Need to guarantee:
      - Consistent, high frame rates: 90fps on desktop, at least 60fps on mobile
      - Low motion-to-photon latency: related to framerate, and influenced by other systems on the device

      If we look at desktop VR, they have plenty of power, plenty of performance, and less worry over temperature constraints.
      In mobile, only get 4W of power (vs 500W), limited bandwidth, no fans with passive dissipation, and a market of mainstream users (vs hardcore gamers).
      Mobile VR systems are somewhere in the middle between the full hardware control of consoles and the wild west of general mobile and desktop development.
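The frame-rate targets above translate directly into hard per-frame time budgets; a small sketch of the arithmetic:

```cpp
// Per-frame budget implied by a target refresh rate:
// 90 fps (desktop VR) -> ~11.1 ms, 60 fps (mobile VR) -> ~16.7 ms.
double FrameBudgetMs(int targetFps) {
    return 1000.0 / targetFps;
}

// True if a measured frame time fits the budget for the given target.
bool FrameFits(double frameMs, int targetFps) {
    return frameMs <= FrameBudgetMs(targetFps);
}
```

Missing the budget even occasionally drops a frame, which in VR shows up as judder and adds to motion-to-photon latency.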
      Simplest solutions
      - Build for the lowest common denominator
      - Build for exactly one device; try to bring the console model to mobile
      GPU Techniques for Mobile VR
      - Assume ASTC exists on mobile VR devices: use a large block size, always use mipmaps, and avoid complex filtering
      - Implement stereo-specific optimizations: use multiview where it exists; render distant geometry once
      - Avoid submitting multiple layers: really expensive on tiled GPUs; compose multiple layers in your eyebuffer render loop prior to ARP
      - Complex pixel shaders are costly: render particles to a lower-resolution buffer; use medium precision when possible
      - Avoid large monolithic meshes: use compact, efficient chunks and front-to-back rendering (rely on engines)
      - Prefer forward rendering algorithms
      - Spread work over multiple CPU threads: more threads running slowly consume less power than few threads running quickly
      - Use MSAA: at least 2x/4x when possible; use GL_EXT_multisampled_render_to_texture; discard MSAA buffers prior to flushing
      - Buffer management: avoid mid-frame flushes; invalidate or clear entire surfaces before rendering; discard/invalidate buffers not required for later rendering; use a single clear for double-wide eye buffers

      Type of Content Affects Your Approach
      Example: Youtube VR has very different memory, modem, GPU, and CPU patterns than a high performance game. Session times vary, latency tolerances are different.
      Allocate resources appropriate for the content.
      Thermal capacity is your "primary currency". Make tradeoffs based on type of app and thermal budget.
      Typical session times:
      - Games: 20-45 minutes
      - Video: 1-3 hours
      - Text: several hours of use is important
      - Games: high GPU, medium CPU, high bandwidth, low resolution, low modem
      - Video: low GPU, medium to high CPU, high bandwidth, medium to high resolution, high modem if streaming
      - Text: low GPU, low CPU, high bandwidth, high resolution, low modem

      Bandwidth is high across all use cases.
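The per-content-type demands above can be captured as a simple lookup, which is one way to "allocate resources appropriate for the content". A sketch; the names are illustrative, and Video's "medium to high" entries are pinned to High here on the assumption of streaming:

```cpp
// Rough per-subsystem demand for each content type, mirroring the session's table.
enum class Level { Low, Medium, High };

struct Profile {
    Level gpu, cpu, bandwidth, resolution, modem;
};

enum class ContentType { Game, Video, Text };

Profile ProfileFor(ContentType t) {
    switch (t) {
        case ContentType::Game:
            return {Level::High, Level::Medium, Level::High, Level::Low, Level::Low};
        case ContentType::Video:  // assumes streaming; CPU/resolution pinned to High
            return {Level::Low, Level::High, Level::High, Level::High, Level::High};
        case ContentType::Text:
            return {Level::Low, Level::Low, Level::High, Level::High, Level::Low};
    }
    return {};
}
```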
      Thermal management is about tradeoffs:
      - Session time vs graphics
      - Spatial audio vs graphics
      - Streaming vs graphics
      - 4K decode vs graphics

      Dynamic performance scaling to manage resources:
      - Render target: scale with display resolution and content type
      - Trade resolution for antialiasing: 2x/4x MSAA; consider adjusting it dynamically
      - Use the modem judiciously: don't light up all available bandwidth; avoid streaming if possible
      - Adjust framerate dynamically: don't let the framerate float; snap to full rate or half rate; engines may help
      - If CPU limited: reduce spatial audio objects and simplify the physics simulation
      - Boost clock rates sparingly

      Technical Case Study - VR Profiling with Systrace
      Systrace comes with Android Studio and is a tool for profiling mobile Android devices. (Editor note: the session walked through a case study of using Systrace to understand performance.)
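The dynamic performance scaling advice above (scale the render target, snap rather than float the framerate) can be sketched as follows; the thresholds and step size are illustrative assumptions, not values from the session:

```cpp
#include <algorithm>

// Scale the render target when GPU frame time drifts from budget.
// The 1.1x/0.8x thresholds and 0.05 step are placeholder tuning values.
float AdjustRenderScale(float scale, double gpuMs, double budgetMs) {
    if (gpuMs > budgetMs * 1.1) scale -= 0.05f;       // over budget: shrink
    else if (gpuMs < budgetMs * 0.8) scale += 0.05f;  // headroom: grow back
    return std::clamp(scale, 0.5f, 1.0f);
}

// Don't let the framerate float: snap to full display rate or half rate.
int SnapFramerate(double gpuMs, int displayHz) {
    const double budgetMs = 1000.0 / displayHz;
    return (gpuMs <= budgetMs) ? displayHz : displayHz / 2;
}
```

In practice an engine would smooth `gpuMs` over several frames before reacting, so a single slow frame doesn't cause a visible resolution pop.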
       
    • By khawk
      Epic released a new feature trailer today as part of their GDC announcements. They're giving several in-depth talks at GDC and showcasing the power of the Unreal Engine in their booth. You can see their full GDC plan here.
       