UBM Releases the VR/AR Innovation Report for VRDC

UBM's VRDC surveyed over 600 professionals in the VR/AR/MR space to get developers' perspectives on the industry and the road ahead. The report findings include:

  • VR will be a profitable, sustainable industry in the mid- to long-term
  • Rise in popularity of HTC Vive and Oculus Rift among developers
  • Platform exclusives are becoming a bit more common

Also of interest are the drop in external funding from angel investors and VCs, and the belief that AR will be more popular than VR in the long term.

Learn more at http://reg.vrdconf.com/VRDC-2017-Innovation-Report.

 



  • Similar Content

    • By khawk
      Notes from the session.
      Rahul Prasad, Product Manager on Daydream - Daydream SDK, Platform, VR Mode, Daydream-ready.
      Why is mobile VR development hard?
      Need to guarantee:
      - Consistent frame rates: high frame rates are required, 90 fps on desktop and at least 60 fps on mobile
      - Low motion-to-photon latency: related to framerate and influenced by other systems on the device
      Desktop VR, by contrast, has plenty of power, plenty of performance, and less worry over temperature constraints.
      In mobile, only get 4W of power (vs 500W), limited bandwidth, no fans with passive dissipation, and a market of mainstream users (vs hardcore gamers).
      Mobile VR systems sit somewhere between the full hardware control of consoles and the wild west of general mobile and desktop development.
      Simplest solutions
      - Build for the lowest common denominator
      - Build for exactly one device; try to bring the console model to mobile
        GPU Techniques for Mobile VR
      - Assume ASTC exists on mobile VR devices: use a large block size, always use mipmaps, and avoid complex filtering
      - Implement stereo-specific optimizations: use multiview where it exists; render distant geometry once
      - Avoid submitting multiple layers: really expensive on tiled GPUs; compose multiple layers in your eye-buffer render loop prior to ARP
      - Complex pixel shaders are costly: render particles to a lower-resolution buffer; use medium precision when possible
      - Avoid large monolithic meshes: use compact, efficient chunks and front-to-back rendering (rely on engines)
      - Prefer forward rendering algorithms
      - Spread work over multiple CPU threads: more threads running slowly consume less power than a few threads running quickly
      - Use MSAA: at least 2x/4x when possible; use GL_EXT_multisampled_render_to_texture; discard MSAA buffers prior to flushing (see the sketch after this list)
      - Buffer management: avoid mid-frame flushes; invalidate or clear entire surfaces before rendering; discard/invalidate buffers not required for later rendering; use a single clear for double-wide eye buffers
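      As a concrete illustration of the MSAA point, here is a minimal sketch of setting up an eye framebuffer with EXT_multisampled_render_to_texture and then discarding the dead attachments. It assumes a GLES context where the extension is present and its entry points have been loaded via eglGetProcAddress; names like CreateEyeFramebuffer are this sketch's own, not from the session:

      #include <GLES3/gl3.h>
      #include <GLES2/gl2ext.h>   // EXT_multisampled_render_to_texture typedefs

      // Extension entry points, assumed loaded once elsewhere via
      // eglGetProcAddress("glFramebufferTexture2DMultisampleEXT") etc.
      extern PFNGLFRAMEBUFFERTEXTURE2DMULTISAMPLEEXTPROC glFramebufferTexture2DMultisampleEXT_;
      extern PFNGLRENDERBUFFERSTORAGEMULTISAMPLEEXTPROC glRenderbufferStorageMultisampleEXT_;

      GLuint CreateEyeFramebuffer(GLsizei w, GLsizei h, GLsizei samples,
                                  GLuint* outColorTex) {
          // Single-sampled color texture; the MSAA resolve happens in tile
          // memory, so no separate resolve target or blit is needed.
          glGenTextures(1, outColorTex);
          glBindTexture(GL_TEXTURE_2D, *outColorTex);
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                       GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

          GLuint fbo;
          glGenFramebuffers(1, &fbo);
          glBindFramebuffer(GL_FRAMEBUFFER, fbo);
          glFramebufferTexture2DMultisampleEXT_(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                                                GL_TEXTURE_2D, *outColorTex, 0, samples);

          // Multisampled depth never needs to leave tile memory.
          GLuint depthRb;
          glGenRenderbuffers(1, &depthRb);
          glBindRenderbuffer(GL_RENDERBUFFER, depthRb);
          glRenderbufferStorageMultisampleEXT_(GL_RENDERBUFFER, samples,
                                               GL_DEPTH_COMPONENT16, w, h);
          glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                                    GL_RENDERBUFFER, depthRb);
          return fbo;
      }

      // After drawing an eye (fbo still bound): mark depth as dead so the
      // tiler never writes it back to main memory, i.e. the "discard prior
      // to flushing" advice. On ES 2.0 use glDiscardFramebufferEXT instead.
      void DiscardDepthAfterEyeRender() {
          const GLenum dead[] = { GL_DEPTH_ATTACHMENT };
          glInvalidateFramebuffer(GL_FRAMEBUFFER, 1, dead);
      }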
      Type of Content Affects Your Approach
      Example: YouTube VR has very different memory, modem, GPU, and CPU patterns than a high-performance game. Session times vary, and latency tolerances differ.
      Allocate resources appropriate for the content.
      Thermal capacity is your "primary currency". Make tradeoffs based on type of app and thermal budget.
      Session times vary by content type: games run 20-45 minutes per session, video 1-3 hours, and for text several hours of use is important.
      - Games: high GPU, medium CPU, high bandwidth, low resolution, low modem
      - Video: low GPU, medium-to-high CPU, high bandwidth, medium-to-high resolution, high modem if streaming
      - Text: low GPU, low CPU, high bandwidth, high resolution, low modem
      Bandwidth is high across all use cases.
      Thermal management is about tradeoffs:
      - Session time vs. graphics
      - Spatial audio vs. graphics
      - Streaming vs. graphics
      - 4K decode vs. graphics
      Dynamic performance scaling to manage resources:
      - Render target: scale with display resolution and content type
      - Trade resolution for antialiasing: 2x/4x MSAA; consider adjusting it dynamically
      - Use the modem judiciously: don't light up all available bandwidth; avoid streaming if possible
      - Adjust framerate dynamically: don't let the framerate float; snap to full rate or half rate (engines may help; see the sketch after this list)
      - If CPU limited: reduce spatial audio objects and simplify the physics simulation
      - Boost clock rates sparingly
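      A minimal sketch of the snap-to-full-or-half-rate idea, assuming a 60 Hz display and engine hooks for render scale and target framerate; State, renderScale, and targetFps are hypothetical names, not from the session:

      #include <algorithm>

      struct State {
          float renderScale = 1.0f;  // eye-buffer scale relative to display
          int   targetFps   = 60;   // snapped: full rate or half rate only
      };

      // Called periodically with a rolling average of GPU frame time.
      void AdjustPerformance(float avgGpuMs, State& s) {
          const float fullBudget = 1000.0f / 60.0f;  // ~16.6 ms at full rate
          if (avgGpuMs > fullBudget) {
              if (s.renderScale > 0.7f)
                  s.renderScale -= 0.05f;  // shrink the render target first
              else
                  s.targetFps = 30;        // then snap to half rate, never float
          } else if (avgGpuMs < 0.75f * fullBudget) {
              if (s.targetFps == 30)
                  s.targetFps = 60;        // recover the frame rate first
              else
                  s.renderScale = std::min(1.0f, s.renderScale + 0.05f);
          }
      }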
      Technical Case Study - VR profiling with Systrace
      Systrace comes with Android Studio and is a tool for profiling Android devices. (Editor's note: the session walked through a case study of using Systrace to understand performance.)
       
    • By mc_wiggly_fingers
      Is it possible to asynchronously create a Texture2D using DirectX11?
      I have a native Unity plugin that downloads 8K textures from a server and displays them to the user for a VR application. This works well, but there's a large frame drop when calling CreateTexture2D. To remedy this, I've tried creating a separate thread that creates the texture, but the frame drop is still present.
      Is there anything else I could do to prevent that frame drop from occurring?
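      For what it's worth, ID3D11Device methods (unlike ID3D11DeviceContext methods) are free-threaded unless the device was created with D3D11_CREATE_DEVICE_SINGLETHREADED, so CreateTexture2D with initial data can legally run on a worker thread. Below is a minimal sketch of the worker-thread approach the post describes; the callback plumbing and names are hypothetical, and note the driver may still serialize large allocations internally, which can explain a residual hitch:

      #include <d3d11.h>
      #include <wrl/client.h>
      #include <functional>
      #include <thread>

      using Microsoft::WRL::ComPtr;

      // `pixels` must stay alive until the callback fires.
      void CreateTextureAsync(
          ID3D11Device* device, const void* pixels, UINT width, UINT height,
          std::function<void(ComPtr<ID3D11Texture2D>)> onReady) {
          std::thread([=] {
              D3D11_TEXTURE2D_DESC desc = {};
              desc.Width            = width;
              desc.Height           = height;
              desc.MipLevels        = 1;
              desc.ArraySize        = 1;
              desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
              desc.SampleDesc.Count = 1;
              // IMMUTABLE + initial data uploads at creation time, so no
              // UpdateSubresource call on the (single-threaded) context.
              desc.Usage            = D3D11_USAGE_IMMUTABLE;
              desc.BindFlags        = D3D11_BIND_SHADER_RESOURCE;

              D3D11_SUBRESOURCE_DATA init = {};
              init.pSysMem     = pixels;
              init.SysMemPitch = width * 4;  // bytes per row of RGBA8

              ComPtr<ID3D11Texture2D> tex;
              if (SUCCEEDED(device->CreateTexture2D(&desc, &init, &tex)))
                  onReady(tex);  // hand back to the render thread, e.g. via a queue
          }).detach();
      }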
    • By hyperknot
      Hi, first post here. I'm making a simple augmented reality game based on the well-known 2D puzzle game Slitherlink (also known as Loopy). This will be the first time I'm using shaders, so I'm on a bit of a steep learning curve here.
      My concept is that AR will look really nice with self-illuminating objects instead of normal materials, whose missing or wrong shadows would be quite striking when composited onto the camera feed.

      So I'd like to render the game as "laser beams" levitating above the table, which technically means displaying tube lights and illuminating the scene with them. This is where I'm stuck.

      I've implemented smooth 2D line segment rendering by creating rectangles perpendicular to the camera and shading them in the fragment shader.

      I also looked into area lights, but all I could come up with was the "nearest point in a rectangle" approach, which:
      - looks nice for diffuse as long as it's a uniform color
      - but is totally wrong for Blinn and Phong shading

      My biggest problem is how to get the tube light illumination effect. Instead of the uniform white area on the screenshot below, I'd like to get colored, grid-like illumination on the ground. The number of tube lights can be up to 200.

      My only idea is to render to a buffer from a top-down orthographic projection, apply a Gaussian blur, and use the result for diffuse lighting on the floor. Does this sound reasonable?

      Also, does anyone know how to get specular reflections right with an area light? Nothing PBR, just Blinn would be nice. 

      The scene is very simple: floor on 0, all lights in the same height and only the floor needs to be lit.



      My shader (Metal, but pretty much 1:1 GLSL):
      fragment float4 floorFragmentShader(FloorVertexOut in [[stage_in]],
                                          constant Uniforms& uniforms [[buffer(2)]],
                                          texture2d<float> tex2D [[texture(0)]],
                                          sampler sampler2D [[sampler(0)]]) {
          float3 n = normalize(in.normalEye);

          float lightIntensity = 0.05;
          float3 lightColor = float3(0.7, 0.7, 1) * lightIntensity;

          // area light using nearest point (clamp to the light rectangle)
          float limitX = clamp(in.posWorld.x, -0.3, 0.3);
          float limitZ = clamp(in.posWorld.z, -0.2, 0.2);
          float3 lightPosWorld = float3(limitX, 0.05, limitZ);
          float3 lightPosEye = (uniforms.viewMatrix * float4(lightPosWorld, 1)).xyz;

          // diffuse (note: the trailing * 0 currently zeroes this term out)
          float3 s = normalize(lightPosEye - in.posEye);
          float diff = max(dot(s, n), 0.0);
          float3 diffuse = diff * lightColor * 0.2 * 0;

          // specular
          float3 v = normalize(-in.posEye);

          // Blinn
          float3 halfwayDir = normalize(v + s);
          float specB = pow(max(dot(halfwayDir, n), 0.0), 64.0);

          // Phong (computed but unused below)
          float3 reflectDir = reflect(-s, n);
          float specR = pow(max(dot(reflectDir, v), 0.0), 8.0);

          float3 specular = specB * lightColor;

          // attenuation
          float distance = length(lightPosEye - in.posEye);
          float attenuation = 1.0 / (distance * distance);
          diffuse *= attenuation;
          specular *= attenuation;

          float3 lighting = diffuse + specular;
          float3 color = tex2D.sample(sampler2D, in.texCoords).xyz;
          color *= lighting + 0.1;
          return float4(float3(color), 1);
      }
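      One hedged suggestion on the specular question: since a tube is closer to a line segment than a rectangle, the rectangle clamp could become a closest-point-on-segment, and a common approximation for linear-light specular is the "representative point": the point on the segment nearest the reflection ray, shaded as a point light. A sketch in the same C++-style vector math as the shader above; the segment endpoints a and b are assumptions, and the division needs a guard when the reflection ray runs parallel to the tube:

      // Closest point on segment [a, b] to point p: a drop-in replacement
      // for the rectangle clamp when the light is a thin tube.
      float3 closestPointOnSegment(float3 a, float3 b, float3 p) {
          float3 ab = b - a;
          float t = clamp(dot(p - a, ab) / dot(ab, ab), 0.0, 1.0);
          return a + t * ab;
      }

      // Representative point for specular: the point on the segment nearest
      // the reflection ray r (normalized) leaving the shaded point p.
      float3 representativePoint(float3 a, float3 b, float3 p, float3 r) {
          float3 L0 = a - p;
          float3 Ld = b - a;
          float rDotLd = dot(r, Ld);
          // Denominator tends to zero when r is parallel to the tube;
          // clamp it or add a small epsilon in practice.
          float t = (dot(r, L0) * rDotLd - dot(L0, Ld))
                  / (dot(Ld, Ld) - rDotLd * rDotLd);
          return a + clamp(t, 0.0, 1.0) * Ld;
      }

      Using representativePoint(...) in eye space as lightPosEye for the Blinn term (and closestPointOnSegment for diffuse) should stretch the highlight along the tube.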
    • By myieye
      I'm near to starting an AR project that needs to support both Android devices and the HoloLens. I'm planning a tabletop game with minimal tracked objects and likely a LeapMotion to recognize hand gestures. What SDK (or combination of SDKs) exist that support both of these devices with (as much as possible) a single codebase? 
      I'm very new to Unity and game development. From my reading, it sounds like the Vuforia + Unity + Windows 10 SDK combo should do it, but it's not clear to me how much platform-specific code would be involved or whether I would need two entirely separate code projects.
      I'd also appreciate any intuition related to:
      - how these SDKs relate to each other (e.g., what does Vuforia add to Unity?)
      - what sort of code is device-specific
      - other recommendations for SDKs that support both platforms
    • By i3di
      Heroes & Legend is an epic fantasy role-playing game that I believe will set new standards in role playing. It features a rich audio score of more than 50 tracks. The game currently has four people on board: developers, designers, programmers, and music composers. I am taking on eight more people who are dedicated. You may view six of our audio scores in our first audio enticement video here:
      Vimeo:
      YouTube:
       
      Gamedev - Project:
      Gamedev - Blog
      Company Website - Under Construction
      http://www.i3dix.com
       
      Summary:
      We are looking for artists familiar with iClone 7, iClone Character Creator 2, Blender, PBR, and the 3D Exchange pipeline. We also need some entry-level to proficient Unreal 4 developers who can help get World Max - I, our premier procedural world generator, onto the Unreal 4 marketplace. This procedural world generator uses a 2D vector database for importing SRTM and Natural Earth data, creating large worlds smaller or larger than Earth with latitude, longitude, bathymetry, and topographical overlays. I am a 25+ year software engineer in charge of the company and would be considered a 10+ year software engineer.
      Also, I am taking on a couple of positions for Java developers to create our company's premier Content Management System, complete with a Web Hosting Module, Project Management Module, and Subversion Module, running on a Linux system with a LAMP + T setup and a Subversion repository; these developers would also manage network administration. Assets are synced via OneDrive, and projects are maintained in private GitHub repositories.
      I will train applicants, so don't be a cissy; try me, and let's see if you have what it takes to be a partner.
      All potential partners must sign a company NDA (Non-Disclosure Agreement) and Business Agreement via DocuSign. Profit share on all products is 0.025% of the 10% quarterly budgeted net for payment. If you think you have what it takes, then I look forward to hearing from you. 