
DX11: The native side of HoloLens

Recommended Posts

Deadly_kom

Good day, forum members!
    I don't think many of you have tried developing for HoloLens, but I decided to write anyway. In any case, all work with HoloLens comes down to UWP and DirectX 11. So I wrote a small prototype that makes it easy to set up primitives and so on in a scene, and started testing on the actual hardware... and the performance results upset me quite a bit.
       
    A short digression for those not familiar with the subject: rendering on HoloLens is naturally stereo, so I use DrawInstanced with pre-prepared shaders. The shaders are very simple: transform to world space and apply the view-projection, with the color set directly in the shader. On top of that, to route the output into the slices of the Texture2DArray render target I use a geometry shader, and that's basically it.
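    Roughly, a draw for one mesh boils down to something like this (a simplified sketch, the names are made up and the real code differs; indexed drawing assumed since the indices are 16-bit):

```cpp
#include <d3d11.h>

// Sketch: one call submits both eyes. The back buffer is a Texture2DArray
// with two slices, and the render target view spans both slices
// (D3D11_RTV_DIMENSION_TEXTURE2DARRAY, ArraySize = 2).
void DrawMeshStereo(ID3D11DeviceContext* context,
                    ID3D11Buffer* vb, ID3D11Buffer* ib,
                    UINT indexCount, UINT stride)
{
    UINT offset = 0;
    context->IASetVertexBuffers(0, 1, &vb, &stride, &offset);
    context->IASetIndexBuffer(ib, DXGI_FORMAT_R16_UINT, 0);
    context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

    // Two instances per mesh: instance 0 is the left eye, instance 1 the
    // right eye. The vertex shader picks the view/projection matrix by
    // SV_InstanceID, and a pass-through geometry shader copies that index
    // into SV_RenderTargetArrayIndex so each triangle lands in the correct
    // slice of the texture array.
    context->DrawIndexedInstanced(indexCount, 2, 0, 0, 0);
}
```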

   For the test I generated meshes of 10k - 30k triangles and drew them about 10 times each... with the 10k mesh and 10 draw calls I get 30 fps, with the largest one about 18-20 fps. Sad, I thought, and decided that my hands must grow from the wrong place and my brain doesn't want to work any more...
   I started looking for flaws in my own code... I measured the time for the 10 draw calls: about 800 ticks, which doesn't seem too bad. I slightly optimized the sorting by material, that is, minimized switching between shaders, but the loop still updates the shader constants (namely the per-object transform) for every object. FPS grew by 2-3, but I didn't measure the frame time again...
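   The per-object update in the loop is basically this (simplified sketch; I'm assuming a dynamic constant buffer updated with Map/WRITE_DISCARD, and the struct layout is just an illustration):

```cpp
#include <d3d11.h>
#include <DirectXMath.h>

// Illustrative per-object constants; must match the cbuffer in the vertex shader.
struct ObjectConstants
{
    DirectX::XMFLOAT4X4 world;
};

// One Map/Unmap per object per frame. The buffer is assumed to be created
// with D3D11_USAGE_DYNAMIC and D3D11_CPU_ACCESS_WRITE. With many objects
// this per-draw CPU work adds up; packing all transforms into one larger
// buffer and indexing it in the shader is the usual way to reduce it.
void UpdatePerObjectCB(ID3D11DeviceContext* context,
                       ID3D11Buffer* cb,
                       const DirectX::XMFLOAT4X4& world)
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    if (SUCCEEDED(context->Map(cb, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
    {
        static_cast<ObjectConstants*>(mapped.pData)->world = world;
        context->Unmap(cb, 0);
    }
    context->VSSetConstantBuffers(0, 1, &cb);
}
```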
   Meanwhile Unity reportedly holds around 1.2 - 1.3 million triangles at 10-15 fps on the same device... I haven't verified that myself, I'm taking people's word for it.

    So, maybe someone can tell what might be wrong here and what could be patched up...

 

P.S. The depth buffer is 16-bit, and the indices are 16-bit as well. Thank you in advance.
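In D3D11 terms that means DXGI_FORMAT_D16_UNORM for the depth buffer and DXGI_FORMAT_R16_UINT for the index buffer; creating the depth buffer looks roughly like this (a sketch, and the two-slice array for stereo is my assumption):

```cpp
#include <d3d11.h>

// Sketch of a 16-bit depth buffer; width/height come from the swap chain.
HRESULT CreateDepthBuffer16(ID3D11Device* device, UINT width, UINT height,
                            ID3D11Texture2D** outTex)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 2;                    // one slice per eye, matching the render target array
    desc.Format = DXGI_FORMAT_D16_UNORM;   // 16-bit depth, no stencil
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_DEPTH_STENCIL;
    return device->CreateTexture2D(&desc, nullptr, outTex);
}
```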

Edited by Deadly_kom

Deadly_kom
12 minutes ago, paultrott said:

Would love to read this novel prose, but it appears to be sunk into some kind of iframe with annoying scrollbars.

Is it better now?

Here's a video with 60k triangles

unbird

He meant that you shouldn't use the 'Formatted' font (it's meant for code anyway), e.g. like so:

Quote

[the original post, re-quoted without the 'Formatted' font]

fixed.




