OpaqueEncounter

Members
  • Content count

    19

Community Reputation

155 Neutral

About OpaqueEncounter

  • Rank
    Member
  1. So that's essentially an animated sprite sheet on a quad? That seems plausible, but also potentially expensive to render; some of those animations are pretty detailed and long. In any case, I am going to give this a shot.
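    A minimal sketch of the frame-selection part, assuming an XNA/MonoGame-style SpriteBatch and a regular grid sheet; `sheetTexture`, `frameWidth`, `frameHeight`, `framesPerRow`, `frameCount`, `framesPerSecond`, and `position` are placeholder names rather than anything from this thread:

    [source lang="csharp"]
    // Pick the current frame of an animated sprite sheet by source rectangle.
    // All of the identifiers below are placeholders for the actual sheet layout.
    int frame = (int)(gameTime.TotalGameTime.TotalSeconds * framesPerSecond) % frameCount;

    Rectangle source = new Rectangle(
        (frame % framesPerRow) * frameWidth,   // column within the sheet
        (frame / framesPerRow) * frameHeight,  // row within the sheet
        frameWidth,
        frameHeight);

    // Drawn through SpriteBatch here; the same rectangle can just as well be
    // turned into UV offsets for a textured quad.
    spriteBatch.Draw(sheetTexture, position, source, Color.White);
    [/source]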
  2. The question is simple: how (with what technique or combination of techniques) are the elongated beams of light (pointed out by red arrows) in the screenshot from DOTA 2 achieved? The fire light (green arrow) is obvious, that's just a bunch of particles, but what about those dynamic beams of light that twist and turn?
  3. SpriteBatch billboards in a 3D world slow on mobile device

      EDIT: Well, I actually did try running GenerateMipMaps every frame and the framerate did go up, so that's that. :)
  4. SpriteBatch billboards in a 3D world slow on mobile device

      I actually generate the texture as I described above (render models into a render target). I'll play around with a lower-quality pixel format, but I guess if there are no other suggestions then I'm stuck with it.

      The only thing I don't understand is why reducing the render target size helps if this is a fillrate issue. Or is fillrate a bit broader than I assume it to be? (Does sampling a larger texture contribute as well?)
  5. SpriteBatch billboards in a 3D world slow on mobile device

      It's not vertex processing, for sure, since the aforementioned method does all of that on the CPU; that part I managed to measure to make sure it's not a bottleneck. And yes, reducing what is being drawn on screen increases the framerate.

      The shader used in that method is BasicEffect, in which I disabled absolutely everything (even vertex color). I am already running at the lowest feasible resolution. To render fewer pixels, I also tried replacing BasicEffect with AlphaTestEffect.

      It seems that if this is a fillrate issue, the only thing really left is to skip drawing some of those billboards. Luckily, quite a few of them are blocked most of the time. I am not really sure where to start if this is the solution: frustum culling is not really the answer here, and occlusion queries are unavailable on GPUs like the Adreno 225 and below, which I plan on targeting. Any suggestions?
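    For reference, a rough sketch of the AlphaTestEffect variant mentioned above, assuming the XNA 4.0 / MonoGame APIs; `world`, `view`, and `projection` are placeholders. The point of the combination is that opaque blending plus depth writes lets the depth test reject pixels hidden behind billboards drawn earlier, instead of blending every layer:

    [source lang="csharp"]
    // Sketch only: alpha-tested, depth-tested billboards instead of alpha blending.
    AlphaTestEffect alphaTest = new AlphaTestEffect(GraphicsDevice)
    {
        AlphaFunction = CompareFunction.Greater,
        ReferenceAlpha = 128,        // discard mostly transparent pixels
        VertexColorEnabled = true,   // SpriteBatch always supplies vertex colors
        World = world,
        View = view,
        Projection = projection
    };

    // Drawing roughly front-to-back maximises how many hidden pixels the depth
    // test can reject early.
    spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque,
                      SamplerState.PointClamp, DepthStencilState.Default,
                      RasterizerState.CullNone, alphaTest);
    // ... one spriteBatch.Draw call per billboard ...
    spriteBatch.End();
    [/source]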
  6. I used this method http://blogs.msdn.com/b/shawnhar/archive/2011/01/12/spritebatch-billboards-in-a-3d-world.aspx to create a 3D billboard renderer using SpriteBatch. It works perfectly as described, and on a modest desktop (with Intel HD graphics) it can render tens of thousands of billboards or particles easily.

      On a mobile device (Windows Phone) the framerate drops sharply past a certain, not so large, point. My test (on all devices) is this:

      - Render a primitive (sphere, cube, etc.) into a render target.
      - Pass the render target to the method above.
      - Keep increasing the number of billboards until the framerate drops.

      On an x86 desktop or an ARM tablet (Surface) the framerate holds into the thousands. On the phone, it instantly drops from 60 to 30 (disabling VSync appears to have no effect on that device) as soon as you pass a certain point (~200?). The funny thing is that I can get the framerate back up to 60 by making the billboards half the size. The same goes for making the render target half the size.

      Using a stopwatch, I determined that the time spent on the CPU is nowhere near the 16.67 ms threshold. VS2013's frame analysis is unavailable on Windows Phone, so that's useless.

      Can anyone explain what is going on here? Is this simply the limitation of a low-power GPU (the Adreno 225 in this case)? If so, what exactly is bogging it down? The fill rate? The blending? (I tried all blend states from Opaque to NonPremultiplied, with no effect on performance.)
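    For anyone landing here later, a condensed sketch of the kind of setup the linked article describes (not a copy of its code), assuming XNA/MonoGame 4.0; `billboardTexture`, `billboardPosition`, `cameraPosition`, `view`, and `projection` are placeholders:

    [source lang="csharp"]
    // Sketch: draw a SpriteBatch sprite as a camera-facing billboard in 3D.
    BasicEffect effect = new BasicEffect(GraphicsDevice)
    {
        TextureEnabled = true,
        VertexColorEnabled = true   // SpriteBatch always supplies vertex colors
    };

    // SpriteBatch emits vertices in pixel units with +Y pointing down, so scale
    // them into world units and flip Y before applying the billboard rotation.
    const float pixelsToWorldUnits = 0.01f;

    Matrix billboard = Matrix.CreateConstrainedBillboard(
        billboardPosition, cameraPosition, Vector3.Up, null, null);

    effect.World = Matrix.CreateScale(pixelsToWorldUnits, -pixelsToWorldUnits, pixelsToWorldUnits)
                 * billboard;
    effect.View = view;
    effect.Projection = projection;

    spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
                      SamplerState.LinearClamp, DepthStencilState.DepthRead,
                      RasterizerState.CullNone, effect);

    // Centre the sprite on the billboard position.
    Vector2 origin = new Vector2(billboardTexture.Width, billboardTexture.Height) * 0.5f;
    spriteBatch.Draw(billboardTexture, Vector2.Zero, null, Color.White,
                     0f, origin, 1f, SpriteEffects.None, 0f);

    spriteBatch.End();
    [/source]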
  7. HLSL Shader Library

      I guess I should have clarified that I was talking specifically about visual techniques and post processing. The Nvidia SDK samples appear to be what I was looking for.
  8. HLSL Shader Library

    Funny thing is that right after posting this, I stumbled upon the NVidia SDK samples. DX11 even has an online listing.
  9. HLSL Shader Library

    Is there any other HLSL shader library aside from NVIDIA's (http://developer.download.nvidia.com/shaderlibrary/webpages/hlsl_shaders.html), which hasn't been updated in years and seems to have been completely abandoned? Isn't there anything out there that keeps up with the latest HLSL shaders?
  10. MSAA in WinRT

    I am not 100% sure, but I did get a sense of that. One of MonoGame's issues discusses it: https://github.com/mono/MonoGame/issues/358. The question I have is how to actually enable MSAA (even if it is just for x86). Setting the number of samples in either the DirectX WinRT API or SharpDX doesn't seem to do anything at all (in SharpDX it throws an exception by design).
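    In case it helps anyone else: since the flip-model swap chain itself can't be multisampled, the usual route is to render into a separate MSAA render target and resolve it into the swap chain's back buffer each frame. The SharpDX Direct3D 11 sketch below shows only the idea and is untested on the phone hardware discussed above; `device`, `context`, `backBuffer`, `width`, and `height` are placeholders:

    [source lang="csharp"]
    // Sketch: render into a 4x MSAA target, then resolve into the non-MSAA back buffer.
    int quality = device.CheckMultisampleQualityLevels(Format.B8G8R8A8_UNorm, 4);
    // quality == 0 means 4x MSAA is not supported for this format on this device.

    var msaaTexture = new Texture2D(device, new Texture2DDescription
    {
        Width = width,
        Height = height,
        MipLevels = 1,
        ArraySize = 1,
        Format = Format.B8G8R8A8_UNorm,
        SampleDescription = new SampleDescription(4, 0),
        Usage = ResourceUsage.Default,
        BindFlags = BindFlags.RenderTarget
    });
    var msaaRtv = new RenderTargetView(device, msaaTexture);

    // Each frame: bind msaaRtv (plus a matching MSAA depth buffer) and draw the scene,
    // then copy the resolved result into the swap chain's back buffer.
    context.ResolveSubresource(msaaTexture, 0, backBuffer, 0, Format.B8G8R8A8_UNorm);
    [/source]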
  11. MSAA in WinRT

    There's no MSAA in WinRT. Has anyone found a workaround? I tried rudimentary supersampling (draw to a larger buffer, then draw a scaled-down quad), but this is unrealistic for production. What are the options?
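    For completeness, here is the rudimentary supersampling mentioned above, sketched in XNA/MonoGame terms with `backBufferWidth` and `backBufferHeight` as placeholders; as noted, the extra fillrate makes it a hard sell on mobile GPUs:

    [source lang="csharp"]
    // Sketch: render the scene at 2x resolution, then downscale onto the back buffer.
    const int scale = 2;
    RenderTarget2D superSampled = new RenderTarget2D(
        GraphicsDevice,
        backBufferWidth * scale,
        backBufferHeight * scale,
        false,                       // no mipmaps
        SurfaceFormat.Color,
        DepthFormat.Depth24);

    // Each frame:
    GraphicsDevice.SetRenderTarget(superSampled);
    // ... draw the scene as usual ...
    GraphicsDevice.SetRenderTarget(null);

    // Linear filtering while scaling down is what provides the (rough) anti-aliasing.
    spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque, SamplerState.LinearClamp,
                      DepthStencilState.None, RasterizerState.CullCounterClockwise);
    spriteBatch.Draw(superSampled,
                     new Rectangle(0, 0, backBufferWidth, backBufferHeight),
                     Color.White);
    spriteBatch.End();
    [/source]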
  12. [quote name='RulerOfNothing' timestamp='1344729723' post='4968568']
      I believe that it is because when the elevation reaches 90/-90 degrees the camera is pointing in the same direction as the up vector, so presumably the view transformation doesn't work (at 90 degrees, the up vector you use has a projection onto the viewport of zero length). What you could do is set the up vector to always be perpendicular to the position vector.
      [/quote]
      I transformed the position vector by +90 degrees about the X-axis and that seems to have fixed the problem. However, the camera view is now not what I expect it to be. The camera elevation works fine, but the orientation behaviour is completely erratic.
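    A sketch of the perpendicular-up-vector suggestion, using the same SharpDX matrices as the code in the original post: rotate the up direction with the same elevation/orientation used for the eye position, so it always stays perpendicular to the view direction instead of modifying the position itself.

    [source lang="csharp"]
    // Sketch: derive the up vector from the same rotation used for the camera position,
    // so LookAtLH never receives an up vector parallel to the view direction.
    Matrix rotation = Matrix.RotationX(MathUtil.DegreesToRadians(CameraElevation))
                    * Matrix.RotationY(MathUtil.DegreesToRadians(CameraOrientation));

    Vector3 up = Vector3.TransformCoordinate(Vector3.UnitY, rotation);

    View = Matrix.LookAtLH(Position, Vector3.Zero, up);
    [/source]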
  13. I have an arc camera that behaves like this:

      [source lang="csharp"]
      Matrix translation = Matrix.Translation(-Vector3.UnitZ * cameraDistance);
      Matrix elevation = Matrix.RotationX(MathUtil.DegreesToRadians(CameraElevation));
      Matrix orientation = Matrix.RotationY(MathUtil.DegreesToRadians(CameraOrientation));

      Matrix translation3D = translation * elevation * orientation;
      Position = translation3D.TranslationVector;

      View = Matrix.LookAtLH(Position, Vector3.Zero, Vector3.UnitY);
      [/source]

      It works like a charm, minus one issue. If I set the maximum/minimum elevation to 90/-90 degrees respectively, then once the camera elevation reaches that limit, whatever I am viewing disappears. The "hackish" solution is to set the bounds plus/minus some very small value (0.0001f). However, I would still like to know why this is happening, and whether there is a solution that doesn't involve adding/subtracting magic numbers.
  14. Calculating Normals, Binormals, Tangents

    scyfris, I got that code from somebody's sample code. I rewrote my normal generation code to generate vertex normals (if that's the correct term) instead of surface normals, and I will rewrite it to generate tangents and binormals with what you gave me above. Thanks.
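    For anyone following along, a minimal sketch of the surface-to-vertex normal averaging described above (not the actual code from this thread); `positions` and `indices` stand in for whatever the mesh data actually looks like:

    [source lang="csharp"]
    // Sketch: averaged vertex normals from triangle (surface) normals.
    Vector3[] normals = new Vector3[positions.Length];

    for (int i = 0; i < indices.Length; i += 3)
    {
        int i0 = indices[i], i1 = indices[i + 1], i2 = indices[i + 2];

        // Unnormalised cross product: larger triangles contribute more to the average.
        Vector3 faceNormal = Vector3.Cross(positions[i1] - positions[i0],
                                           positions[i2] - positions[i0]);

        normals[i0] += faceNormal;
        normals[i1] += faceNormal;
        normals[i2] += faceNormal;
    }

    for (int i = 0; i < normals.Length; i++)
        normals[i] = Vector3.Normalize(normals[i]);
    [/source]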