thmfrnk

Members
  • Content count

    42
  • Joined

  • Last visited

Community Reputation

167 Neutral

About thmfrnk

  • Rank
    Member

Personal Information

  • Interests
    Programming
  1. Yea, in the HLSL I posted I forgot the : register(u1).. so I was aware of the indices. In my case the shading of each particle is very complex (lots of lighting), so I was hoping this could be an easy way of reducing the calculations for already covered particles. Finally, he also did something similar, I think:
  2. 1. Yes, of course the Debug Layer is on.. no warnings or errors. 2. I am using SlimDX, so I don't call that method natively, but it looks like this:

         Dim UAVs() As UnorderedAccessView = {Overdraw.UAV}
         Dim RTVs() As RenderTargetView = {Path.HDR_Buffer.RTV}
         C.OutputMerger.SetTargets(View.DepthStencilView, 1, UAVs, RTVs)

     About my idea: I thought about having a "coverage texture" parallel to the RenderTarget, to check how much alpha has already been drawn to the current fragment and discard any further draws once a specific value is reached. To simply test whether a UAV can be written and read in the PS, I tried something like this:

         RWTexture2D<uint> Overdraw; // only R32_UINT is supported..

         float4 main(PixelShaderInput input, float4 coord : SV_POSITION) : SV_TARGET
         {
             ...
             uint2 uv = (uint2) coord.xy;
             if (Overdraw[uv] > 0) discard;
             ...
             Overdraw[uv] = 3;
             ...
         }

     But nothing gets discarded. About "unordered".. yes, you are right, but I thought the execution order of all fragments in the PS is also "unordered".
  3. Yes, sure, you can't use an SRV of the same resource where you have bound the RTV.. but in my case I have a second texture with a UAV which is bound in parallel to the main RTV. Using a UAV it should be possible to read/write a texture within the PS. I am binding the UAV together with the current RTV and DSV using ID3D11DeviceContext::OMSetRenderTargetsAndUnorderedAccessViews, and I don't get any error message..
  4. Hey, I just had the idea to use a RWTexture2D within the pixel shader of my particles to try to reduce overdraw/fillrate. I have created an R32_Float texture with a UAV and bound it together with the RenderTarget. In my pixel shader I just add a constant value to the pixel of the current fragment, while checking for a maximum at the beginning. However, it does not work. It seems that the texture is not getting written. What am I doing wrong? Or is it not possible to read and write at the same time in the pixel shader? Thx, Thomas
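To make the intent of this thread concrete, here is a minimal CPU-side sketch (in Python, with made-up names) of the coverage-discard idea: a per-pixel counter stands in for the R32_UINT UAV, and fragments are skipped once the counter reaches a threshold. This only illustrates the logic; on the GPU side the UAV additionally has to be bound at a slot after the render targets (hence the : register(u1) mentioned in the follow-up above).

```python
# CPU-side sketch of the coverage-discard idea. A per-pixel counter
# ("Overdraw") tracks how many fragments have already been shaded at each
# pixel; once a threshold is reached, further fragments are discarded.
# All names and the threshold value are illustrative.

MAX_LAYERS = 3  # assumed coverage threshold

def shade_particles(width, height, fragments):
    """fragments: list of (x, y) pixel coordinates, in submission order.
    Returns (number of fragments actually shaded, the coverage grid)."""
    overdraw = [[0] * width for _ in range(height)]
    shaded = 0
    for x, y in fragments:
        if overdraw[y][x] >= MAX_LAYERS:
            continue  # the pixel shader would call discard here
        overdraw[y][x] += 1
        shaded += 1  # the expensive lighting would run here
    return shaded, overdraw
```

With five overlapping fragments on one pixel, only the first three are shaded; the rest are skipped, which is exactly the fillrate saving the thread is after.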
  5. DX11 Tiled Shading - Cone Culling

    Hey, not sure what to call it, but I think it's called an "infinite cone"? Yes, there should still be cases where this will fail, like when looking mostly in the same direction as the cone, so you wouldn't get the roundings on the end, but in my case this wouldn't be a problem. BTW: I like your blog a lot! Nice work!
  6. DX11 Tiled Shading - Cone Culling

    Yes, you are right, a sphere test does not make sense in that case. Are you sure about the 2D approach? After all, any cone I would draw in 3D will appear as a triangle on screen (apart from the cam-in-cone situation).. ? So if I could calculate that triangle, it should only be a tri<>rect or tri<>circle test.
  7. DX11 Tiled Shading - Cone Culling

    Since I am using tile-based shading I don't have any tile slices, because it's only 2D. Sure, in the case of clustered shading it would make sense to place a sphere in a cell, because the cells are placed in depth too. In my case I only have 2D grid cells where I need to check whether a cone will be visible on that tile. I actually tried a complex ray/cone test where I shot rays from the corners + center of a cell. That works, but because I only check 5 points, there are still cases where the test fails.. and it's also not very fast. I think the solution is easier than it looks: since my tiles are only 2D, the cone should also be converted into a 2D triangle. I only need to find a way to test a 2D rectangle/triangle for intersection, plus handle the case when the camera is within the cone..
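The 2D triangle-vs-rectangle test described in this post can be done with a separating-axis test (SAT) over the rectangle's two axes plus the three triangle edge normals. A small Python sketch, with coordinates assumed to be in screen/tile space and names purely illustrative:

```python
def tri_overlaps_rect(tri, rect_min, rect_max):
    """Separating Axis Test between a 2D triangle and an axis-aligned rect.
    tri: three (x, y) tuples; rect_min/rect_max: (x, y) corners."""
    rect = [
        (rect_min[0], rect_min[1]), (rect_max[0], rect_min[1]),
        (rect_max[0], rect_max[1]), (rect_min[0], rect_max[1]),
    ]
    # Candidate axes: the AABB's x/y axes plus each triangle edge normal.
    axes = [(1.0, 0.0), (0.0, 1.0)]
    for i in range(3):
        ex = tri[(i + 1) % 3][0] - tri[i][0]
        ey = tri[(i + 1) % 3][1] - tri[i][1]
        axes.append((-ey, ex))  # perpendicular to the edge
    # If the projections onto any axis are disjoint, the shapes don't overlap.
    for ax, ay in axes:
        t = [p[0] * ax + p[1] * ay for p in tri]
        r = [p[0] * ax + p[1] * ay for p in rect]
        if max(t) < min(r) or max(r) < min(t):
            return False  # found a separating axis
    return True
```

The cam-in-cone case still needs separate handling (e.g. accept all tiles when the camera origin lies inside the cone), as the post notes.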
  8. Hey, I found a very interesting blog post here: https://bartwronski.com/2017/04/13/cull-that-cone/ However, I didn't really get how to use his "TestConeVsSphere" test in 3D (last piece of code in his post). I have the frustum corners of a 2D tile cell in view space and my 3D cone origin and direction, so where do I place the "testSphere"? I thought about also moving the cone into view space and putting the sphere at the center of the cell with a radius of half the cell size, but what about depth? A sphere does not have infinite depth.. Am I missing anything? Any ideas? Thx, Thomas
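For reference, here is a Python transcription of a cone-vs-sphere test in the spirit of the one in the linked post (my own variable names; treat it as a sketch, not a verified copy of the blog's code). The idea is to compute the signed distance from the sphere center to the cone surface and compare it against the sphere radius, plus front/back range checks along the cone axis:

```python
import math

def cone_vs_sphere(origin, forward, size, half_angle, sphere_c, sphere_r):
    """Approximate cone-vs-sphere intersection test (after the approach in
    Bart Wronski's "Cull that cone!" post). origin/forward: cone apex and
    unit axis; size: cone range; half_angle: cone half-angle in radians;
    sphere_c/sphere_r: center and radius of the test sphere."""
    v = [sphere_c[i] - origin[i] for i in range(3)]
    v_len_sq = sum(c * c for c in v)
    v1_len = sum(v[i] * forward[i] for i in range(3))
    # Signed distance from the sphere center to the cone surface.
    dist_closest = (math.cos(half_angle) * math.sqrt(max(v_len_sq - v1_len * v1_len, 0.0))
                    - v1_len * math.sin(half_angle))
    angle_cull = dist_closest > sphere_r          # outside the cone's angle
    front_cull = v1_len > sphere_r + size         # beyond the cone's range
    back_cull = v1_len < -sphere_r                # behind the apex
    return not (angle_cull or front_cull or back_cull)
```

For the 2D-tile question, one option is to build the test sphere per tile from the tile's min/max depth bounds; dropping the front_cull check instead gives the "infinite cone" variant mentioned in the replies above, at the cost of occasional false positives.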
  9. DX11 Temporal Antialiasing

    Yes, I found it here: advances.realtimerendering.com/s2016/ > Temporal Antialiasing in Uncharted 4. Very cool! The only thing I couldn't find was how big these offsets should be in the projection matrix; in the slides they showed replacing Proj[2,0] and [2,1] with offsets. EDIT: Just found the answer here: https://bartwronski.com/2014/03/15/temporal-supersampling-and-antialiasing/ @J thanks again for your help. I'll run tests over the next days and see how it works.
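For what it's worth, a common way to generate those offsets (assuming D3D-style matrices, where Proj[2][0] and Proj[2][1] shift clip-space x/y proportionally to z, i.e. by a constant amount in NDC after the divide by w) is a low-discrepancy sequence such as Halton, scaled so the jitter stays within half a pixel. A Python sketch; the 8-frame cycle length is an arbitrary choice:

```python
def halton(index, base):
    """Value of the Halton low-discrepancy sequence for a 1-based index."""
    f, r = 1.0, 0.0
    while index > 0:
        f /= base
        r += f * (index % base)
        index //= base
    return r

def jitter_offsets(frame, width, height):
    """Sub-pixel jitter for frame N, as values for Proj[2][0] / Proj[2][1].
    One full pixel spans 2/width (or 2/height) in clip space, so these
    offsets stay within +/- half a pixel."""
    jx = halton(frame % 8 + 1, 2) - 0.5   # in pixels, roughly [-0.5, 0.5)
    jy = halton(frame % 8 + 1, 3) - 0.5
    return (2.0 * jx / width, 2.0 * jy / height)
```

The same jitter must be subtracted again (as a velocity/offset) when resolving, so that the accumulated history lines up with the current frame.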
  10. DX11 Temporal Antialiasing

    If it's really only the projection matrix, that would be awesome! Looking forward to your reply.
  11. Hello, I am working on a deferred shading engine, which currently uses MSAA for antialiasing. Apart from the big G-Buffer resources, it's working fine. But the intention of my engine is not only realtime rendering, but also rendering screenshots and videos. In that case I have enough time to do whatever it takes to get the best results. Even with 8x MSAA, some scenes still flicker.. especially vegetation. Unfortunately 8x seems to be the maximum on DX11 hardware, so there is no way to get better results, even when realtime isn't required. So I am looking for a solution which might offer an unlimited sample count. The first thing I thought about was finding a way to manually manipulate the MSAA sample locations, in order to render multiple frames with different patterns and combine them. I found out that NVIDIA did something similar with TXAA. However, I only found a way to change sample locations via NVAPI: https://mynameismjp.wordpress.com/2015/09/13/programmable-sample-points/ Since I am working with .NET and SlimDX, I have no idea how hard it would be to integrate the NVIDIA API and whether it can be used together with SlimDX. This approach would also be limited to NV hardware. Does anyone have an idea, or maybe a better approach I could use? Thanks, Thomas
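Since render time is not a constraint for the offline case described above, one vendor-independent alternative to programmable MSAA sample points is plain accumulation supersampling: render the frame N times with different sub-pixel jitters (via the projection matrix) and average the results, which gives an effectively unlimited sample count. A Python sketch of just the accumulation step; render_frame is a hypothetical stand-in for the actual renderer:

```python
# Accumulation supersampling sketch: average many renders of the same
# frame, each taken with a different sub-pixel jitter. The sample count is
# bounded only by render time, not by MSAA hardware limits.

def accumulate(render_frame, jitters):
    """render_frame(jitter) -> flat list of pixel values (one render);
    returns the per-pixel mean over all jittered renders."""
    acc = None
    for j in jitters:
        img = render_frame(j)
        if acc is None:
            acc = [0.0] * len(img)
        for i, v in enumerate(img):
            acc[i] += v
    n = len(jitters)
    return [v / n for v in acc]
```

In a real renderer the accumulation would run in higher precision (e.g. an R32G32B32A32_FLOAT target) and before tone mapping, so the average is taken in linear HDR space.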
  12. HTC Vive

    Hi, I'm wondering how to use one of the currently available VR tools like the HTC Vive in my own applications. Unfortunately the websites and explanations are not really clear to me. I always find stuff for Unity, Unreal and Steam, but is there no simple SDK to use in your own applications? I've done several business realtime 3D applications where I would like to integrate VR support. Is there someone who could help, or who already has experience with those tools? Thx, Thomas
  13. ' Get picking ray from camera
      Public Function GetPickingRay(sp As Vector2) As Ray
          Dim v1 As Vector3
          Dim v2 As Vector3
          Dim vp As Matrix = GetViewProjMatrixFromCamera()
          Dim Point As Vector3 = New Vector3(sp.X, sp.Y, 0)
          v1 = Vector3.Unproject(Point, Viewport.X, Viewport.Y, Viewport.Width, Viewport.Height, Viewport.MinZ, Viewport.MaxZ, vp)
          Point.Z = 1
          v2 = Vector3.Unproject(Point, Viewport.X, Viewport.Y, Viewport.Width, Viewport.Height, Viewport.MinZ, Viewport.MaxZ, vp)
          Dim Loc As Vector3 = v1
          Dim Dir As Vector3 = Vector3.Normalize(v2 - v1)
          Return New Ray(Loc, Dir)
      End Function

      ...

      ' Get ray at mouse position, transformed into object space
      Private Function GetTransformedRay(MousePos As Vector2, worldMat As Matrix, Editor As TEditor) As Ray
          Dim cRay As Ray = Editor.MyView.Camera.GetPickingRay(MousePos)
          Dim InvMat As Matrix = Matrix.Invert(worldMat)
          Return New Ray(Vector3.TransformCoordinate(cRay.Position, InvMat), Vector3.TransformNormal(cRay.Direction, InvMat))
      End Function

      Here is some code showing how I do ray picking on an object. Using the GetTransformedRay function you get the ray in object space for the object you want to test.
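For readers without SlimDX, the same unproject math can be written out by hand. This Python sketch assumes a D3D-style viewport (y grows downward in screen space) and a 4x4 inverse view-projection matrix stored as rows; it mirrors what Vector3.Unproject does (as I understand it), without the object-space transform step:

```python
def mat_mul_vec(m, v):
    """4x4 matrix (row of rows) times a column vector (x, y, z, w)."""
    return tuple(sum(m[r][c] * v[c] for c in range(4)) for r in range(4))

def unproject(px, py, pz, vp_x, vp_y, vp_w, vp_h, min_z, max_z, inv_view_proj):
    """Screen-space point (pz in [0, 1] depth) -> world-space point."""
    nx = (px - vp_x) / vp_w * 2.0 - 1.0        # to NDC x
    ny = 1.0 - (py - vp_y) / vp_h * 2.0        # to NDC y (screen y is down)
    nz = (pz - min_z) / (max_z - min_z)        # to NDC depth
    x, y, z, w = mat_mul_vec(inv_view_proj, (nx, ny, nz, 1.0))
    return (x / w, y / w, z / w)               # perspective divide

def picking_ray(mouse_x, mouse_y, viewport, inv_view_proj):
    """viewport: (x, y, width, height, min_z, max_z). Returns (origin, dir)."""
    near = unproject(mouse_x, mouse_y, 0.0, *viewport, inv_view_proj)
    far = unproject(mouse_x, mouse_y, 1.0, *viewport, inv_view_proj)
    d = [far[i] - near[i] for i in range(3)]
    length = sum(c * c for c in d) ** 0.5
    return near, [c / length for c in d]
```

As in the VB.NET version, the ray runs from the near-plane unprojection toward the far-plane unprojection of the same screen point.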
  14. Hey, I am looking for a way to store my texture pool more efficiently. Currently I simply use the Texture2D.ToFile method from SlimDX and save each texture as a DDS. Unfortunately it does not offer an option to save a compressed DDS file, so it ends up in very big files. So what would be the fastest way to store them as compressed DDS files? I am using DX11. THX, Thomas
  15. DX12 .Net DX12

    The biggest pro for SlimDX, in my opinion, is that it works fine with VB.NET. SharpDX does not support VB.NET because they messed up some case-sensitive exports. Another big point is all the projects which are already done in SlimDX. So @promit, pleeeeaasse continue your great work! :rolleyes: