
Jason Z

Member Since 04 Feb 2004
Offline Last Active May 14 2016 05:17 AM

#5289403 Use Buffer or Texture, PS or CS for GPU Image Processing?

Posted by Jason Z on 30 April 2016 - 05:20 AM

I haven't done a direct comparison myself, but you have already stated that it depends on the filter size.  You also mentioned that the PS has access to some texture filtering instructions that aren't available to the CS - but will you actually make use of filtering operations?  It sounds like you already know quite a bit about the differences between the two shader types, so you just need to apply that to your specific needs and see which one fits best.


By the way, there is a separable bilateral filter implementation available in my Hieroglyph 3 engine in case you want to start out there.  I would be interested to hear what choice you make on this topic!
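The key property of a separable filter is that it runs as two cheap 1D passes (horizontal, then vertical) instead of one expensive 2D pass.  As a rough CPU-side illustration of that structure - this is not the Hieroglyph 3 code, and it uses a plain box kernel rather than a bilateral one - a sketch might look like:

```cpp
#include <cstddef>
#include <vector>

// One 1D pass of a box filter over a w*h single-channel image.
// The same two-pass structure applies to a separable bilateral filter;
// only the per-sample weighting would differ.
std::vector<float> blur1D(const std::vector<float>& src, std::size_t w,
                          std::size_t h, int radius, bool horizontal)
{
    std::vector<float> dst(src.size());
    for (std::size_t y = 0; y < h; ++y) {
        for (std::size_t x = 0; x < w; ++x) {
            float sum = 0.0f;
            int count = 0;
            for (int k = -radius; k <= radius; ++k) {
                long sx = horizontal ? static_cast<long>(x) + k : static_cast<long>(x);
                long sy = horizontal ? static_cast<long>(y) : static_cast<long>(y) + k;
                if (sx < 0 || sy < 0 || sx >= static_cast<long>(w) ||
                    sy >= static_cast<long>(h))
                    continue;  // skip samples outside the image
                sum += src[static_cast<std::size_t>(sy) * w +
                           static_cast<std::size_t>(sx)];
                ++count;
            }
            dst[y * w + x] = sum / static_cast<float>(count);
        }
    }
    return dst;
}

// Full separable filter: horizontal pass feeding a vertical pass.
std::vector<float> separableBlur(const std::vector<float>& src, std::size_t w,
                                 std::size_t h, int radius)
{
    return blur1D(blur1D(src, w, h, radius, true), w, h, radius, false);
}
```

On the GPU each 1D pass would be one dispatch/draw, which is where the PS-vs-CS question above comes in: a CS can cache a row or column of samples in group-shared memory, while a PS relies on the texture cache and filtering hardware.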

#5286403 DirectX 11 Volume Rendering Advice Needed

Posted by Jason Z on 11 April 2016 - 07:14 PM

For #1, you are clipping the geometry that goes behind the near clipping plane of the view frustum.  If you want to keep that from happening, you can modify your vertex shader to transform your vertices such that their Z component is set to 0 if it has a negative value after the transformation has been applied.  This effectively pushes those vertices onto the near plane, and should keep your cube from being cut.
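A real implementation would do this at the end of the vertex shader (in HLSL); just to illustrate the math, here is the clamp in plain C++ on a hypothetical clip-space position:

```cpp
// Sketch of the near-plane workaround described above: after the
// world-view-projection transform, a clip-space Z below 0 would be
// clipped by the near plane, so we clamp it to 0 instead.  The Float4
// type and function name are illustrative, not from any real API.
struct Float4 {
    float x, y, z, w;
};

Float4 clampToNearPlane(Float4 clipPos)
{
    if (clipPos.z < 0.0f)
        clipPos.z = 0.0f;  // place the vertex exactly on the near plane
    return clipPos;
}
```

Note that this distorts the geometry near the camera, but for a bounding volume used to drive volume rendering that is usually acceptable.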


I don't really understand the issue in #2 - can you clarify that a bit more? 

#5284355 Hololens Development Tools

Posted by Jason Z on 30 March 2016 - 04:26 PM

In case you didn't catch the live stream, the Hololens development tools (including the emulator) are now available: https://www.microsoft.com/microsoft-hololens/en-us/developers


I'm in the process of installing everything, but it would be interesting to hear any impressions you guys have on the tools!

#5277282 How to enable supersampling in DirectX 11?

Posted by Jason Z on 21 February 2016 - 07:29 AM

@Steven: You have to have a multi-sampled resource to render into.  Once you have that, the actual rendering may or may not take advantage of the system value semantic that you mentioned (it's up to you to decide if you need to access subsamples, or if you just let the rasterizer take care of that for you).  After all rendering has been done, you then have to resolve your multisampled resource to a normal one for presentation to a window.
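In D3D11 the resolve step itself is a single call (ID3D11DeviceContext::ResolveSubresource), which averages each pixel's subsamples into the single-sampled target.  As a conceptual sketch of what that operation does - plain C++ over a flat buffer, not the actual runtime code - it amounts to:

```cpp
#include <cstddef>
#include <vector>

// Conceptual MSAA resolve: each output pixel is the average of its N
// subsamples.  The layout (samples stored contiguously per pixel) is an
// assumption for illustration; the real resolve is done by the runtime
// or hardware via ResolveSubresource.
std::vector<float> resolveMSAA(const std::vector<float>& samples,
                               std::size_t pixelCount,
                               std::size_t samplesPerPixel)
{
    std::vector<float> resolved(pixelCount, 0.0f);
    for (std::size_t p = 0; p < pixelCount; ++p) {
        float sum = 0.0f;
        for (std::size_t s = 0; s < samplesPerPixel; ++s)
            sum += samples[p * samplesPerPixel + s];
        resolved[p] = sum / static_cast<float>(samplesPerPixel);
    }
    return resolved;
}
```

For supersampling specifically, you would render at the higher sample count (or higher resolution) and let this averaging produce the final anti-aliased image.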

#5260383 d3d12: which debug tools to use?

Posted by Jason Z on 03 November 2015 - 03:44 PM

Have you checked out the DXCAP tool?  That seems to be an extremely efficient way to capture log files, which you can then debug later.  General information can be found here:

#5246677 UpdateSubresource on StructuredBuffer

Posted by Jason Z on 15 August 2015 - 06:48 AM

To follow on MJP's great advice, have you tried using the performance tools in the latest versions of Visual Studio?  They can show you a pretty good representation of the parallelism between the CPU and GPU, and will likely give you some insight into what is costing you the time that stacks up in your overall frame time.

#5246674 handling DepthStencil / Blend / Rasterizer State dilemma...

Posted by Jason Z on 15 August 2015 - 06:40 AM

That is more or less correct.  I have a structure that holds the references to the states (RenderEffect), and a material can reference a number of different RenderEffects for different situations.  The higher level rendering pass is actually controlled in a separate object called a SceneRenderTask.  This object is the one that sets up the pipeline outputs (i.e. render and depth targets) and provides whatever special logic is needed for that particular rendering pass.  In your example of a mirror, the stencil rendering would be done in one pass and the reflected scene would be a second SceneRenderTask. 


If you are interested in seeing how it works more closely, the whole engine is available as open source: Hieroglyph 3
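To make the arrangement described above concrete, here is a hypothetical sketch of how those pieces relate - the names follow the post, but the members are illustrative stand-ins, not Hieroglyph 3's actual declarations:

```cpp
#include <memory>
#include <string>
#include <vector>

// A RenderEffect bundles references to the pipeline states used for one
// way of drawing.  The string members stand in for real ID3D11*State
// references purely for illustration.
struct RenderEffect {
    std::string depthStencilState;
    std::string blendState;
    std::string rasterizerState;
};

// A material can reference several RenderEffects, one per rendering
// situation (e.g. a stencil-marking pass vs. a reflected-scene pass).
struct Material {
    std::vector<std::shared_ptr<RenderEffect>> effects;
};

// The SceneRenderTask owns the per-pass setup: it binds the pipeline
// outputs and supplies any special logic for that pass.  In the mirror
// example, the stencil pass and the reflected scene would each be their
// own SceneRenderTask.
struct SceneRenderTask {
    std::string renderTarget;
    std::string depthTarget;
};
```

The important structural point is that states live in shareable effect objects, while the pass object decides where the output goes.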

#5245114 handling DepthStencil / Blend / Rasterizer State dilemma...

Posted by Jason Z on 08 August 2015 - 10:09 AM

I use an object that represents the rendering state for the drawing to be done.  This is contained within a material object, and each individual object in the scene can reference a material (via a smart pointer).  This lets you have a shared material when it makes sense, or you can just as easily duplicate a material for a special object that wants to mutate its material state.


In the methods you show above, you are limiting yourself to a fixed number of states - if you make them their own objects then you can have an unlimited number of states to use.
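The share-or-duplicate behavior falls out naturally from the smart pointer.  A minimal sketch, with illustrative types (the copy-before-mutate helper is my own naming, not from any particular engine):

```cpp
#include <memory>

// Minimal material: just some state identifier for illustration.
struct Material {
    int stateId = 0;
};

// Scene objects hold a shared_ptr, so many objects can share one
// Material instance.  An object that wants to mutate its material first
// clones a private copy (copy-on-write by hand).
struct SceneObject {
    std::shared_ptr<Material> material;

    void makeMaterialUnique()
    {
        if (material && material.use_count() > 1)
            material = std::make_shared<Material>(*material);
    }
};
```

Because the states are their own heap-allocated objects rather than slots in a fixed array, there is no upper limit on how many distinct state combinations the scene can use.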

#5243093 Is there a reason why you cannot see the values of 1- and 2-letter variables...

Posted by Jason Z on 27 July 2015 - 07:12 PM

Which version of Visual Studio were you using, and on which operating system?  Also, was there a particular type of variable that had the issue (i.e. local, a certain type, input/output registers, etc...)?  If you post a shader, we could try it out to see if it occurs the same on our end.

#5232419 Values for clearing geometry pass renderTargets

Posted by Jason Z on 02 June 2015 - 12:00 PM

If you think about it, this is similar to a problem you already face with a traditional depth buffer.  You clear it to a particular value, which usually represents depths at the far clipping plane.  If you don't render any geometry over a portion of the scene, then that portion will keep your default value even though no geometry is actually present in that location.  Your G-Buffer is more complex, but retains the same idea: you are trying to give areas that are not updated a default appearance.


My suggestion would be to either ensure that the entire render target is rendered to every frame (i.e. a skybox to fill in the gaps) or just to default to an appropriate value to give the basic appearance that you are looking for.  In your test scene shown above, you could for example just create a huge inside-out cube that your demo runs inside of.  This would let you control the appearance of the entire render target, and you could use a default clear value for your render targets that indicates an error - something like clearing the normal buffer to all zeros or to very large values.  This will give you a visual clue that some area of the scene didn't get rendered to, and let you quickly figure out why!
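The trick with the normal buffer is that an all-zero vector can never come from real geometry (a valid normal has unit length), so any pixel still holding the clear value after the geometry pass is a pixel nothing wrote to.  A CPU-side sketch of that idea, with illustrative names:

```cpp
#include <array>
#include <cstddef>
#include <vector>

using Normal = std::array<float, 3>;

// Clear the normal target to an "impossible" default: the zero vector.
std::vector<Normal> clearNormalBuffer(std::size_t pixelCount)
{
    return std::vector<Normal>(pixelCount, Normal{0.0f, 0.0f, 0.0f});
}

// Any pixel that still holds the zero vector was never rendered to,
// since a real surface normal is unit length and thus nonzero.
bool wasRenderedTo(const Normal& n)
{
    return n[0] != 0.0f || n[1] != 0.0f || n[2] != 0.0f;
}
```

A lighting pass can then detect the sentinel and output a loud debug color, making the uncovered regions obvious at a glance.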

#5225213 Very strange objects transformation issue

Posted by Jason Z on 24 April 2015 - 07:24 AM

Tried to debug my engine with VS graphics debugger and see that debugger's frame time is the same as in my engine.


Just so I am sure that I understand: do you still see the stuttering in this frame time?


Tried with nvidia GT 530 with vsync on. Stuttering has gone, but only with vsync = on. So, I think this is likely related to some hardware problems.


This would not indicate to me that there is a hardware problem.  Vsync essentially just forces your application to present at a fixed time interval, which is *precisely* what Buckeye suggested above.  So there is likely something in your application that is causing a variable frame time, and you have to track it down.  Start by isolating part of your application.  Make a list of the things that are done every frame, and start commenting them out while taking measurements from the VS graphics debugger to see when it becomes smooth.


Once you have identified what the source of the stuttering is, then you can move on to determine if this is something that can be fixed, or if it is inherently something that must be present.  In the worst case, you can simply enable Vsync and forget about the issue, but this is a great opportunity to learn more about how your application is working - take advantage of it!
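For the measurement step, all you need is a per-frame timer whose recorded history makes spikes stand out.  A minimal sketch using std::chrono (the class name is my own, not from any tool):

```cpp
#include <chrono>
#include <vector>

// Records the elapsed wall-clock time of each frame in milliseconds.
// After running for a while, stutter shows up as outliers in the
// recorded list, which you can log or plot.
class FrameTimer {
public:
    void beginFrame() { start_ = std::chrono::steady_clock::now(); }

    void endFrame()
    {
        auto end = std::chrono::steady_clock::now();
        frameMs_.push_back(
            std::chrono::duration<double, std::milli>(end - start_).count());
    }

    const std::vector<double>& frameTimesMs() const { return frameMs_; }

private:
    std::chrono::steady_clock::time_point start_{};
    std::vector<double> frameMs_;
};
```

Wrap each candidate subsystem in beginFrame()/endFrame() as you comment things out, and compare the distributions - that tells you factually which part of the frame is varying, rather than guessing.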

#5224940 Very strange objects transformation issue

Posted by Jason Z on 22 April 2015 - 06:14 PM

What, for example?

Could you please explain why this is happening actually?

I have no idea, because I don't know anything about what you are doing.  Did you check the frame time with a separate tool as I mentioned above?  If so, what were the results?  The nature of the changes in frame time will probably provide clues for you to figure out what the issue is.  Trying to guess about the cause will not help - you need to take a factual set of measurements and logically come to a conclusion about what is affecting your program.

#5224778 Very strange objects transformation issue

Posted by Jason Z on 21 April 2015 - 04:55 PM

Have you tried using a separate tool to identify if your application is the cause of the jerkiness?  You can use the performance analysis tools in VS2013 or VS2015 to see if your frame time is actually changing or not.  If it isn't, then the issue is probably in your time calculation like Buckeye has been describing.  If the frame time is changing and that is causing the stuttering, then you have something else to track down.

#5221342 Visual Novel Mechanics in Unreal Engine

Posted by Jason Z on 04 April 2015 - 10:25 AM

The description that you gave sounds perfectly fine, so why not just implement it and see how it goes?  Especially when using a pre-built engine, it should be pretty quick to get the thing up and running and see what works well and what doesn't.  Then you will have a little experience building that type of application, and you will figure out what needs to be improved and what doesn't.

#5220349 Direct3D 12 documentation is now public

Posted by Jason Z on 30 March 2015 - 06:19 PM

If my guess is right, a PC game with broad HW support would either use DX12 plus DX11 for old windows, or would use Vulkan plus GL3 for old hardware... Which kinda sucks.

I actually think that there will be lots of mixes between D3D11 and D3D12.  Even with all the extra power in D3D12, if you don't need cutting edge performance then D3D11 is going to be easier to use (plus there is lots of existing code bases out there already...).  Over time I'm sure this will shift more and more towards D3D12, but I think it will take longer than most people are thinking right now...