Meltac

  1. Many thanks, MaulingMonkey, for checking and commenting on my screenshot. Your explanation helped me a lot in understanding what happens, as did what the others said here before! Actually I was thinking about adding dust particles and/or jittering anyway. Also, I've quickly tried one dithering algorithm and it does help; however, fine-tuning it to prevent too many rendering artifacts seems to be tricky. So for now this is OK for me. Once again I've learned a lot. Thanks again to everybody!
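The dithering approach mentioned above can be sketched in HLSL. This is a minimal example, not the poster's actual code: it uses interleaved gradient noise, a common cheap screen-space dither pattern, to break up banding before the output is quantized to 8 bits per channel. The function and parameter names are illustrative assumptions.

```hlsl
// Interleaved gradient noise (Jimenez): a cheap screen-space noise
// pattern often used for dithering. pixelPos is the pixel coordinate,
// e.g. taken from SV_Position.xy.
float InterleavedGradientNoise(float2 pixelPos)
{
    return frac(52.9829189 * frac(dot(pixelPos, float2(0.06711056, 0.00583715))));
}

float3 DitherColor(float3 color, float2 pixelPos)
{
    // Shift each pixel by up to +/- half an 8-bit step, so that the
    // quantized output dithers instead of banding.
    float noise = InterleavedGradientNoise(pixelPos) - 0.5;
    return color + noise / 255.0;
}
```

Tuning the noise amplitude (here one 8-bit step) is what trades residual banding against visible grain, which matches the "fine-tuning is tricky" observation above.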
  2. Thanks, gentlemen. Honestly I am not so sure about that. The rest of the rendered graphics (which does not come from my pen) does not show any banding or visible dithering. Same with photos viewed on my monitor, e.g. gradual color / brightness levels in the sky: I can see some banding going on when I zoom in very closely, but none in full-screen mode. The banding that I encounter is much stronger and obviously visible, as you can see in the attached screen. There must be another source for this. Convert to what format? Unfortunately it's not my own engine; it's a proprietary game engine (X-Ray), and I only have access to some of its HLSL shaders, so I can't say how the pipeline works exactly. As far as I know, the gamma is done inside the engine (i.e. in the host application), as there is nothing gamma-related in the shaders. But I do not know the specifics. I've attached a BMP, will this also help (instead of a PNG)? XR_3DA_2017-07-10_20-40-44-01.bmp
  3. Hi all, In my post-processing pixel shader (HLSL, SM5) I am facing heavy banding issues, as can be seen in the attached image. It doesn't matter what type of effect I am implementing (fake volumetric light in this case); it always happens when I am applying some color blending, e.g. fading out. Could this be a precision issue (I'm working with float numbers), or where do I have to start looking? Any hints will be appreciated!
  4. Alright, I've compared *all* DX / D3D files in SysWOW64 from both my Vista and my W10 installation (using a diff tool, binary content comparison) and then copied all files that were different from Vista to Win10 (DirectX 9 to 11 files; 12 was not present on Vista). Before that I had double-checked that these files are actually being used by the engine, by renaming them and getting start-up errors accordingly. So now all DirectX files being used should be binary-identical on both installations, Vista and W10, no matter whether they come from the OS or from the game installation folder. But the behavior is still absolutely the same!? The compiler still does not behave as on Vista, omitting the same code parts as before. Any further thoughts?
  5. Thanks for your answer. I am using a copy of the compiler DLL inside my bin directory, so compiler-wise the file should be the same (sorry, I should have mentioned that in the first place). So I suspect it must be another D3D / DX file that makes the difference, perhaps some DLL that the compiler uses. Any ideas how I could figure that out?
  6. Hi all, I'm facing some weird problems with a game's HLSL shaders, on Windows 10 exclusively. The shaders are compiled against DX 11 / shader model 5. In case it matters, they still use the old-fashioned DX9 methods (tex2D etc.) for sampling, because they had initially been written for DX9 / shader model 3 and were later migrated with the least possible effort. There are different variations of pixel shaders that the game engine applies depending on the in-game weather (i.e. one pixel shader for dry weather, a different one for rain, etc.). As those variations share large parts of the same code, those shared parts are referenced via #include directives, and the differences are implemented with #define and #undef directives (e.g. the rain shader defines a part for rendering the ground as wet, the dry shader does not). The game engine compiles all source shader files upon start-up, one by one, in a predefined (hard-coded) order, using the usual D3D calls.

     NOW: On my Vista 64-bit environment, the D3D compiler seems to figure out which included shader source files are used anywhere (i.e. in at least one of the pixel shader variants), and compiles all used includes. On my Windows 10 64-bit environment, things behave differently: if the compiler encounters an include that is not used in the first compiled shader file, it DOES NOT COMPILE that include file, regardless of it being used in a subsequently compiled file! Pseudo-code sample:

```hlsl
// Pixel shader for dry weather (= first file being compiled)
#include "SomeSettingsFile.h"
#include "GenericShaderForAllWeatherTypes.h"

// Pixel shader for rainy weather (= second file being compiled)
#include "SomeSettingsFile.h"
#define IT_RAINS
#include "GenericShaderForAllWeatherTypes.h"

// GenericShaderForAllWeatherTypes.h
[...]
#ifdef IT_RAINS
#include "RainShader.h"
render_things_as_wet();
#endif
[...]
```

     On Vista this works as expected: the source file "RainShader.h" is always compiled and thus available at runtime. On Windows 10 it does not: "RainShader.h" and the code segment starting with #ifdef IT_RAINS are omitted / "optimized away" by the compiler, because the first shader being compiled does not use it! Any ideas?
  7. Thanks so far, guys. I've read that some people used derivatives (e.g. the third derivative) to detect blur / sharpness. Does anyone know how to implement a derivative in HLSL?
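HLSL does have the ddx/ddy intrinsics, but those differentiate a shader expression across the 2x2 pixel quad; for derivatives of an *image*, the usual route is finite differences over neighbouring texels. A minimal sketch, where the sampler, texel size, and luminance weights are illustrative assumptions rather than names from any real engine:

```hlsl
// Central-difference image derivative, sketched for a full-screen pass.
// g_Image and g_TexelSize (= 1/resolution) are assumed to be bound by
// the host application.
sampler2D g_Image;
float2 g_TexelSize;

// Rec. 601 luma as a scalar signal to differentiate.
float Luma(float2 uv)
{
    return dot(tex2D(g_Image, uv).rgb, float3(0.299, 0.587, 0.114));
}

// First derivative along x. Applying the same stencil to the result
// (or widening it) yields higher-order derivatives (second, third, ...).
float DerivX(float2 uv)
{
    return (Luma(uv + float2(g_TexelSize.x, 0)) -
            Luma(uv - float2(g_TexelSize.x, 0))) * 0.5;
}
```

The same pattern with g_TexelSize.y gives the y derivative; combining both magnitudes gives a gradient strength usable as a sharpness signal.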
  8. Hi all, I'm looking for a way to distinguish sharp areas in an image from those that exceed a certain amount of blur / "unsharpness". In HLSL. More specifically, I want to detect / mask / mark the entire out-of-focus part of a photograph, or, vice versa, its in-focus part. So far I've tested several approaches dealing either with edge detection or with the overall sharpness of images, but they all fail to properly and robustly mask areas with a certain blur or sharpness. I wouldn't have imagined that this would be such a hard task, since the human eye can distinguish in-focus (sharp) from out-of-focus (blurred) areas of a photo quite easily. Any ideas or hints?
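One common starting point for such an in-focus mask is to threshold a local measure of high-frequency energy, e.g. the absolute value of the Laplacian, since out-of-focus regions lack fine detail. This is a crude sketch under assumed names and an arbitrary threshold, not a robust solution:

```hlsl
// Crude in-focus mask via the 4-neighbour Laplacian of luma.
// g_Image, g_TexelSize, and the threshold are illustrative assumptions.
sampler2D g_Image;
float2 g_TexelSize;

float Luma(float2 uv)
{
    return dot(tex2D(g_Image, uv).rgb, float3(0.299, 0.587, 0.114));
}

float FocusMask(float2 uv)
{
    float lap = Luma(uv + float2(g_TexelSize.x, 0))
              + Luma(uv - float2(g_TexelSize.x, 0))
              + Luma(uv + float2(0, g_TexelSize.y))
              + Luma(uv - float2(0, g_TexelSize.y))
              - 4.0 * Luma(uv);
    // 1 = likely in focus, 0 = likely blurred; 0.02 is only a starting
    // value to tune per image.
    return abs(lap) > 0.02 ? 1.0 : 0.0;
}
```

A per-pixel threshold like this is noisy (smooth but in-focus surfaces also score low); averaging the Laplacian magnitude over a window, or blurring the resulting mask, usually gives a more stable region mask, which matches the robustness problem described above.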
  9. OK, thanks - I think I'll just try it myself then.
  10. Related question: are there any known compile-time performance gains (or disadvantages) when using the newer versions of the compiler? In other words, in case several of those versions do work as expected, which would be the fastest (in terms of compilation times)?
  11. D3DCOMPILER_47.dll ?

      Thanks, I've just tested it. No difference for non-compute shaders (at least in my case), unfortunately.
  12. D3DCOMPILER_47.dll ?

    On a related topic, does D3DCOMPILER_47.dll bring any performance or memory benefits over the _43 version?
  13. Hey guys, just wanted to let those who tried to help me know that the issue has been solved - my code was right, it was the engine (host application) delivering wrong matrices under certain conditions.   So everything's fine now, thanks again for your help :)
  14. Hash! I need a hash, simple as that! I should have come to that conclusion myself in the first place. I think I have explained myself well enough to make clear why I reacted the way I did. It wasn't just some "other suggestion" that made me mad, but the fact that some people insisted on such an "other suggestion" even after I had stated clearly that it is not an option in my case. Nonetheless, your hint about goniometric functions, even though not exactly what I was looking for, has led me to the solution: a simple and stupid hash. Thanks again.
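The kind of "simple and stupid" shader hash mentioned here can be sketched with the widely used sine-based one-liner. This is a generic example, not the poster's actual solution; the magic constants are the conventional ones seen in shader folklore:

```hlsl
// Classic sine-based 2D -> 1D hash, common in shader code.
// Returns a pseudo-random value in [0, 1). Not a cryptographic hash,
// and its quality degrades for large inputs due to float precision.
float Hash(float2 p)
{
    return frac(sin(dot(p, float2(12.9898, 78.233))) * 43758.5453);
}
```

Fittingly for the thread, this uses a goniometric function (sin) internally, even though the result is used as a hash rather than as trigonometry.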