  • Similar Content

    • By lubbe75
      As far as I understand, there is no real random or noise function in HLSL. 
      I have a big water polygon, and I'd like to fake water wave normals in my pixel shader. I know it's not efficient, and the standard way is to use a pre-calculated noise texture, but anyway...
      Does anyone have any quick-and-dirty HLSL shader code that fakes water normals and doesn't look too repetitive? 
    • By turanszkij
      Hi,
      I finally managed to get the DX11-emulating Vulkan device working, but everything is flipped vertically now because Vulkan has a different clip space. What are the best practices out there to keep these implementations consistent? I tried using a vertically flipped viewport, and while it works on an Nvidia 1050, the Vulkan debug layer is throwing error messages that this is not supported by the spec, so it might not work on other hardware. There is also the possibility of flipping the clip-space Y coordinate of the position before writing it out from the vertex shader, but that requires changing and recompiling every shader. I could also bake it into the camera projection matrices, though I want to avoid that, because then I would need to track down every place in the engine where I upload matrices... Any chance of an easy extension or something? If not, I will probably go with changing the vertex shaders. (A rough sketch of the negative-viewport route appears after this list.)
    • By NikiTo
      Some people say "discard" has no positive effect on optimization. Other people say it will at least spare the texture fetches.
       
      if (color.a < 0.1f)
      {
          // discard;
          clip(-1);
      }
      // tons of texture reads follow here
      // and loops too
      Some people say that "discard" will only mask out the output of the pixel shader, while still evaluating all the statements after the "discard" instruction.

      MSDN>
      discard: Do not output the result of the current pixel.
      clip: Discards the current pixel.
      <MSDN

      As usual it is unclear, but it suggests that "clip" could discard the whole pixel (and maybe stop execution too).

      I think that, at least for thermal and power reasons, the GPU should not evaluate the statements after "discard", but some people on the internet say the GPU computes them anyway. What I am more worried about are the texture fetches after discard/clip.

      (What if, after the discard, I have an expensive branch that makes the neighboring pixels which took the approved cheap branch stall for nothing? That seems crazy.)
    • By NikiTo
      I have a problem. My shaders are huge, in the sense that they contain a lot of code. Many of my pixels should be completely discarded. I could use a comparison and discard at the very beginning of the shader, but as far as I understand, the discard statement does not save any workload, because the pixel still has to stall until its long-running neighbor shaders complete.
      Initially I wanted to use the stencil buffer to discard pixels before the execution flow even enters the shader, and before the GPU allocates resources for it, avoiding the stall of the pixel shader's execution flow. I had assumed that the depth/stencil test discards pixels before the pixel shader, but I see now that it happens in the very last Output Merger stage. It seems extremely inefficient to render, say, a small mirror in a scene with a big viewport that way. Why did they put the stencil test in the Output Merger anyway? Handling of stencil is so limited compared to other resources. Do people use stencil functionality at all in games, or do they prefer discard/clip?

      Will the GPU stall the pixel if I issue a discard at the very beginning of the pixel shader, or will it immediately start using the freed-up resources to render another pixel?



       
    • By Axiverse
      I'm wondering when upload buffers are copied to the GPU. Basically, I want to pool buffers and want to know when I can reuse them and write new data into them.
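
The contents of an upload-heap buffer are only consumed when the GPU actually executes the command lists that read from it, not when ExecuteCommandLists returns on the CPU. The usual pattern is therefore to signal a fence after the submission and only recycle a buffer once that fence value has been reached. Below is a minimal, hedged sketch of that pattern; "queue", "cmdList", "uploadBuffer", "fence", "fenceValue" and "fenceEvent" are placeholders for objects the application already owns.

ID3D12CommandList* lists[] = { cmdList };
queue->ExecuteCommandLists(1, lists);       // the GPU reads uploadBuffer while executing this

// Signal a monotonically increasing fence value after the work that uses the buffer.
++fenceValue;
queue->Signal(fence.Get(), fenceValue);     // fence created once via ID3D12Device::CreateFence

// Later, before writing new data into uploadBuffer (or handing it back to a pool):
if (fence->GetCompletedValue() < fenceValue)
{
    fence->SetEventOnCompletion(fenceValue, fenceEvent);   // fenceEvent from CreateEvent()
    WaitForSingleObject(fenceEvent, INFINITE);
}
// At this point the GPU is done with uploadBuffer and it is safe to overwrite it.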
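
For the Vulkan clip-space question further up the list: with VK_KHR_maintenance1 (core in Vulkan 1.1) a negative viewport height is legal and flips Y at the viewport stage, without touching any shaders or projection matrices. This is only a sketch under the assumption that the extension (or Vulkan 1.1) is enabled; the validation error seen on the 1050 is what you get when it is not. "cmd", "fbWidth" and "fbHeight" are placeholder names.

VkViewport viewport = {};
viewport.x        = 0.0f;
viewport.y        = (float)fbHeight;      // start at the bottom edge...
viewport.width    = (float)fbWidth;
viewport.height   = -(float)fbHeight;     // ...and flip with a negative height
viewport.minDepth = 0.0f;
viewport.maxDepth = 1.0f;
vkCmdSetViewport(cmd, 0, 1, &viewport);   // needs VK_KHR_maintenance1 or Vulkan 1.1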

DX12 High precision texture.


Recommended Posts

Hello!

 

I've been struggling for a while with how to load high-precision textures in DX12. Right now I've been using "DirectX::CreateDDSTextureFromFile12" to load my textures. There is also the "TextureFromMemory" function, but it only accepts 8-bit data, and I would like to upload a 16-bit texture (to be used as a heightmap).

What could I do? Should I try to upload a buffer instead?
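
One possible route, sketched here with heavy assumptions, is to skip the DDS loader entirely: create the heightmap as a DXGI_FORMAT_R16_UNORM texture and fill it through an intermediate upload heap with the d3dx12.h helpers. "device", "cmdList", "heightPixels", "width" and "height" are placeholders for objects the application already has.

#include <wrl/client.h>
#include <d3d12.h>
#include "d3dx12.h"   // CD3DX12_* helpers, GetRequiredIntermediateSize, UpdateSubresources
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12Resource> heightTex;
ComPtr<ID3D12Resource> uploadHeap;

// 16-bit single-channel texture for the heightmap.
CD3DX12_RESOURCE_DESC texDesc = CD3DX12_RESOURCE_DESC::Tex2D(
    DXGI_FORMAT_R16_UNORM, width, height, 1, 1);
CD3DX12_HEAP_PROPERTIES defaultProps(D3D12_HEAP_TYPE_DEFAULT);
device->CreateCommittedResource(&defaultProps, D3D12_HEAP_FLAG_NONE, &texDesc,
    D3D12_RESOURCE_STATE_COPY_DEST, nullptr, IID_PPV_ARGS(&heightTex));

// Intermediate upload buffer sized for the texture's copyable layout.
UINT64 uploadSize = GetRequiredIntermediateSize(heightTex.Get(), 0, 1);
CD3DX12_HEAP_PROPERTIES uploadProps(D3D12_HEAP_TYPE_UPLOAD);
CD3DX12_RESOURCE_DESC bufDesc = CD3DX12_RESOURCE_DESC::Buffer(uploadSize);
device->CreateCommittedResource(&uploadProps, D3D12_HEAP_FLAG_NONE, &bufDesc,
    D3D12_RESOURCE_STATE_GENERIC_READ, nullptr, IID_PPV_ARGS(&uploadHeap));

// Copy the raw 16-bit height values through the upload heap into the texture.
D3D12_SUBRESOURCE_DATA src = {};
src.pData      = heightPixels;                 // one uint16_t per texel
src.RowPitch   = width * sizeof(uint16_t);
src.SlicePitch = src.RowPitch * height;
UpdateSubresources(cmdList, heightTex.Get(), uploadHeap.Get(), 0, 0, 1, &src);

// Transition for sampling; the upload heap must stay alive until the GPU has
// finished executing this command list.
CD3DX12_RESOURCE_BARRIER toSrv = CD3DX12_RESOURCE_BARRIER::Transition(
    heightTex.Get(), D3D12_RESOURCE_STATE_COPY_DEST,
    D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE);
cmdList->ResourceBarrier(1, &toSrv);

The SRV for the heightmap then uses the same DXGI_FORMAT_R16_UNORM format, and sampling it returns the height normalized to [0, 1].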


Which library are you using that has the DDS loading functions? 

 

Most DDS loaders will use the format of the data of the DDS file when creating the texture. So if the DDS file contains 16-bit data, the resulting texture will use a 16-bit format. Does your heightmap DDS file have 16-bit data in it?
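
A quick way to answer that is to ask DirectXTex what is actually in the file before creating anything from it. A hedged sketch, with "heightmap.dds" as a placeholder path:

#include <DirectXTex.h>
#include <cstdio>

DirectX::TexMetadata meta = {};
HRESULT hr = DirectX::GetMetadataFromDDSFile(L"heightmap.dds",
                                             DirectX::DDS_FLAGS_NONE, meta);
if (SUCCEEDED(hr))
    printf("format = %d, %zu x %zu, mips = %zu\n",
           (int)meta.format, meta.width, meta.height, meta.mipLevels);
else
    printf("DDS header could not be read (hr = 0x%08X)\n", (unsigned)hr);

A 16-bit heightmap should report DXGI_FORMAT_R16_UNORM; files written with legacy L16-style headers can be a reason some loaders reject or mis-handle them.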


Which library are you using that has the DDS loading functions? 

 

Most DDS loaders will use the format of the data of the DDS file when creating the texture. So if the DDS file contains 16-bit data, the resulting texture will use a 16-bit format. Does your heightmap DDS file have 16-bit data in it?

 

Hello!

I'm using the DDSTextureLoader (Microsoft). I export the heightmap from World Machine as a PNG, then I import it into Photoshop and export it as a DDS texture (using the NVIDIA plugin). If I set the DDS to 8 bits it works, but with 16 bits it doesn't; when I load the texture in the program, the function just crashes.

Edited by piluve


You're talking about the one from DirectXTex that you can find here? It doesn't have a function called "CreateDDSTextureFromFile12". Perhaps you're using an older version? If so, it would be worth trying the latest to see if it still crashes. If it does still crash, then I would share some details of the crash (where in the code is it crashing? What kind of exception is happening? etc.) so that we can help you debug it.


Just an aside: if you are compressing your heightmaps to save on texture memory (bandwidth included), then you may run into issues where compression artifacts cause visual anomalies when you render the terrain using the texture samples. If you are not compressing the texture and just using the DDS format, then why not just use the PNG, since neither format is going to save you anything.


If you are not compressing the texture and just using the DDS format, then why not just use the PNG, since neither format is going to save you anything.

 

Well, DDS is going to save you loading time.


You're talking about the one from DirectXTex that you can find here? It doesn't have a function called "CreateDDSTextureFromFile12". Perhaps you're using an older version? If so, it would be worth trying the latest to see if it still crashes. If it does still crash, then I would share some details of the crash (where in the code is it crashing? What kind of exception is happening? etc.) so that we can help you debug it.

 

I guess I was using an older version of it. I found a way to make it work: I export the texture as a one-channel floating-point texture, and it works (while giving a lot of precision).

 

See you!
