DX11: Reading/writing 16-bit unsigned integers from/to a buffer

Hi guys,

I would like to double-check with you whether my understanding is correct.

 

I have a buffer of unsigned integers with the format DXGI_FORMAT_R16_UINT, which I would like to read in a shader.

Since HLSL has no dedicated 16-bit unsigned integer type, I assume I should use the 32-bit uint type.

Buffer<uint> g_InputBuffer : register(t0); // DXGI_FORMAT_R16_UINT

In this case, will each 16-bit element of the input buffer be automatically converted to a corresponding 32-bit uint when it is read?
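
For context, here is roughly how I plan to read it; just a minimal compute-shader sketch based on the assumption above (the entry point name, thread count, and the 32-bit g_Debug output buffer are placeholders of mine, not from any sample):

Buffer<uint> g_InputBuffer : register(t0);        // SRV created with DXGI_FORMAT_R16_UINT
RWStructuredBuffer<uint> g_Debug : register(u1);  // plain 32-bit buffer, only for illustration

[numthreads(64, 1, 1)]
void ReadCS(uint3 dtid : SV_DispatchThreadID)
{
    // My expectation: each 16-bit element arrives zero-extended to a 32-bit uint,
    // so the value here should always be in the range [0, 65535].
    uint value = g_InputBuffer[dtid.x];
    g_Debug[dtid.x] = value;
}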

 

And vice versa: if I want to write to a buffer of format DXGI_FORMAT_R16_UINT, will the HLSL uint be automatically converted to a 16-bit unsigned integer on write?

RWBuffer<uint> g_OutputBuffer : register(u0); // DXGI_FORMAT_R16_UINT

uint value = ...
g_OutputBuffer[outIndex] = value;
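
Putting both directions together, this is the kind of kernel I have in mind; a sketch under the assumptions above (CopyCS and the explicit & 0xFFFF mask are mine, just to make the intent obvious):

Buffer<uint>   g_InputBuffer  : register(t0); // SRV, DXGI_FORMAT_R16_UINT
RWBuffer<uint> g_OutputBuffer : register(u0); // UAV, DXGI_FORMAT_R16_UINT

[numthreads(64, 1, 1)]
void CopyCS(uint3 dtid : SV_DispatchThreadID)
{
    // Read: expected to be zero-extended to 32 bits.
    uint value = g_InputBuffer[dtid.x];

    // Write: I assume the format conversion rules bring the value back to 16 bits;
    // masking explicitly so out-of-range values cannot surprise me.
    g_OutputBuffer[dtid.x] = value & 0xFFFF;
}

On the API side I would create both the SRV and the UAV with Format = DXGI_FORMAT_R16_UINT, since (as far as I understand) the conversion is driven by the view format rather than by the HLSL declaration.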

 

Thanks!

