
auto.magician

Member
  • Content count

    29
  • Joined

  • Last visited

Community Reputation

128 Neutral

About auto.magician

  • Rank
    Member
  1. auto.magician

    DX11 IDXGIAdapter.CheckInterfaceSupport bug

    Hi, yes, that was the first thing I thought of, but there's only one card and only one adapter returned. There's no integrated motherboard GPU or CPU/GPU combination that might create an extra adapter entry.
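    For context, the check I mean is the usual DXGI adapter enumeration. A minimal sketch, assuming plain DXGI 1.0, that lists every adapter the system reports:

        #include <dxgi.h>
        #include <cstdio>
        #pragma comment(lib, "dxgi.lib")

        int main()
        {
            // Create the factory and walk the adapter list; in my case the
            // loop runs exactly once (the single Radeon card).
            IDXGIFactory* factory = nullptr;
            if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
                return 1;

            IDXGIAdapter* adapter = nullptr;
            for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
            {
                DXGI_ADAPTER_DESC desc = {};
                adapter->GetDesc(&desc);
                wprintf(L"Adapter %u: %s\n", i, desc.Description);
                adapter->Release();
            }
            factory->Release();
            return 0;
        }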
  2. auto.magician

    Shader semantics

    Hiya, I'm not sure I'd be much help here as I'm throwing this out from memory, but... From what I remember, the two layouts, your C++ input layout and your shader input signature, should match exactly. However, your C++ side is set up with:

        layout[0].Format = DXGI_FORMAT_R32G32B32_FLOAT;

    which is a three-component input, while the shader is expecting:

        float4 position : POSITION;

    a four-component input. I can't remember whether that mismatch makes a difference or not.

    The semantic index is separate from the semantic name in the C++ input layout, so you don't use "COLOR1" as a semantic name on the C++ side. For COLOR1:

        layout[4].SemanticName = "COLOR";
        layout[4].SemanticIndex = 1;
        layout[4].Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
        layout[4].InputSlot = 0;
        layout[4].AlignedByteOffset = D3D11_APPEND_ALIGNED_ELEMENT;
        layout[4].InputSlotClass = D3D11_INPUT_PER_VERTEX_DATA;
        layout[4].InstanceDataStepRate = 0;

    Then you should be able to use COLOR1 in the shader.

    Using PIX will show you the exact layout and the data being passed through each stage of the render pipeline, including shader inputs and variables, so I'd start there to see what's going on GPU side.

    SV_TARGET is a semantic for the currently set render target. You can append an index value to direct pixel output to the corresponding render target; of course, the render targets need to be valid and created on the C++ side first. It can be used for writing to multiple render targets in one pass, typically with, but not limited to, deferred renderers.
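    To make the name/index split concrete, here's a minimal sketch of a full layout matched to its HLSL input struct. The POSITION and COLOR1 elements come from the snippets above; the NORMAL, TEXCOORD, and COLOR0 elements are my assumptions for illustration:

        // Matching HLSL vertex input, for reference:
        //   struct VS_INPUT
        //   {
        //       float4 position : POSITION;  // three floats in the buffer; the IA pads the missing w
        //       float3 normal   : NORMAL;    // assumed element
        //       float2 uv       : TEXCOORD0; // assumed element
        //       float4 color0   : COLOR0;    // assumed element
        //       float4 color1   : COLOR1;    // SemanticName "COLOR", SemanticIndex 1
        //   };
        D3D11_INPUT_ELEMENT_DESC layout[] =
        {
            { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT,    0, 0,                            D3D11_INPUT_PER_VERTEX_DATA, 0 },
            { "NORMAL",   0, DXGI_FORMAT_R32G32B32_FLOAT,    0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
            { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,       0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
            { "COLOR",    0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
            { "COLOR",    1, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        };
        // device->CreateInputLayout(layout, ARRAYSIZE(layout),
        //                           vsBytecode, vsBytecodeSize, &inputLayout);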
  3. auto.magician

    DX11 IDXGIAdapter.CheckInterfaceSupport bug

    Hiya, I had a problem with IDXGIAdapter.CheckInterfaceSupport reporting that a Radeon DX11 GPU didn't support the DX11 interface, although it was fine with 10.1. DX11 was available through D3D11CreateDevice, however, so after a quick workaround I changed my code to use that call instead. Once my code was working correctly and reporting as expected, I scouted the internet, only to find references to this issue with DX10 devices back in 2009/2010. Does anyone know if this bug is supposed to have been fixed yet? I would ask over on MSDN, but, not meaning to sound rude, it seems a waste of time asking anything there lately. Many thanks in advance. Dave.
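    For reference, the workaround boils down to asking D3D11CreateDevice directly: passing null for the device and context pointers only queries the best supported feature level without creating anything. A minimal sketch of that check, my reconstruction rather than the original code:

        #include <d3d11.h>
        #pragma comment(lib, "d3d11.lib")

        // Returns true if the default hardware adapter reaches feature level 11_0.
        // With null ppDevice/ppImmediateContext, D3D11CreateDevice only reports
        // the achieved feature level; no device is actually created.
        bool SupportsD3D11()
        {
            D3D_FEATURE_LEVEL achieved = D3D_FEATURE_LEVEL_9_1;
            HRESULT hr = D3D11CreateDevice(
                nullptr,                  // default adapter
                D3D_DRIVER_TYPE_HARDWARE,
                nullptr,                  // no software rasterizer module
                0,                        // no creation flags
                nullptr, 0,               // default feature-level list (11_0 down to 9_1)
                D3D11_SDK_VERSION,
                nullptr,                  // don't create the device
                &achieved,
                nullptr);                 // don't create the context
            return SUCCEEDED(hr) && achieved >= D3D_FEATURE_LEVEL_11_0;
        }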