Member Since 08 Sep 2011
Offline Last Active Today, 03:18 AM

Posts I've Made

In Topic: How to use a big constant buffer in DirectX 11.1?

11 September 2016 - 04:55 AM

The only thing that changes is how you bind it (with *SSetConstantBuffers1 instead of *SSetConstantBuffers). See https://msdn.microsoft.com/en-us/library/windows/desktop/hh404649(v=vs.85).aspx

In Topic: How do I detect the available video memory?

07 September 2016 - 12:18 PM

It fails because the NVIDIA card doesn't have any outputs. It's how those dual GPU setups work, the final output has to go through the integrated card no matter what.
Both NVIDIA and AMD provide ways to force using the dedicated card; the easiest one is adding this to your program:
extern "C" {
	__declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
	__declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

In Topic: GPU-Based Validation and other D3D12 debug layer improvements

06 September 2016 - 12:26 PM

Never heard of this, sounds useful.

Also, I just tried my D3D12 app on the Anniversary Update and it's reporting a lot of incorrect operations that it missed before. Nice job!

In Topic: Clarifications on sharing a 2d texture between D3D9Ex and D3D11

28 August 2016 - 10:46 AM

What is DXGI_FORMAT_A8G8B8R8_UNORM? As far as I can tell, it does not exist. The D3D11 equivalent of D3DFMT_A8R8G8B8 is DXGI_FORMAT_B8G8R8A8_UNORM.

In Topic: [DX12] Updating a descriptor heap using a command list?

11 August 2016 - 04:40 PM

I bet it's like this because of limitations of some hardware (Intel? NVIDIA?).
It's a pity, because it makes things a lot harder than they should be.