DX11 adapter with empty output

Started by
1 comment, last by TRUEPADDii 11 years, 2 months ago

Hello gamedev.net,

I've been struggling with a problem with the EnumAdapters1 function of DXGI.
What I'm trying to do is switch from my onboard chip to my NVIDIA GeForce GT 550M (laptop),
because my onboard chip only supports DX10.1, and the application runs just fine when it falls back from DX11.


printf("Adapter: %d (%s)\n", adapterCount, m_videoCardDescription);
 



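For context, that printout comes from a loop over EnumAdapters1. A rough sketch of such a loop (not my exact code, and printing the wide-character DXGI_ADAPTER_DESC1::Description directly instead of my converted m_videoCardDescription) looks like this:

// Rough sketch: enumerate all adapters with IDXGIFactory1::EnumAdapters1
// and print their descriptions. Needs <dxgi.h> and <wchar.h>, link dxgi.lib.
IDXGIFactory1* factory = 0;
if(SUCCEEDED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory))) {
	IDXGIAdapter1* adapter = 0;
	for(UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
		DXGI_ADAPTER_DESC1 desc;
		adapter->GetDesc1(&desc);
		wprintf(L"Adapter: %u (%ls)\n", i, desc.Description);
		adapter->Release();
	}
	factory->Release();
}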
The adapter list shows my NVIDIA chip, but when I try to use that card later, it can't find any outputs:


if(FAILED(adapter->EnumOutputs(OutputCount, &adapterOutput)))
    break;
 

Here is a snippet:


[...]

	// Let the user pick an adapter if more than one was found.
	int adapterChosen = 0;
	if(adapterCount > 1) {
		printf("Please select the adapter: ");
		scanf("%d", &adapterChosen);
	}
	else if(adapterCount == 1) {
		adapterChosen = 0;
	}
	else {
		log(LOG_ERROR, "No suitable adapter found!");
		return false;
	}

	if(FAILED(factory->EnumAdapters1(adapterChosen, &adapter))) {
		log(LOG_ERROR, "D3DClass::Initialize IDXGIFactory1::EnumAdapters1 failed");
		return false;
	}

	// Count the outputs attached to the chosen adapter.
	UINT OutputCount;
	for(OutputCount = 0; ; OutputCount++) {
		if(FAILED(adapter->EnumOutputs(OutputCount, &adapterOutput)))
			break;

		adapterOutput->Release();
		adapterOutput = 0;
	}

[...]
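For debugging, the output loop can also be written so that it distinguishes DXGI_ERROR_NOT_FOUND (simply no more outputs) from a genuine failure. A rough sketch, not my exact code:

// Count the outputs on 'adapter' and report the actual HRESULT on failure.
// On Optimus laptops the discrete GPU often reports zero outputs, because
// the display is physically wired to the integrated GPU.
UINT outputIndex = 0;
IDXGIOutput* output = 0;
HRESULT hr;
while((hr = adapter->EnumOutputs(outputIndex, &output)) != DXGI_ERROR_NOT_FOUND) {
	if(FAILED(hr)) {
		printf("EnumOutputs failed with HRESULT 0x%08lX\n", (unsigned long)hr);
		break;
	}
	DXGI_OUTPUT_DESC desc;
	output->GetDesc(&desc);
	wprintf(L"Output %u: %ls\n", outputIndex, desc.DeviceName);
	output->Release();
	++outputIndex;
}
printf("Outputs found: %u\n", outputIndex);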

My chip can handle DX11, since other DX11 applications run fine on it.
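For completeness: as far as I know, when a specific adapter is passed to D3D11CreateDevice, the driver type has to be D3D_DRIVER_TYPE_UNKNOWN, otherwise the call fails with E_INVALIDARG. A rough sketch of that call (not my exact code):

// Rough sketch: create a D3D11 device on the explicitly chosen adapter.
// Needs <d3d11.h>, link d3d11.lib. With a non-NULL adapter the driver type
// must be D3D_DRIVER_TYPE_UNKNOWN.
ID3D11Device* device = 0;
ID3D11DeviceContext* context = 0;
D3D_FEATURE_LEVEL obtainedLevel;

HRESULT hr = D3D11CreateDevice(
	adapter,                    // IDXGIAdapter1* chosen above
	D3D_DRIVER_TYPE_UNKNOWN,    // required when an adapter is supplied
	0,                          // no software rasterizer
	0,                          // no creation flags
	0, 0,                       // default feature-level list
	D3D11_SDK_VERSION,
	&device,
	&obtainedLevel,
	&context);

if(SUCCEEDED(hr))
	printf("Device created, feature level 0x%X\n", obtainedLevel);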
- Regards

PS: Sorry for my bad English.


Are you able to see your output when you query for the onboard graphics? If so, then do you need to plug your monitor directly into the graphics card output in order to use it as an output for that adapter? I haven't done much with multi-adapter/multi-output setups, so that is just a guess...

I noticed that when I print the adapters to the console, it looks like this:


Searching for adapters...
Adapter: 0 (Intel(R) HD Graphics Family)
Adapter: 1 (NVIDIA GeForce GT 540M)
Please select the adapter:

When I type in 0, the application runs with DirectX 10.1 on my internal chip.

When I type in 1, the application crashes (D3DClass::Initialize IDXGIAdapter::EnumOutputs failed).

So even though I can enumerate the GeForce adapter, I can't use it.

I have now found a workaround: in the NVIDIA Control Panel, under the 3D settings for this particular application,
I changed the preferred graphics processor from "global (Intel(R) HD Graphics ...)" to the NVIDIA chip.
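(As an aside: NVIDIA also documents a way to request the discrete GPU from code rather than through the control panel, by exporting a specific symbol from the executable. A rough sketch based on NVIDIA's Optimus rendering-policy documentation:)

// Exporting this symbol from the .exe asks the Optimus driver to prefer the
// high-performance NVIDIA GPU for this application; a value of 0x00000001
// selects the discrete GPU.
extern "C" {
	__declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}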

With that control-panel setting, the console output has changed:



Searching for adapters...
Adapter: 0 (NVIDIA GeForce GT 540M)
Adapter: 1 (NVIDIA GeForce GT 540M)
Please select the adapter:

On "adapter 0" it chooses my nvidia chip and everything initialize correctly but even here, typing in "1" forces

the application to crash with the same error as above. This is strange to me because the application can see

this "adapter 1". Is there any explanation of this behavior?

Regards

