
Posted 14 February 2013 - 07:15 PM

Are you able to see your output when you query for the onboard graphics?  If so, then do you need to plug your monitor directly into the graphics card output in order to use it as an output for that adapter?  I haven't done much with multi-adapter/multi-output setups, so that is just a guess...

I noticed that when I print the adapters to the console, the output looks like this:

Searching for adapters...
Adapter: 0 (Intel® HD Graphics Family)
Adapter: 1 (NVIDIA GeForce GT 540M)


When I enter 0, the application runs with DirectX 10.1 on my internal chip.

When I enter 1, the application crashes (D3DClass::Initialize: IDXGIAdapter::EnumOutputs failed).

So even though I can enumerate the GeForce adapter, I can't use it.

I found a workaround: in the NVIDIA Control Panel, under the 3D settings for this particular application, I changed the preferred processor from "global (Intel® HD Graphics ...)" to the NVIDIA chip.

The console output then changed:

Searching for adapters...
Adapter: 0 (NVIDIA GeForce GT 540M)
Adapter: 1 (NVIDIA GeForce GT 540M)


With adapter 0 it now chooses my NVIDIA chip and everything initializes correctly, but even here, entering 1 forces the application to crash with the same error as above. This is strange to me because the application can clearly see this adapter 1. Is there any explanation for this behavior?

Regards
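For reference, a likely explanation is that on Optimus laptops the built-in display is physically wired to the Intel chip, so the GeForce adapter reports no outputs of its own and `IDXGIAdapter::EnumOutputs(0, ...)` returns `DXGI_ERROR_NOT_FOUND`. Below is a minimal enumeration sketch (Windows-only, link against `dxgi.lib`; an assumption about the cause, not a verified fix) that tolerates adapters without any attached output instead of treating that case as fatal:

```cpp
// Untested sketch: enumerate adapters and treat DXGI_ERROR_NOT_FOUND from
// EnumOutputs as "no monitor attached to this adapter" rather than a fatal
// error, then keep looking for an adapter that does expose an output.
#include <dxgi.h>
#include <cstdio>

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory),
                                 reinterpret_cast<void**>(&factory))))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0;
         factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter: %u (%s)\n", i, desc.Description);

        IDXGIOutput* output = nullptr;
        if (adapter->EnumOutputs(0, &output) == DXGI_ERROR_NOT_FOUND)
        {
            // No display is wired to this adapter (typical for the discrete
            // GPU on an Optimus laptop), so don't build the display-mode
            // list or a full-screen swap chain from it.
            wprintf(L"  -> no outputs attached, skipping\n");
        }
        else if (output)
        {
            output->Release();
        }
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

This would also explain the workaround: once the NVIDIA Control Panel routes the application through the GeForce chip, the driver re-maps the outputs, but an adapter index can still come back without an enumerable output, so checking the `EnumOutputs` result instead of assuming it succeeds keeps initialization from crashing.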
