D3D9 doesn't recognize a GPU without a monitor
Hello everyone,
I have a two-GPU system running Windows Vista. The second GPU is used for computing (via D3D9) and shouldn't have a monitor plugged in. The problem is that as long as no monitor is attached, D3D doesn't recognize the GPU (GetAdapterCount() returns 1).
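For reference, this is roughly how the adapter count in question is queried (a minimal Windows-only sketch; build against d3d9.lib). On the affected Vista systems, the loop below would only ever see the adapter with a monitor attached:

```cpp
// Minimal D3D9 adapter enumeration sketch. On Vista, an adapter with no
// monitor attached typically does not appear in this list at all.
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    UINT count = d3d->GetAdapterCount();
    printf("Adapter count: %u\n", count);

    for (UINT i = 0; i < count; ++i)
    {
        D3DADAPTER_IDENTIFIER9 id;
        if (SUCCEEDED(d3d->GetAdapterIdentifier(i, 0, &id)))
            printf("  Adapter %u: %s\n", i, id.Description);
    }

    d3d->Release();
    return 0;
}
```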
Why is this and how can the problem be solved?
Thanks.
This happens with every card I've tried: 8800GT, 8800GTX, 9800GT...
I'm pretty sure this wasn't a problem when I worked on XP, only Vista.
nVidia actually acknowledged this problem but they claim it's an OS issue. I'm guessing I'm not the first one to have this problem, so someone must have a solution...
If waiting for a fix is an issue (and it usually is) then I'd suggest you look at Gefen's DVI Detective.
It's basically a dongle connected to a DVI port which makes it seem as if a monitor is attached. You capture the display's EDID information by briefly attaching the monitor to be emulated; after that, no power is required.
I use them to prevent projector disconnections from making XP disable the screen.
I believe this might be by design. I know in Vista they added a feature called "jack detection" to the audio stack which would disable your soundcard (kind of) if no speakers were plugged in. I got bit by it because I don't usually have speakers plugged into my computer at work, but sometimes I'm asked to check whether a piece of streaming media is working. I don't care if there's any actual sound, just whether Windows Media Player gives an error or not -- but with no speakers plugged in, I always got an error.
I was told the number of people who want an error when they try to play a sound with no speakers plugged in far outweighs the number of people who want it the other way. In the end, I just bought the cheapest pair of headphones I could find and left them plugged in.
I suspect a similar thing may be happening here. With no monitor plugged in, Windows is disabling that screen so that you don't get an area of your desktop that you can't actually see...
Quote:Original post by Racky1275
If waiting for a fix is an issue (and it usually is) then I'd suggest you look at Gefen's DVI Detective.
Thanks for the suggestion, I'll give it a try.
Quote:Original post by Codeka
I suspect a similar thing may be happening here. With no monitor plugged in, Windows is disabling that screen so that you don't get an area of your desktop that you can't actually see...
But it's not an area of my desktop and I don't want it to be.
GPGPU has been around for some time now. You would expect a new OS like Vista to account for having a GPU just for computing and not for display.
Quote:Original post by eigers
But it's not an area of my desktop and I don't want it to be.
GPGPU has been around for some time now. You would expect a new OS like Vista to account for having a GPU just for computing and not for display.
Yeah, I understand that, I was just trying to provide a possible explanation - not saying I agree with it, obviously :-)