Problem acquiring requested standard resolution

Started by
2 comments, last by Mxlprx 18 years, 8 months ago
Hi all,

I have a problem acquiring the correct graphics driver mode for a given resolution. The PC is connected via DVI to a 50" widescreen TV, and the Windows desktop is set to 1024x768x16. In my application I use the GetDeviceMode DirectX function to poll the enumerated display drivers (driver #0 is reported as "Primary Display Driver", driver #1 as "RADEON X800 Series") for their supported resolutions.

Unfortunately, DirectX does not report 1024x768 at all; apart from 640x480 and 800x600, most of the reported resolutions are widescreen formats. But as said above, the Windows desktop is being displayed on the TV at 1024x768x16, so the driver is clearly capable of that resolution; DirectX just does not report it as available while the TV is connected. When a 15" monitor is attached via the VGA connector, 1024x768 is reported as available.

I am a bit lost here. Anyone have any idea what might be going wrong in my application?

Thanks, Manfred
I'm at work and cannot look it up in the SDK docs right now, but I believe whether a mode is available depends on more than just the resolution: at the very least, bit depth and refresh rate play a role. So you might be asking for a supported resolution, but with an unsupported depth or rate.

Secondly, I doubt the TV actually runs that resolution. Your card probably does some conversion, and if that conversion happens at a higher level, DirectX may miss it. Just a guess.

Illco
With GetDeviceMode as it is used in my application, DirectX reports which resolution/bit depth/refresh rate you get with driver mode 0, mode 1, 2, 3, 4, and so on; you can compare that against what you requested and find the proper mode that way.
But as stated in the first post, the 1024x768 resolution is not reported as available at all, regardless of refresh rate.

Windows Display Setup reports under "All Modes" that 1024x768 is available, so the mode does seem to be exposed to Windows.

However, the TV itself has a PC mode that reports the incoming signal as 1024x768.
Some new info:

The user has a 15" monitor, so he used it as the primary display device for a test, and of course he was able to get into the UI that way. He then Alt-Tabbed out of the application, changed the primary display device to the TV, and voila: he could now run the application on the TV at 1024x768x16. He could also do mode switches from within the app, so at that point DirectX reported 1024x768 even with the TV connected. It just doesn't in the first place.

Anyone have any idea?

This topic is closed to new replies.
