faculaganymede

Two graphics cards vs. One dual-output card (PC/DVI)


Recommended Posts

Graphics gurus, I am new to PC graphics cards and need help making a purchase decision. I need a high-end graphics card (or cards) for generating two channels of DVI output from a single PC. High dynamic range and frame rate are important. The card(s) will be used for 3D graphics simulation. Should I get two cards or a single dual-output card? Any comments, suggestions, or recommendations on specific cards are appreciated.

I'm pretty sure that two graphics cards would be faster. Don't take my word on it, but that's what I think.

It sounds like you're going to need two cards, or one of the Asus dual-GPU cards, which basically put two graphics cards on one board. I'd recommend two distinct cards unless your system is limited to only one slot for graphics expansion. Beyond that, you probably want at least something from the GeForce 6600, 6800, or 7800 lines, or an ATI X800, X1800, or one of their CrossFire-enabled cards.

Are you looking for an SLI solution where the two cards drive a single monitor, or are you driving two monitors? Also, what resolutions are you planning to run? What type of graphics expansion does your computer support: PCI-Express (one slot or two)? AGP?

Ezbez & Ravyne, thank you for your replies.

Quote:
Original post by Ravyne
Are you looking for an SLI solution where the two cards drive a single monitor or are you driving two monitors?

The two DVI outputs will go to two separate monitors. We'll be running two simulations simultaneously (or rather, one simulation with two different scenes; maybe I'm not using the terminology correctly here).

If I use two cards, how are the cards synchronized? What's the latency? (Sorry, I know these are very technical questions; asking just in case someone knows.)

Quote:
Original post by Ravyne
Also, what resolutions are you planning to run?

Something like 512x512, 640x480, 256x256, etc. We are not sure yet.

Quote:
Original post by Ravyne
What type of graphics expansion does your computer support? PCI-Express (1 or 2)? AGP?

We don't have the computer yet. We need to purchase both the computer and the card(s). Basically, we want the best performance that meets the requirements, at a reasonable cost.


The most important requirement for the card is the dynamic range of the DVI output.

The ATI X1800 spec states "16 bit per channel floating point HDR and 10 bit per channel DVI output." I need some help here.

Does "16 bit per channel floating point" mean 16 bit dynamic range for each of the RGB channels in the frame buffer?

Does "10 bit per channel DVI output" mean 10 bit dynamic range for each of the RGB channels in the DVI output? If yes, is 10 bit per RGB channel currently the best there is for PC graphics cards?

I couldn't seem to find this kind of info in the Nvidia card specs.


[Edited by - faculaganymede on January 5, 2006 11:53:13 AM]

A new PC should come with PCI-Express. I'd hope that means you can just install any two PCI-Express video cards and Windows will treat them as two independent display adapters, the way it did with old PCI cards. So I'd recommend finding a decent card for what you need and buying two!
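
If it helps, here's a rough sketch (nothing card-specific, just plain Win32) of how you could check what Windows actually sees once both cards are in; each card, and each output on a dual-head card, normally shows up as its own display device:

// Rough sketch: list every display adapter/output Windows exposes.
// Plain Win32; error handling kept minimal on purpose.
#include <windows.h>
#include <cstdio>

int main()
{
    DISPLAY_DEVICEA dd;
    dd.cb = sizeof(dd);

    for (DWORD i = 0; EnumDisplayDevicesA(NULL, i, &dd, 0); ++i)
    {
        printf("Device %lu: %s (%s)%s\n",
               i, dd.DeviceString, dd.DeviceName,
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                   ? " - attached to desktop" : "");
    }
    return 0;
}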

I'm fairly sure that one card cannot render two different scenes to two different monitors (unless it has, as Ravyne mentioned, two GPUs)... well, actually, maybe it can (with a little fudging). You'll certainly get more performance out of two cards/two GPUs, though.
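
The "fudging" I'm thinking of looks roughly like this in Direct3D 9; just a sketch, assuming two adapters show up (two cards, or a dual-head card in DualView mode) and that hwndA/hwndB are made-up handles for windows you've already created, one sitting on each monitor:

// Sketch only: one D3D9 device per adapter, each driving its own monitor.
#include <d3d9.h>

IDirect3DDevice9* CreateDeviceOnAdapter(IDirect3D9* d3d, UINT adapter, HWND hwnd)
{
    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;   // take the desktop format
    pp.hDeviceWindow    = hwnd;

    IDirect3DDevice9* dev = NULL;
    d3d->CreateDevice(adapter, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &dev);
    return dev;   // NULL on failure; real code would check the HRESULT
}

// Usage (hwndA/hwndB are hypothetical window handles, one per monitor):
//   IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
//   IDirect3DDevice9* devA = CreateDeviceOnAdapter(d3d, 0, hwndA); // scene 1
//   IDirect3DDevice9* devB = CreateDeviceOnAdapter(d3d, 1, hwndB); // scene 2
// Render each scene on its own device and Present() them independently.

Each device renders independently, so whether that's fast enough on a single GPU is a separate question.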

HTH and cheers!

faculaganymede,

"16 bit per channel floating point" means it supports 64-bit surfaces in the hardware for processing during the rendering of a frame - this is good and is mostly what HDR is about.

"10 bit per channel DVI output" means that the final output of the card is 10 bits. It's my understanding that even after all the fancy internal work of rendering a frame on a card, even if it is in high definition 64-bit or 128-bit colour, that it all gets downsampled to 8-bit (or 10-bit in this case of DVI) for the final framebuffer output, and happens automatically on the graphics card (is it the DAC that does this?) somebody with a bit more raw hardware knowledge might have a better answer.

As for synchronisation, there is a little connecting plate that goes between the two cards in my machine; I'm guessing this does the sync and load-distribution work.

Finally, I'll add that at work I have dual GeForce 7800 GTXs, and when SLI is enabled, multi-monitor support no longer functions for me. It is somewhat annoying.

-Mezz
