# GF 4 MX to GF 6200

This topic is 4352 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

## Recommended Posts

To start off, I'm not sure where to post this - so forgive me. I currently have an nvidia GeForce 4 MX (440), and I am thinking of upgrading to a GeForce 6200. Will I see a huge difference, e.g. playing NFS:MW? Does the 6200 support vertex programs etc.? And what about the 6600? My PC only has AGP, not PCI Express. The rest of my system: 2.8 GHz CPU and 1 GB RAM running Win XP SP1. Any personal experiences would be great.

I was actually in just the same situation (except I had a worse card). I decided that instead of paying out a lot right now, since I only have an AGP slot, I would get a cheaper (but still good) card now, and later, when I upgrade my computer to one with PCI Express, I will get a high-powered one. The card I just bought is the GeForce 5500. You can get it on TigerDirect.com for $45 after the rebate, which is really nice. The card has 256 MB of RAM and supports Vertex and Pixel Shaders version 2.0. There are also many great reviews for it. I would definitely recommend this card, unless you want to dish out the big bucks right now.

##### Share on other sites
I would STRONGLY suggest you upgrade to a motherboard with a PCI-E slot. MAJOR speed differences. I just got a PCI-E 6200, and it's AMAZING compared to your MX 4. Supports OpenGL 2.0 vertex and fragment programs completely! Very good. Stay with something cheap; DX10 is around the corner, and it'll require a completely new kind of card apparently.

##### Share on other sites
You will probably see a huge difference, yes. The 6200 and 6600 are both great for the price, at least on PCIe -- I haven't checked AGP prices in a while.

##### Share on other sites
Yes, the 6200 is significantly better than the 4 MX (as long as it's the real 6200, not the "TC" shared-memory version). The 6200 supports GLSL and HLSL, including shader models 2.0 and 3.0. It won't run 3.0 at full speed, but it supports it. The 6600 (especially the GT version) is a great mainstream card, with good performance for all current games.
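As an aside, the shader-support claims above can be sanity-checked from code. The Python sketch below shows the idea; the extension strings are made-up examples standing in for what `glGetString(GL_EXTENSIONS)` would return on a live context (e.g. via PyOpenGL), not actual driver output.

```python
# Minimal sketch: deciding whether a card exposes programmable shading
# from its OpenGL extension string. The strings below are illustrative;
# a real check would query glGetString(GL_EXTENSIONS) on a live context.

def supports_programmable_shading(extensions: str) -> bool:
    """True if vertex programs, fragment programs and GLSL are all exposed."""
    required = {
        "GL_ARB_vertex_program",
        "GL_ARB_fragment_program",
        "GL_ARB_shading_language_100",  # the GLSL extension
    }
    return required.issubset(extensions.split())

# Made-up extension strings for illustration:
gf6200_like = ("GL_ARB_vertex_program GL_ARB_fragment_program "
               "GL_ARB_shading_language_100 GL_ARB_vertex_buffer_object")
gf4mx_like = "GL_ARB_vertex_program GL_ARB_vertex_buffer_object"

print(supports_programmable_shading(gf6200_like))  # True
print(supports_programmable_shading(gf4mx_like))   # False - no fragment programs
```

The key difference for a 4 MX-class card is the missing fragment (pixel) program support, which is exactly what keeps it from running shader-heavy games.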
##### Share on other sites
I just upgraded from a GeForce 2 MX to a 6200. (It's actually a new computer, so I also got 512 MB more RAM and a 0.5 GHz faster processor.) The difference was *amazing.* I went from running oldish games at the lowest settings and getting 10 fps to running Half-Life 2 at max, except for AA and anisotropic filtering, at 30+ fps. I *love* it. It supports all current games with pixel shaders, seeing as it has 3.0 support. BTW, mine was the 256 MB flavor, made by EVGA, if you really wanted to know. A great upgrade.
Quote:
> ...and it'll require a completely new kind of card apparently.
Someone will get *very* angry if it does...

##### Share on other sites
Quote:
> Original post by Ezbez
> Someone will get *very* angry if it does...
That's too bad about the anger, 'cause this has been known for at least a year.

##### Share on other sites
I got an AGP 6200A and it's pretty good. It has the latest VS & PS versions - as seen in the $300 cards - but just not as quick.

With the PCIe version you have to watch out for the shared-memory (TurboCache) variants, which are a bit tricky to spot, but they are a LOT slower.

##### Share on other sites
If you were running a GeForce 4 MX, it is entirely possible (actually, somewhat likely) that your motherboard does not support AGP 3.0, and thus couldn't use a 6200. So you might need to replace your mobo anyway.

##### Share on other sites
IIRC the GF 4 MX was actually a modified GF2, so I think you will see a huge improvement. But I just got a 6600 (AGP, not GT), and even if its performance is more than enough for my purposes (I plan to buy a new PC in a year anyway), I would not suggest anything less than it. I don't know how much better the 6600 is than the 6200, but the 6600 is quite cheap in the USA; I don't know about Australia...

##### Share on other sites
Note that, because DX10 requires a whole new card (it really does), it will take a long time before games require DX10. DX10 can be viewed as a whole new platform (much like, say, "Xbox" or "MacOS X") that developers can choose to support or not support. Thus, we'll likely see games with both DX9 and DX10 support LONG before we'll see games that require 10 -- that's probably three years out.

That being said, Windows Vista, and its pretty GUI, will be built on top of DX9, so a DX9 card will certainly be sufficient to "stay with it" for a while yet.

##### Share on other sites
For budget graphics power both the 6200 and 6600 cards are great. ATI has been holding its own on the high end, but the best budget cards have always been the domain of nVidia. You'll see a huge improvement with either card, and both support the latest shader model. Be careful to compare actual features when making the final buying decision; not all 6200s or 6600s are created equal:

- **Core speed** - faster is better.

- **RAM type/speed** - GDDR3 > GDDR2 > GDDR > DDR, and again faster is better.

- **Memory bus width** - no less than 128 bits; 256 is better, but you'll only find that on the more expensive 6600s. Overall bandwidth is the product of bus width and memory speed.

- **Overclocking** - if you're interested in overclocking the card, look for hardware reviews of the specific card you're considering. Reviewers usually have at least a page about how well the card overclocks, although doing so will void your warranty, so you do so at your own risk and your mileage may vary. Also, some 6600s, particularly GTs, can "unlock" disabled vertex and pixel pipelines provided they have no flaws, essentially making them slightly slower 6800s. Unlocking pipes is among the safest "overclocks" because you don't need to mess with speeds or voltages, so excess heat is less of an issue - not to mention extra pipes will give you a bigger boost than another 25 MHz will.
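To make the "bandwidth is the product of bus width and memory speed" point concrete, here is a small sketch. The clock figures are illustrative round numbers, not exact board specs:

```python
def memory_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * (effective_clock_mhz * 1e6) / 1e9

# Illustrative numbers, not actual board specs:
print(memory_bandwidth_gbs(128, 500))  # 8.0 GB/s - a typical 128-bit budget card
print(memory_bandwidth_gbs(64, 500))   # 4.0 GB/s - a cut-down 64-bit card at the same clock
```

This is why a 64-bit variant can be half as fast in memory-bound scenes even when the clock speeds on the box look identical.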

##### Share on other sites
Quote:
> Original post by hplus0603
> That being said, Windows Vista, and its pretty GUI, will be built on top of DX9, so a DX9 card will certainly be sufficient to "stay with it" for a while yet.

Really? I'd assumed it would be built on DX10, not DX9 - although using DX9 would allow Vista to actually run on existing PCs; it's just that no DX10 app would work. I'd imagined it would use DX10 for the GUI etc., and no existing PC would even run the OS!

