
This topic is now archived and is closed to further replies.

Basiror

GeForce FX: some questions


I saw a listing yesterday that mentioned the NV30, NV34, and an NV40 (with 150 million transistors), and on a forum I read that there will be a card with 1 GB of DDR2 RAM. Further research didn't clarify anything; there don't seem to be any definite tech specifications for NV40 cards. Does anyone have a list of all the GeForce FX models that will be released? I'm thinking about getting the top model of the GFFX series, but there's not much information I could find about it. And what about the release date, February or when?

Guest Anonymous Poster
http://www.nvidia.com/docs/lo/2430/SUPP/PO_GeForceFX_1102.pdf

simply said: don't get it.

it eats more power than any other gpu
it produces more noise than any other gpu
it produces more heat than any other gpu
it weighs more than any other gpu
it needs more space than any other gpu
and it is not really better than any other gpu.

that card is overclocked right from the start. wait for the r350 releases, which are higher-clocked r300s. they will beat the gfFX.

that card is half a year later than the ati card, yet it can barely win against it. nvidia cheated on pixel quality to gain performance, see anandtech.com. at equal image quality it loses about every test against the radeon. and for a higher price, too...

if you want noise that is audible in other rooms, get that card.
if you want to heat up your processor with a card that reaches up to 140°C on its backside (right where your processor is), get that card.
if you want to play games and use apps, get the radeon instead.
cheaper, quieter (a passively cooled version even exists!), cooler, simply smarter.

go get the videos on anandtech and play them through your speakers. they will scare you away from the gfFX :D that sound is terrible

"take a look around" - limp bizkit
www.google.com

1. the GeForce FX isn't louder than my current CPU cooler, so I don't care about the noise

2. the GeForce FX beats the Radeon in nearly all tests, and that although the Radeon has been out for several months and has optimized drivers. nvidia's drivers have never been optimized at a card's release, so there's still a lot of potential

3. the GFFX 5800 isn't nvidia's flagship; there are still the NV34 and the NV40 (150 million transistors)

4. look at the pixel shader benchmarks; it's amazing how much it beats the Radeon, and that with just the 5800

I don't even want to know what it will look like when they release the NV40

5. ATI's cards still have problems with some games. my friends bought Radeons a few weeks ago and they didn't run old games at all. amazing support, really, as if I'd buy a card that only runs the latest engines

I'll go with the NV40 card when they release it; money doesn't matter

Let's not turn this into a flame war.
It really doesn't matter which one you buy, since in six months there will be another, better card. Also, don't expect the features of either card to be widely used until a year later, at which point there will be a better card... and so on, ad infinitum.

____________________________________________________________
Try RealityRift at www.planetrift.com

quote:
Original post by Basiror
5. ATI's cards still have problems with some games. my friends bought Radeons a few weeks ago and they didn't run old games at all. amazing support, really, as if I'd buy a card that only runs the latest engines



My Radeon 9500 barely runs some newer games at times. The drivers are WELL below par, the texture corruption is ugly, and the stuttering is bad beyond belief.

If you buy nVidia, you're getting something you know will work. If you buy ATI, you're taking a BIG chance. Everyone I know that owns a Radeon, including myself, has a LOT of problems with them: visual artifacts, polygons turning translucent in Half-Life, complete system lockups, texture corruption, stuttering that can make games totally unplayable... Radeons just aren't fun

-----------------------
"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else''s drivers, I assume it is their fault" - John Carmack

hehe, to all the nvidia supporters (not that I think the card is bad.. it's just nothing special, or worthwhile): please read anandtech and the like carefully, read the forums, etc. then you would know that the card only performs really better in
a) synthetic benches
b) cheated quality modes

and hey, nvidia themselves talk about "2x the speed" every half year. this is half a year after the radeon 9700 pro, and in most game benches we see.. at most a 10% speed boost, and in (imho far too many) tests it is even slower.

drivers or not, that card should run much better, considering all the overclocked fancy stuff they built around it.
remember, it's .13, not .15 => it should run much, much cooler. still, they need a freaking hairdryer to cool it down, and even then it's much too hot for a normal case to run for long.

the ati chip can be passively cooled.

just think of the ati chip with the same .13 technology, with the same hairdryer, overclocked the same way. then you would get speed.

and the radeon lineup refresh will be there in march, providing about 40%+ performance enhancement.

simply do the math, dudes, and you have to realize this card is not at all the holy grail.

choosing between this card and a roughly 10% slower card which costs much less, makes no noise, does not heat up my case, _and_ uses much less power is not really a question worth thinking about.. really not.

but instead of flame-warring in here, please just read. it has all been discussed enough already.
the card is hot air, over 60°C of it at the backside. end of story.

"take a look around" - limp bizkit
www.google.com

quote:
Original post by Maximus
If you buy nVidia, you're getting something you know will work. If you buy ATI, you're taking a BIG chance. Everyone I know that owns a Radeon, including myself, has a LOT of problems with them: visual artifacts, polygons turning translucent in Half-Life, complete system lockups, texture corruption, stuttering that can make games totally unplayable... Radeons just aren't fun


I'd disagree strongly with that.
I moved from a GeForce 1 to a Radeon 9000 Pro, and overall I've been very impressed by the card.

Put it this way:
it took me 3 months to get my GeForce stable under Windows XP when it came out. Nvidia didn't do squat for me, or for the 1000s of people who suffered similar problems.

It honestly took insane amounts of effort to get the card stable.
I remember times when I'd need to download a good 3-5 driver packs to get something that actually worked.
So don't say that nvidias are always stable as rocks.


"From visual artifacts"
thats pretty general

"polygons turning translucent in Halflife"
never even heard of this, or seen it. On mines or other people radeons.

"complete system lockups"
I've had a couple on this chip but no where near the number on the geforce.

"texture corruption"
it's true that I've noticed issues radeons have with corrupting DXTC textures when the memory is clocked too high or isn't being cooled adequatly, but once again is nothing compared to the corrupting I've seen occur on the geforce..

"stuttering which can make games totally unplayable"
I've seen this more on geforce 2mx's myself. It does seem to effect radeons running windowed direct3d apps too. I've also read that when the performance counter is queried and there is a lot of data being transfered over the pci system bus, a significant stall can occur, wildly effecting the accuracy of the counter (or somethign like that) making it appear that the game stalled badly... (seen this type of symptm more on athlons myself).
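
for what it's worth, the usual defensive fix is to sanity-check the performance counter against a second, coarser timer and distrust it for any frame where the two disagree wildly. a minimal Win32 sketch of that idea (assuming you link winmm.lib for timeGetTime; the 0.2 s tolerance is just a guess):

    #include <windows.h>

    /* returns seconds since the previous call; trusts
       QueryPerformanceCounter, but falls back to timeGetTime()
       for any frame where QPC jumps implausibly (the PCI-stall
       symptom described above) */
    double frame_delta_seconds(void)
    {
        static LARGE_INTEGER freq, last;
        static DWORD last_ms;
        static int init = 0;
        LARGE_INTEGER now;
        DWORD now_ms = timeGetTime();

        if (!init) {
            QueryPerformanceFrequency(&freq);
            QueryPerformanceCounter(&last);
            last_ms = now_ms;
            init = 1;
            return 0.0;
        }

        QueryPerformanceCounter(&now);
        {
            double qpc_dt = (double)(now.QuadPart - last.QuadPart)
                          / (double)freq.QuadPart;
            double tgt_dt = (now_ms - last_ms) / 1000.0;
            last = now;
            last_ms = now_ms;
            /* the two clocks disagree badly => distrust QPC this frame */
            if (qpc_dt < 0.0 || qpc_dt > tgt_dt + 0.2)
                return tgt_dt;
            return qpc_dt;
        }
    }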

and don't get me started on nvidia's attitude to RAMDACs...




I currently have a GeForce4, and I will NOT get the FX.

I am so far happy with the GF4, with the exception of the 120 degrees on the backside of it.

Is it worth checking out the ATI cards? Or should I wait a bit for the next step in graphics cards?

y'know, the question just dawned on me:

why the hell aren't motherboard manufacturers putting the AGP port farther away from the CPU? don't they know these cards get super hot?
heh.. just thought i'd ask

-eldee
;another space monkey;
[ Forced Evolution Studios ]


::evolve::

Do NOT let Dr. Mario touch your genitals. He is not a real doctor!

i currently have the ati radeon 9700 pro, currently the fastest, best, and most advanced gpu available (and it has been for half a year :D)..

and yes, it really is an impressive card..

about the texture compression corruption: ever played q3 on a geforce-level card? looked at the sky? now _THAT_ is crappy compression :D (a known hardware bug, they could never work around it.. that decompression simply looks ugly :D)
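
(if you wonder where that banding comes from: DXT1 stores two RGB565 endpoints per 4x4 block and interpolates the other two palette colors between them. how much precision the chip keeps in that interpolation decides whether a smooth sky gradient stays smooth or falls apart into bands. a rough sketch of the decode math, just for illustration:)

    /* DXT1 block decode, endpoint part only: expand the two stored
       RGB565 endpoints and derive the two interpolated colors.
       doing the lerp at full 8-bit precision keeps gradients smooth;
       doing it at 16-bit precision quantizes them into visible bands. */
    typedef struct { unsigned char r, g, b; } rgb8;

    static rgb8 expand565(unsigned short c)
    {
        rgb8 o;
        o.r = (unsigned char)(((c >> 11) & 0x1F) << 3); /* 5 -> 8 bits */
        o.g = (unsigned char)(((c >>  5) & 0x3F) << 2); /* 6 -> 8 bits */
        o.b = (unsigned char)(( c        & 0x1F) << 3); /* 5 -> 8 bits */
        return o;
    }

    /* color at w/3 of the way from endpoint a to endpoint b, w = 1 or 2 */
    static rgb8 lerp3(rgb8 a, rgb8 b, int w)
    {
        rgb8 o;
        o.r = (unsigned char)(((3 - w) * a.r + w * b.r) / 3);
        o.g = (unsigned char)(((3 - w) * a.g + w * b.g) / 3);
        o.b = (unsigned char)(((3 - w) * a.b + w * b.b) / 3);
        return o;
    }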


there will always be people with stable nvidia cards, stable ati cards, unstable nvidia cards, unstable ati cards.

but lately, as far as I've followed it, the nvidia drivers have had more and more funny bugs, and nvidia drifts further and further away from opengl and dx, always designing their own way to go. that alone gives me enough reason to support ati. if I code something that has to work on all future dx9-level gpus, I can't code for the gfFX, but I can for the radeon, without problems. so I'm quite happy with it, yes..

and no, no problems. the first drivers had some fancy corruptions, yes, but that's over now, at least on my pc.

"take a look around" - limp bizkit
www.google.com

the fact that most people with Radeons I know are really disappointed with ATI can be generalized to the entire game community. a manufacturer whose cards are so unstable doesn't earn much trust, and that's the reason why they will buy nvidia, if they buy anything at all

and when you have a look at the pixel shader benchmarks you see where the GFFX's strength is. and just to note: the 5800 isn't the flagship, there are still the NV34 and NV40 coming

so if I want to code an engine that runs on as many computers as possible, I would certainly not stick with ATI. there are too few people with ATI cards for it to be worth spending any money on their cards

and concerning the drivers: maybe nvidia's drivers have funny bugs, but none of those bugs make a game unplayable, in contrast to ATI, where old games which use OpenGL 1.0 don't run without artifacts. it's a shame to release such a card. not to mention that all my friends gave their cards back and took the money or a GF4 Ti

no reason to stick with ATI

the exciting thing is not the clock speed, but rather the impressive vertex & pixel shader features, which outclass ati's offering.

also, I heard it the other way around regarding sky bugs:
http://212.100.234.54/content/1/16024.html

Guest Anonymous Poster
quote:
Original post by sjelkjd
the exciting thing is not the clock speed, but rather the impressive vertex & pixel shader features, which outclass ati's offering.

also, I heard it the other way around regarding sky bugs:
http://212.100.234.54/content/1/16024.html


The problem with these impressive pixel and shader features is that the current GeForce FX doesn't have the memory bandwidth to do an effective job of performing all these operations.

Current benchmarking utilities won't show this problem, as they were almost all written for use with DX8. DX8 shaders are significantly smaller than what the DX9 spec requires. Nvidia is hyping the fact that the GF FX can perform more shader code than the DX9 spec requires; however, the 128-bit memory bus is the limiting factor. It is theoretically possible for a large shader (DX9-size or GFFX-size) to cripple the card's rendering capabilities. ATI is using a 256-bit memory bus; if Nvidia had used that bus width instead of 128 bits, there is no doubt the card would've been significantly faster.
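
To put rough numbers on the bus-width point: peak memory bandwidth is just bus width times effective memory clock. A quick back-of-the-envelope calculation (the clock figures are the ones commonly quoted for each board, so treat them as assumptions):

    #include <stdio.h>

    /* bus width in bits / 8 = bytes per transfer; times the effective
       (double-data-rate) clock in MHz gives MB/s; / 1000 gives GB/s */
    static double peak_bw_gbs(int bus_bits, double eff_mhz)
    {
        return (bus_bits / 8.0) * eff_mhz / 1000.0;
    }

    int main(void)
    {
        printf("GeForce FX 5800 Ultra (128-bit, 1000 MHz eff.): %.1f GB/s\n",
               peak_bw_gbs(128, 1000.0));  /* -> 16.0 GB/s */
        printf("Radeon 9700 Pro       (256-bit,  620 MHz eff.): %.1f GB/s\n",
               peak_bw_gbs(256, 620.0));   /* -> 19.8 GB/s */
        return 0;
    }

So even with DDR2, the narrow bus leaves the FX with less raw bandwidth than the half-year-old Radeon.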

The bottom line is that it looks like this product was modified from its original design after Nvidia saw the huge leap in performance over their GeForce 4 video cards. They got caught with their pants down and are trying to make up for it.

There is honestly more headroom available in the ATI line of cards than there is in the GeForce FX. The 9700 Pro is still using regular DDR memory and a .15 micron die size. And the 9900 Pro plans to use the same .15 micron die size, perform 10% faster than the GF FX, and run at 400 MHz with no need for a gigantic fan. There are a lot of hardware improvements this line of cards can still go through: a die shrink to .13 micron will allow clock speeds of over 500 MHz, and the addition of GDDR3 will provide faster memory speeds. Add the fact that there is still room for performance increases in the drivers, and there is no doubt that ATI's got this round in the bag.

Nvidia, on the other hand, has already gone to the .13 micron process and is already using GDDR2. The only real things left for them to improve are the memory bus, the clock speeds, and driver performance. They really have nowhere to go with this line of chips. The NV34, as was previously mentioned, will be a slower card than the current one, and the NV40 is still some time away. Nvidia really has to look into a cooling solution less cumbersome than the current "dust buster" design before they can continue to clock their cards at higher speeds.

If you're one of those people looking for the ultimate overclocking card, I'd suggest you pass on the GFFX and wait for the Hercules Radeon 9900 Pro, which is going to ship with a water-cooling setup. It's hyped to be the ultimate "overclocking" card.

IT'S VIDEO CARD WAR TIME!!!

Of course I value GAMEPLAY over graphics. Have you seen the screens for Deus Ex 2? Everything is made of PLASTIC!!! So much for advanced graphics. Hopefully the gameplay is >= the first one's.

People keep trying to make things look more realistic, but it ends up looking even faker; they should work on gameplay instead.

I think it's great that they are working on such a powerful card, but is there demand for it? My Radeon 8500 works just fine.

BTW, I like the cooling system. Any serious heatsink takes up the PCI slot anyway, so why not use it for a blower?

I'd just like to remind everyone that the GeForce FX isn't actually out yet! It's only a prototype, which is bound to have problems. Why don't you wait until the real thing is out before you judge it?

I never really understood why people want to achieve frame rates greater than their monitor's refresh rate.
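
(Capping yourself to the refresh rate is easy enough, for what it's worth. Under OpenGL on Windows, the WGL_EXT_swap_control extension does it, assuming the driver actually exposes it. A minimal sketch, to be called once a GL context is current, linking opengl32.lib:)

    #include <windows.h>

    /* WGL_EXT_swap_control: a swap interval of 1 makes SwapBuffers
       wait for one vertical retrace, capping FPS at the refresh rate */
    typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

    void enable_vsync(void)
    {
        PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
            (PFNWGLSWAPINTERVALEXTPROC)
                wglGetProcAddress("wglSwapIntervalEXT");
        if (wglSwapIntervalEXT)     /* extension present? */
            wglSwapIntervalEXT(1);  /* sync to the retrace */
    }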

As for quality: until someone makes a realistic lighting engine (DOOM 3 is a step in the right direction), anti-aliasing and other 'quality enhancing' features are really just gimmicks. All they do is reduce jagged lines; sure, they soften the image and make it a little prettier, but ultimately I'm not going to notice. I paid 50 dollars to play a game; the sightseeing is over and done with after the first hour or so.

I have a GeForce Ti4400 which I bought when they first came out, and to this day I have exactly one game that takes advantage of the card's features.

9700? 5800 Ultra? Please.....

If you already have a powerful GPU, save your money. It's going to be a while before the mainstream of games uses all the advanced features. When that happens, buy a 5800 Ultra; by then they will cost about 100 dollars.

quote:
Original post by Basiror
the fact that most people with Radeons I know are really disappointed with ATI can be generalized to the entire game community. a manufacturer whose cards are so unstable doesn't earn much trust, and that's the reason why they will buy nvidia, if they buy anything at all

and when you have a look at the pixel shader benchmarks you see where the GFFX's strength is. and just to note: the 5800 isn't the flagship, there are still the NV34 and NV40 coming

so if I want to code an engine that runs on as many computers as possible, I would certainly not stick with ATI. there are too few people with ATI cards for it to be worth spending any money on their cards

and concerning the drivers: maybe nvidia's drivers have funny bugs, but none of those bugs make a game unplayable, in contrast to ATI, where old games which use OpenGL 1.0 don't run without artifacts. it's a shame to release such a card. not to mention that all my friends gave their cards back and took the money or a GF4 Ti

no reason to stick with ATI


What are you saying? Are you perhaps paid by nVidia to create hype? I have only one game that does not work on my Radeon (Heavy Gear 2), and that is way old. The rest of my games work, and I have seen no trouble whatsoever. The FX offers a measly increase in performance and hardly justifies an upgrade; give me a doubling or forget it. The increase can only be measured in benchmarks and is hardly noticeable visually. The extra functionality won't be widely used for a year (at least), and the extra functions that go beyond DX9 are useless, as they cannot be used by a developer who wants to make sure the software works on the majority of machines out there, regardless of card preference. I for one would hate to go back to the 3DFX-only days.

As for OpenGL conformance: yes, nVidia has better GL support, and that is hardly surprising given nVidia's SGI roots (can anyone say "The creators of OpenGL! Hello, Mark J. Kilgard!"?), but on the other hand ATI has the best conformance to DX. Now, which of the two do you think is more important: id's engine games (OpenGL) or the rest of the world (DirectX)? nVidia's "DX9+" just proves further that they can't stick to a spec.

I know I am generalizing, but it really bugs me when people can't accept that there is an alternative to what they want. FYI, I have both nVidia and ATI cards, so don't throw any "you are just an ATI fan" BS at me.
Open your mind instead and see the possibilities in both brands.



I have a GeForce 4 Ti4600 and I'm VERY VERY pleased with it, so I'm not going to get a GeForce FX (yet). I think the nVidia cards are better for games than the ATIs because they are faster.

Guest Anonymous Poster
quote:
Original post by Fuzztrek
I'd just like to remind everyone that the GeForce FX isn't actually out yet! It's only a prototype, which is bound to have problems. Why don't you wait until the real thing is out before you judge it?


It's not exactly a prototype. Nvidia is claiming to be able to launch this thing in the next couple of months, which means that by now the chip has been finalized, the parts suppliers have been secured, and the cards must already be in production for them to have enough stock for the initial release. The only things that happen at this point in the hardware development process are driver tweaks and quality assurance. There aren't going to be any major changes between this "prototype" and the final card other than the drivers, and possibly a higher clock. But a higher clock doesn't seem plausible, as they've made it very apparent that they are having trouble cooling the chip. Even with that gigantic cooler on the card, it's running much hotter than expected.

quote:
Original post by MichaelT


The extra functionality won't be widely used for a year (at least), and the extra functions that go beyond DX9 are useless, as they cannot be used by a developer who wants to make sure the software works on the majority of machines out there, regardless of card preference.
.....
Now, which of the two do you think is more important: id's engine games (OpenGL) or the rest of the world (DirectX)?




1. I use the card's features in my engines and don't care whether it works on other people's machines. once the engine is done, those features will be standard on the latest hardware, and there's still the possibility to set minimum requirements

2. I only play id engine games; I am an old Quaker, and my engines will be Quake-like as well. as long as I don't find someone to write the D3D part of my engine, I won't implement D3D, because I am not willing to learn this Microsoft ugliness

actually, you should be happy that a company like nvidia invents new features all the time. where would we be nowadays if there weren't forerunners?

when buying an nvidia card I know I bought quality, and I know it runs without problems. that's something ATI can't claim; it's a shame that the Radeon doesn't run with so many older games, and that certainly doesn't make customers trust ATI. in my eyes, ATI is a cheap card for the masses, and it's always been that.

Wow, what a thread. I think this is the only thread where I've seen Nvidia vs. ATI, OpenGL vs. DirectX, and Gameplay vs. Graphics all come up.

Btw, limiting yourself to only id engine games (even if you only play first-person shooters) is keeping you away from a lot of good Lithtech/Unreal/other-engine games. And D3D in DX8 and DX9 isn't ugly at all; it's about as nice to use as OpenGL (nicer if you're using anything that requires an extension.)

quote:

The problem with these impressive pixel and shader features is that the current GeForce FX doesn't have the memory bandwidth to do an effective job of performing all these operations.

Current benchmarking utilities won't show this problem, as they were almost all written for use with DX8. DX8 shaders are significantly smaller than what the DX9 spec requires.
...
It is theoretically possible for a large shader (DX9-size or GFFX-size) to cripple the card's rendering capabilities.


What, are you planning on reloading the shaders each frame? What does the size of the shader have to do with memory bandwidth? If anything, a large shader is better for low bandwidth, since the GPU spends less time waiting for vertices to transfer.

quote:

but on the other hand ATI has the best conformance to DX.


That's pretty idiotic. The nvidia card supports a superset of DX9; any DX9 program will work with the GFFX.

lol, I will be sticking with my Ti4200. I wouldn't like to try putting an FX inside my little Shuttle; I don't think the PSU (the size of a floppy drive) would be able to cope. It's currently running a P4 2.4B, 512 MB DDR333 CAS2, two 120 GB 7200 rpm hard drives, a DVD/CD-R, and the Ti4200, using the onboard LAN/audio.
