
ATI, NVidia

Ysaneya


I've been pretty disappointed by ATI in the past year. It looks like the golden age of ATI is over. Since the Radeon 9700, ATI improved their drivers a lot, and the developer support was excellent and responsive. But in the last year I've submitted driver bugs twice and been ignored; a simple email stating "we're looking into this" would have been enough. But no, nothing. And I'm a registered developer.

I wonder how ATI and NVidia test their drivers. Recently I saw a post on opengl.org where somebody was reporting a driver bug in one of the latest NVidia drivers... in a demo that was posted on their own developer site! You'd think that before releasing a new driver, they'd test it against all their available samples, demos and games, but that does not seem to be the case.

The last problem I had on my ATI card turned out to be a driver bug... introduced in the Catalyst 5.4 driver. Hello, McFly? Aren't you supposed to fix bugs instead of introducing new ones?

Nowadays, the competition between NVidia and ATI is raging. They are rushing their drivers to be more up-to-date than the competitor. Seriously, how often are new drivers released? Every month or so.

Yesterday I was reading the technical description of the upcoming X1800 in the Radeon SDK. I'm not very impressed: ATI is merely catching up. In all the benchmarks I've seen, it doesn't look much faster (and is often even slower) than the current GeForce 7800. And the 7800 still has more features.

So, we'll be missing floating-point texture filtering. According to ATI, this is no big deal because it can be implemented in a pixel shader. In addition, vertex texturing is missing from their VS 3.0 support, because "anyway it's too slow, nobody's gonna use it". Hello again? Who's the developer, me or you? So please don't tell me what I need or don't need. I can imagine quite a few uses for vertex texturing, even if it's slower than standard texturing. If your hardware does not support it, that's one thing, but don't try to convince me that it doesn't matter or isn't important, especially with the propaganda and marketing bullshit that has contaminated even your technical documentation.

Anyway, I'm currently working on the integrated terrain engine (no, I'm not sleeping), but I have no new screenshot to post.


4 Comments



Quote:
I've been pretty disappointed by ATI in the past year.

Yeah, I get a similar feeling... but I've also noticed that, for whatever reason (e.g. XB360 co-development), the latest R5xx generation seems to have had an unstable and somewhat troubled beginning to life [lol]

Maybe this'll be a bit of a stumble and they'll come back on form again with their next major part (D3D10 maybe?).

Quote:
They are rushing their drivers to be more up-to-date than the competitor.

This might well help the end-user make their games look pretty and maybe go faster, but it does seem to be a negative for the development community.

Even if it just means that the number of potential test configurations sky rockets [oh]

Not sure if you'll have seen it, but we had a good discussion on the whole vertex texturing thing in the DX forum here.

Still no vertex texturing? That sucks. I was majorly pissed off at nVidia when I was experimenting with GLSL for the first time (on an FX5xxx card). Their drivers will happily report 2 vertex texture units available, but any attempt to bind or use them fails - they are in fact totally absent. The daft thing is that 0 is a valid min spec for GLSL's vertex textures - why they didn't just return 0 instead I don't know.

ATI's Linux driver support is still absolutely fucking abysmal, and I refuse to support them until they fix it. A friend of mine, head of IT for a large CG firm, refuses for the same reason (they migrated from Maya on NT to Maya on Linux and bought new NVIDIA hardware).

