Do you recommend a SM 3.0 graphics card?

Started by
32 comments, last by lancekt 18 years, 5 months ago
Quote:Original post by rollo
SM3 also allows you to do texture sampling in the vertex shader, which can be useful for displacement mapping and other things.

*cough* Not quite...gotta love loopholes in the specs

Quote:I wouldn't spend a lot on one though, DX10 parts are just around the corner ;)

Exactly. Just be prepared to upgrade again to a brand new card if you want to use DX10 when it comes out. Personally, I was planning on upgrading to a new SM3 card when the new ATI line came out, but the lack of vertex texturing made me decide not to.
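For context, the NVIDIA-style vertex texturing path being discussed looks roughly like this in vs_3_0 HLSL. This is a hedged sketch with illustrative names, and it is subject to the hardware limits mentioned elsewhere in the thread (point sampling only, and only a couple of float formats on NV40-class cards):

```hlsl
// Hypothetical vs_3_0 displacement-mapping sketch -- names are illustrative.
// On NV40-era hardware, vertex textures are limited to point sampling and
// float formats like D3DFMT_R32F.
sampler2D heightMap : register(s0);  // bound via a vertex texture sampler

float4x4 worldViewProj;
float heightScale;

struct VS_IN  { float3 pos : POSITION; float2 uv : TEXCOORD0; float3 normal : NORMAL; };
struct VS_OUT { float4 pos : POSITION; float2 uv : TEXCOORD0; };

VS_OUT main(VS_IN vin)
{
    VS_OUT vout;
    // tex2Dlod is required here: the vertex shader has no screen-space
    // derivatives, so the mip level must be given explicitly.
    float height = tex2Dlod(heightMap, float4(vin.uv, 0, 0)).r;
    float3 displaced = vin.pos + vin.normal * height * heightScale;
    vout.pos = mul(float4(displaced, 1), worldViewProj);
    vout.uv  = vin.uv;
    return vout;
}
```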
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
Quote:Original post by circlesoft
Quote:Original post by rollo
SM3 also allows you to do texture sampling in the vertex shader, which can be useful for displacement mapping and other things.

*cough* Not quite...gotta love loopholes in the specs


Shame on you ATI!

Any idea when DX10 compatible cards will be available? I am thinking that my ATi X300 is beginning to seem a little slow...
Quote:
*cough* Not quite...gotta love loopholes in the specs


Care to elaborate?

Edit: Specifically, have a reference for lack of vertex texturing on the new ATI stuff?
Orin Tresnjak | Graphics Programmer | Bethesda Game Studios | Standard Disclaimer: My posts represent my opinions and not those of Bethesda/Zenimax, etc.
Quote:Original post by lancekt
Quote:
*cough* Not quite...gotta love loopholes in the specs


Care to elaborate?

Edit: Specifically, have a reference for lack of vertex texturing on the new ATI stuff?

Here is a thread where we talked about it a little bit. What sucks is that ATI has made it quite hard for devs to support SM3, because pretty much everyone wants to use vertex texturing. Now you not only have separate paths for SM1, SM2, and SM3, but also SM3 w/ vertex texturing (NVIDIA) and SM3 w/ stream output (ATI).
Dustin Franklin ( circlesoft :: KBase :: Mystic GD :: ApolloNL )
Quote:Original post by Moe
Any idea when DX10 compatible cards will be available? I am thinking that my ATi X300 is beginning to seem a little slow...

Not until Windows Vista appears, which is pretty unlikely to be earlier than October next year. I'd guess the earliest we'll see any DX10 hardware is in time for Christmas 2006.

Game Programming Blog: www.mattnewport.com/blog

Quote:Original post by circlesoft
Quote:Original post by lancekt
Quote:
*cough* Not quite...gotta love loopholes in the specs


Care to elaborate?

Edit: Specifically, have a reference for lack of vertex texturing on the new ATI stuff?

Here is a thread where we talked about it a little bit. What sucks is that ATI has made it quite hard for devs to support SM3, because pretty much everyone wants to use vertex texturing. Now you not only have separate paths for SM1, SM2, and SM3, but also SM3 w/ vertex texturing (NVIDIA) and SM3 w/ stream output (ATI).


Correct me if I'm wrong, but is the current NVIDIA implementation anything more than just a checkmark on a features list? From what I've heard, VTF on NVIDIA cards is almost impractical to use because of the slow speed, and it's pretty limited too (only point filtered? special texture formats?).

AFAIK only one game supports it (IL2, for water); by default it is disabled, and enabling it causes a massive drop in framerate (50+%).

I agree that it would be nice to have, but given the limitations of NVIDIA's implementation, ATi's alternative R2VB is looking a lot better (especially since it can be supported back to the R300s).
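The R2VB idea mentioned above amounts to doing the "vertex texturing" work in a pixel shader pass: render one output vertex per pixel into a floating-point target, then rebind that target as a vertex stream for the real draw call. A rough, hedged sketch of the shader side (the grid layout here is illustrative, and the API-side rebinding via ATI's extension is not shown):

```hlsl
// Hedged R2VB-style sketch: a pixel shader computes displaced vertex
// positions into a float render target, one pixel per vertex.
sampler2D heightMap : register(s0);

float heightScale;

struct PS_IN { float2 uv : TEXCOORD0; };

float4 main(PS_IN pin) : COLOR0
{
    // Full pixel-pipe filtering is available here, unlike NV40 VTF.
    float height = tex2D(heightMap, pin.uv).r;
    float3 basePos = float3(pin.uv * 2 - 1, 0);  // illustrative flat grid
    return float4(basePos + float3(0, 0, height * heightScale), 1);
}
```

The app then binds the resulting render target as a vertex buffer for the subsequent draw, which is why this works back on SM2-class (R300) hardware.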
I wouldn't say they are completely impractical: Terrain rendering using GPU-based geometry clipmaps. They managed to speed up their algorithm quite a bit using vertex texture fetches, although they found them to be the bottleneck in their implementation.

I can't really say which has the best idea, NVIDIA or ATI, and it seems to boil down to which of these you prefer:
- have similar features in the vertex shader and fragment shader (NVIDIA's VTF)
- keep vertex shaders simple, leverage the fact that there are usually more fragment processing units than vertex units on the card, and go the ATI way.
Quote:Original post by mattnewport
Quote:Original post by Moe
Any idea when DX10 compatible cards will be available? I am thinking that my ATi X300 is beginning to seem a little slow...

Not until Windows Vista appears, which is pretty unlikely to be earlier than October next year. I'd guess the earliest we'll see any DX10 hardware is in time for Christmas 2006.


Hardware may appear before Vista. Of course, you will have to upgrade your OS to use the DX10 features.

EvilDecl81
Quote:Original post by circlesoft
What sucks is that ATI has made it quite hard for devs to support SM3, due to the fact that pretty much everyone wants to use vertex texturing. Now, you not only have separate paths for SM1, SM2, and SM3, but also SM3 w/ vertex texturing (Nvidia) and SM3 w/ stream output (ATI).

TBH, if I could only have one, I'd rather have render to vertex buffer than vertex texture read... the former is a superset of the latter's functionality, and even with the additional pass, it's still going to be way faster just because of the way that current GPUs are designed (once they all have unified shaders, filtered reads in the vertex shader will be just as fast as in the fragment shader).

That said I agree that having to write a different code path is really annoying.

Still, the lack of fp16 filtering is my biggest complaint about the new ATI cards. I don't mind if they do it in shader code internally (better to have programmable hardware than static interpolators that can get wasted!), but they NEED to do that for developers at some point. Sure, bilinear is easy to write oneself, but aniso and mipmapping? Possible, but annoying...
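For what it's worth, the "write bilinear yourself" case looks roughly like the sketch below, assuming a point-sampled fp16 texture and a texSize constant supplied by the app (both names are illustrative):

```hlsl
// Hedged sketch: manual bilinear filtering of a float texture on hardware
// without native fp16 filtering. fpTex must use point sampling; texSize
// holds the texture dimensions in texels.
sampler2D fpTex : register(s0);
float2 texSize;

float4 bilinearFP16(float2 uv)
{
    float2 texel = uv * texSize - 0.5;          // texel-space position
    float2 f     = frac(texel);                  // blend weights
    float2 base  = (floor(texel) + 0.5) / texSize;  // lower-left texel center
    float2 step  = 1.0 / texSize;

    // Fetch the four neighboring texels and blend manually.
    float4 t00 = tex2D(fpTex, base);
    float4 t10 = tex2D(fpTex, base + float2(step.x, 0));
    float4 t01 = tex2D(fpTex, base + float2(0, step.y));
    float4 t11 = tex2D(fpTex, base + step);

    return lerp(lerp(t00, t10, f.x), lerp(t01, t11, f.x), f.y);
}
```

Four fetches and two lerps per sample, which is exactly why aniso and trilinear done this way get annoying fast.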

This topic is closed to new replies.
