Do you recommend a SM 3.0 graphics card?


Hello! I have been programming in Direct3D for about 2 or 3 months and I am progressing at a rate I consider quite incredible. But soon the limitations of my ATI Radeon 9200 will become evident, and that will slow my progress. So I'm planning on getting a new graphics card for Christmas.

Currently I'm not using Vertex or Pixel Shaders, but I'm quite sure that I'll start using them in less than... let's say 4 months. I have had some bugs from unsupported Shader Model versions with my current card, and I'd like to avoid such trouble with my new card.

1) Will having Shader Model 3.0 be a must in a few years?
2) Should I make sure my card supports it?
3) Will I have programming limitations if I don't?

The only thing I know about Shader Model 3.0 is that it supports a theoretically unlimited shader length.

Thanks for your future replies.

An SM 3.0 card is definitely what you should go after if you plan to develop high-end graphics. Both the Xbox 360 and PS3 use SM3.0+ technology, so if you ever plan to apply to the game industry, it's good knowledge to have.

Quote:
Original post by Trillian
1) Will having Shader Model 3.0 be a must in a few years?

In a few years? Definitely.

Quote:

2) Should I make sure my card supports it?

Definitely.
Quote:

3) Will I have programming limitations if I don't?

Definitely. That is, if you want to write shaders. [wink]

Quote:

The only thing I know about Shader Model 3.0 is that it supports a theoretically unlimited shader length.

There are a couple of other things. One is that it supports dynamic branching (if-statements). That in itself can be a pretty significant thing. For example, you can use it to combine multiple shaders into one and just decide at runtime which to execute, saving you a handful of state changes. Another is geometry instancing. If you're rendering lots of units from the same mesh, you can just supply new instance data (a new transformation matrix, for example, and maybe a few other shader-specific values) and reuse the actual vertex data, allowing you to draw them all with a single DrawIndexedPrimitive call.
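
To make that concrete, here is a minimal ps_3_0 sketch of the branching idea; the shadow-map setup and all names are invented for the example:

// ps_3_0: one shader, two paths chosen per-pixel at runtime
sampler2D shadowMap : register(s0);
float4 lightColor;

float4 main(float2 uv : TEXCOORD0,
            float3 normal : TEXCOORD1,
            float3 lightDir : TEXCOORD2) : COLOR
{
    float shadow = tex2D(shadowMap, uv).r;
    if (shadow < 0.01)                 // dynamic branch: skip the lighting
        return float4(0, 0, 0, 1);     // math entirely for shadowed pixels
    float ndotl = saturate(dot(normalize(normal), normalize(lightDir)));
    return float4(shadow * ndotl * lightColor.rgb, 1);
}

And a sketch of the shader side of instancing: the per-instance world matrix arrives through a second vertex stream, which the application sets up with IDirect3DDevice9::SetStreamSourceFreq (D3DSTREAMSOURCE_INDEXEDDATA on stream 0, D3DSTREAMSOURCE_INSTANCEDATA on stream 1):

// vs_3_0: per-instance transform supplied as four extra vertex inputs
float4x4 viewProj;

struct VSIn
{
    float4 pos  : POSITION;    // shared mesh data (stream 0)
    float4 row0 : TEXCOORD1;   // per-instance world matrix rows (stream 1)
    float4 row1 : TEXCOORD2;
    float4 row2 : TEXCOORD3;
    float4 row3 : TEXCOORD4;
};

float4 main(VSIn v) : POSITION
{
    float4x4 world = float4x4(v.row0, v.row1, v.row2, v.row3);
    return mul(mul(v.pos, world), viewProj);
}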

Definitely. There's no reason not to, since SM 3.0 capable cards are as cheap as they are. On the low end, a GeForce 6200 can be had for 50-60 bucks; at the mid-range, a 6600 GT can be had for ~100; at the high end, 6800 GTs, as well as some cards from ATI, can be had for ~200; further still are the super-high-end 7800 and X1000 cards for ~350 to 500 bucks.

The sweet spot, IMO, is the higher-end 6800s, particularly the new variant that just came out, though I forget the name. It performs about equally to the 6800 Ultra but costs around ~200. Definitely the best bang for the buck.

The big thing about SM3 is flow control (branching, ifs, loops, etc.), and the new ATI cards are extremely efficient at it.



They pretty much double in speed running with flow control over running without; NVIDIA cards, on the other hand, take a slight hit in the same tests.

My suggestion would be for the X1 series from ATi.

Quote:
Original post by Trillian
Hello!

I have been programming in Direct3D for about 2 or 3 months and I am progressing at a rate I consider quite incredible. But soon the limitations of my ATI Radeon 9200 will become evident, and that will slow my progress. So I'm planning on getting a new graphics card for Christmas.

Currently I'm not using Vertex or Pixel Shaders, but I'm quite sure that I'll start using them in less than... let's say 4 months. I have had some bugs from unsupported Shader Model versions with my current card, and I'd like to avoid such trouble with my new card.

1) Will having Shader Model 3.0 be a must in a few years?
2) Should I make sure my card supports it?
3) Will I have programming limitations if I don't?

The only thing I know about Shader Model 3.0 is that it supports a theoretically unlimited shader length.

Thanks for your future replies.


I would buy an inexpensive NVIDIA part for Christmas, something like a 6600. It has a better feature set than ATI's cards (mainly, it can do texture loads in the vertex shader). Don't spend too much; you don't need the performance if you are just playing around.

I wouldn't spend a lot on one though, DX10 parts are just around the corner ;)


Yeah definitely get a SM3.0 card.

Which one is a hard call... there are features that I like on both:

NVIDIA:
+ floating point texture filtering
+ vertex texture reads
- medium to slow dynamic flow control
- vertex texture reads slow as dirt ;)
- no anti-aliasing on floating point render targets

ATI:
+ multisampled antialiasing on floating point render targets
+ render to vertex buffer
+ super-fast dynamic flow control
- no vertex texture read
- no floating point texture filtering

I don't think you can really go wrong with either though, as you probably won't hit those walls soon... at least not before the DX10 cards are out.
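
For reference, the vertex texture read mentioned above looks something like this in a vs_3_0 displacement shader; the texture and the scale constant are assumptions for the sketch:

// vs_3_0: vertex texture fetch must use the explicit-LOD tex2Dlod,
// since there are no derivatives in the vertex stage
sampler2D heightMap : register(s0);   // bound via D3DVERTEXTEXTURESAMPLER0
float4x4 worldViewProj;
float heightScale;

float4 main(float4 pos : POSITION, float2 uv : TEXCOORD0) : POSITION
{
    // NVIDIA SM3 parts only point-sample here, and only in a few
    // formats (e.g. R32F), which is part of the "slow as dirt" complaint
    float h = tex2Dlod(heightMap, float4(uv, 0, 0)).r;
    pos.y += h * heightScale;
    return mul(pos, worldViewProj);
}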

Quote:
Original post by rollo
SM3 also allows you to do texture sampling in the vertex shader, which can be useful for displacement mapping and other things.

*cough* Not quite...gotta love loopholes in the specs

Quote:
I wouldn't spend a lot on one though, DX10 parts are just around the corner ;)

Exactly. Just be prepared to upgrade again to a brand new card if you want to use DX10 when it comes out. Personally, I was planning on upgrading to a new SM3 card when the new ATI line came out, but the lack of vertex texturing made me decide not to.

Quote:
Original post by circlesoft
Quote:
Original post by rollo
SM3 also allows you to do texture sampling in the vertex shader, which can be useful for displacement mapping and other things.

*cough* Not quite...gotta love loopholes in the specs


Shame on you ATI!

Any idea when DX10 compatible cards will be available? I am thinking that my ATi X300 is beginning to seem a little slow...

Quote:

*cough* Not quite...gotta love loopholes in the specs


Care to elaborate?

Edit: Specifically, have a reference for lack of vertex texturing on the new ATI stuff?

Quote:
Original post by lancekt
Quote:

*cough* Not quite...gotta love loopholes in the specs


Care to elaborate?

Edit: Specifically, have a reference for lack of vertex texturing on the new ATI stuff?

Here is a thread where we talked about it a little bit. What sucks is that ATI has made it quite hard for devs to support SM3, because pretty much everyone wants to use vertex texturing. Now you not only have separate paths for SM1, SM2, and SM3, but also SM3 w/ vertex texturing (Nvidia) and SM3 w/ stream output (ATI).
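
In a D3DX effect file those separate paths typically end up as separate techniques; a rough sketch, with the shader entry points assumed:

// one technique per hardware tier; the app validates and picks one at startup
technique TSM2
{
    pass P0
    {
        VertexShader = compile vs_2_0 MainVS();
        PixelShader  = compile ps_2_0 MainPS_SM2();
    }
}

technique TSM3_VTF   // NVIDIA path: vertex texturing inside MainVS_VTF
{
    pass P0
    {
        VertexShader = compile vs_3_0 MainVS_VTF();
        PixelShader  = compile ps_3_0 MainPS_SM3();
    }
}

On the app side you would walk the techniques with ID3DXEffect::ValidateTechnique and select the best one the card actually supports.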

Quote:
Original post by Moe
Any idea when DX10 compatible cards will be available? I am thinking that my ATi X300 is beginning to seem a little slow...

Not until Windows Vista appears, which is pretty unlikely to be earlier than October next year. I'd guess the earliest we'll see any DX10 hardware is in time for Christmas 2006.

Quote:
Original post by circlesoft
Quote:
Original post by lancekt
Quote:

*cough* Not quite...gotta love loopholes in the specs


Care to elaborate?

Edit: Specifically, have a reference for lack of vertex texturing on the new ATI stuff?

Here is a thread where we talked about it a little bit. What sucks is that ATI has made it quite hard for devs to support SM3, because pretty much everyone wants to use vertex texturing. Now you not only have separate paths for SM1, SM2, and SM3, but also SM3 w/ vertex texturing (Nvidia) and SM3 w/ stream output (ATI).


Correct me if I'm wrong, but is the current NVIDIA implementation anything more than just a checkmark on a feature list? From what I've heard, VTF on NVIDIA cards is almost impractical to use because of the slow speed, and it's pretty limited too (only point filtered? special texture formats?).

AFAIK only one game supports it (IL-2, for water); by default it is disabled, and enabling it causes a massive drop in framerate (50+%).

I agree that it would be nice to have, but given the limitations of NVIDIA's implementation, ATI's alternative R2VB is looking a lot better (especially since it can be supported back to the R300 series).
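
For comparison, the R2VB flavor keeps the work in the pixel shader: you render positions into a floating-point target and the app rebinds that target as a vertex stream. A rough sketch, with the grid layout invented for the example:

// pixel shader: write displaced positions to a float4 render target;
// with ATI's R2VB the app then reuses that target as vertex data
sampler2D heightMap : register(s0);

float4 main(float2 uv : TEXCOORD0) : COLOR
{
    float h = tex2D(heightMap, uv).r;
    // one texel per vertex of a regular grid: xz from uv, y from the height map
    return float4(uv.x, h, uv.y, 1);
}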

I wouldn't say they are completely impractical: see "Terrain rendering using GPU-based geometry clipmaps". They managed to speed up their algorithm quite a bit using vertex texture fetches, although they found them to be the bottleneck in their implementation.

I can't really say who has the better idea, NVIDIA or ATI; it seems to boil down to which of these you prefer:
- have similar features in the vertex shader and fragment shader (NVIDIA's VTF), or
- keep vertex shaders simple and leverage the fact that there are usually more fragment processing units than vertex units on the card (the ATI way).

Quote:
Original post by mattnewport
Quote:
Original post by Moe
Any idea when DX10 compatible cards will be available? I am thinking that my ATi X300 is beginning to seem a little slow...

Not until Windows Vista appears, which is pretty unlikely to be earlier than October next year. I'd guess the earliest we'll see any DX10 hardware is in time for Christmas 2006.


Hardware may appear before Vista. Of course, you will have to upgrade OSs to use the DX10 features.

Quote:
Original post by circlesoft
What sucks is that ATI has made it quite hard for devs to support SM3, due to the fact that pretty much everyone wants to use vertex texturing. Now, you not only have separate paths for SM1, SM2, and SM3, but also SM3 w/ vertex texturing (Nvidia) and SM3 w/ stream output (ATI).

TBH, if I could only have one, I'd rather have render to vertex buffer than vertex texture read... the former is a superset of the latter's functionality, and even with the additional pass, it's still going to be way faster just because of the way current GPUs are designed (once they all have unified shaders, filtered reads in the vertex shader will be just as fast as in the fragment shader).

That said, I agree that having to write a different code path is really annoying.

Still, the lack of fp16 filtering is my biggest complaint about the new ATI cards. I don't mind if they do it in shader code internally (better to have programmable hardware than static interpolators that can get wasted!), but they NEED to provide it to developers at some point. Sure, bilinear is easy to write oneself, but aniso and mipmapping? Possible, but annoying...
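
The DIY bilinear mentioned here is short enough to show; a sketch assuming the application passes the texture dimensions in a constant:

// point-sample an unfilterable fp16 texture four times and blend manually
sampler2D fpTex : register(s0);
float2 texSize;   // e.g. (256, 256), supplied by the application

float4 bilinearFP16(float2 uv)
{
    float2 t    = uv * texSize - 0.5;          // texel-space coordinate
    float2 f    = frac(t);                     // fractional part = blend weights
    float2 base = (floor(t) + 0.5) / texSize;  // center of the top-left texel
    float2 dx   = float2(1.0 / texSize.x, 0);
    float2 dy   = float2(0, 1.0 / texSize.y);
    float4 c00 = tex2D(fpTex, base);
    float4 c10 = tex2D(fpTex, base + dx);
    float4 c01 = tex2D(fpTex, base + dy);
    float4 c11 = tex2D(fpTex, base + dx + dy);
    return lerp(lerp(c00, c10, f.x), lerp(c01, c11, f.x), f.y);
}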

Quote:
Correct me if I'm wrong, but is the current NVIDIA implementation anything more than just a checkmark on a feature list? From what I've heard, VTF on NVIDIA cards is almost impractical to use because of the slow speed, and it's pretty limited too (only point filtered? special texture formats?).

If you are a game player, then yes, it's of limited value. But as A DEVELOPER it is very interesting to prototype with new features, because you can tolerate less than stellar performance. There are some big things you can do with the texture read, like morph target blending (an industry-standard way of doing high-quality animation).

Additionally, I believe it is much faster on the 7800 (though I haven't tested it personally).
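
A rough sketch of that morph-target idea, with the texture layout (one texel per vertex) invented for the example:

// vs_3_0: blend two morph targets stored in floating-point textures
sampler2D morphTarget0 : register(s0);
sampler2D morphTarget1 : register(s1);
float4x4 worldViewProj;
float blendWeight;   // 0 = target 0, 1 = target 1

float4 main(float2 vertexUV : TEXCOORD0) : POSITION
{
    float3 p0 = tex2Dlod(morphTarget0, float4(vertexUV, 0, 0)).xyz;
    float3 p1 = tex2Dlod(morphTarget1, float4(vertexUV, 0, 0)).xyz;
    return mul(float4(lerp(p0, p1, blendWeight), 1), worldViewProj);
}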


Quote:
Original post by EvilDecl81
Hardware may appear before Vista. Of course, you will have to upgrade OSs to use the DX10 features.

It might, but I suspect it won't. Given the major differences in the driver model and feature set between DX9 and DX10, I'd guess the DX10 cards won't get drivers for XP. Could be wrong though.

Quote:
Original post by mattnewport
Quote:
Original post by EvilDecl81
Hardware may appear before Vista. Of course, you will have to upgrade OSs to use the DX10 features.

It might, but I suspect it won't. Given the major differences in the driver model and feature set between DX9 and DX10, I'd guess the DX10 cards won't get drivers for XP. Could be wrong though.


The DX10 parts still need to run DX9 games, so they are still perfectly valid DX9 parts, with a potentially completely separate DX9 driver.

Quote:
Original post by EvilDecl81
The DX10 parts still need to run DX9 games, so they are still perfectly valid DX9 parts, with a potentially completely separate DX9 driver.

Yes, DX10 functionality is a superset of DX9 but the driver model is still totally different. The question is whether nVIDIA and ATI will bother writing XP drivers for the new hardware when none of the new features will be available on XP. You'll still get improved performance so it will probably come down to a matter of how many sales they think they'll get to people with XP and whether the extra sales are worth the engineering effort. I don't know what they'll end up doing.

Quote:
Original post by mattnewport
Quote:
Original post by EvilDecl81
The DX10 parts still need to run DX9 games, so they are still perfectly valid DX9 parts, with a potentially completely separate DX9 driver.

Yes, DX10 functionality is a superset of DX9 but the driver model is still totally different. The question is whether nVIDIA and ATI will bother writing XP drivers for the new hardware when none of the new features will be available on XP. You'll still get improved performance so it will probably come down to a matter of how many sales they think they'll get to people with XP and whether the extra sales are worth the engineering effort. I don't know what they'll end up doing.


To answer your question: yes, for years to come.

