How long would you support Shader Model 2?

15 comments, last by cozzie 10 years, 2 months ago

Hi,

I was thinking about this and my conclusions so far are:

- although supporting SM2 backwards compatibility is nice practice, I've never ever had to use it before

(other than testing whether backwards compatibility / rendering with SM2 works OK in my engine / with my shaders)

- I believe SM3 has basically been standard on all GPUs since around 2006

- currently, supporting SM2 as a fallback brings extra work, fewer features, extra maintenance when making changes, etc.

What do you think?

Crealysm game & engine development: http://www.crealysm.com

Looking for a passionate, disciplined and structured producer? PM me


Depends entirely on your target platforms and audience. If the Steam survey is anywhere near accurate, then SM3.0+ covers the vast majority of people who regularly use Steam, so I don't see a reason not to make it the minimum. It didn't seem to slow down BF3 or Just Cause 2 sales.

- although supporting SM2 backwards compatibility is nice practice, I've never ever had to use it before


Practice for... what?
Many new games (like BF3) have been requiring SM4+ for a while.

SM2 support is nice if you want to sell your game in India, China, etc, where there actually is a huge market of old PCs.

I've been working on a game for the Indian market that requires SM3+, and there's a huge number of fans who are using SM3-CPU-emulators in order to run the game!!

Thanks.

@Phantom: practice in learning HLSL and in making my hobby 3D engine as flexible as possible, mostly data driven :)

Looking at the figures and my target audience, I'm gonna just skip SM2 support.


There's very little that can only be done in SM4; rewriting a simple frag/vert shader for both SM2 and SM3 is a matter of minutes (most of the time) and doesn't require any changes to the C++ part. I know GPU vendors would sell more GPUs if games stopped supporting old shader models, but for me there's really no maintenance burden in supporting them (well, if someone ends up with thousands of lines of shaders then maybe it's not a good idea to support SM2 anymore XD).
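To make that concrete: with D3D9 you can usually compile the exact same HLSL source against both profiles and only switch the target string. A rough sketch of what I mean (the CompilePixelShader helper and the "PSMain" entry point are made up, untested):

#include <windows.h>
#include <d3dcompiler.h>   // D3DCompile; link with d3dcompiler.lib

// Hypothetical helper: compile one HLSL source for the given profile ("ps_2_0" or "ps_3_0").
// Returns NULL on failure so the caller can decide what to do.
ID3DBlob* CompilePixelShader(const char* source, size_t sourceLen, const char* profile)
{
    ID3DBlob* bytecode = NULL;
    ID3DBlob* errors   = NULL;
    HRESULT hr = D3DCompile(source, sourceLen, "shader.hlsl",
                            NULL, NULL,         // no macros, no include handler
                            "PSMain", profile,  // same entry point for both profiles
                            0, 0, &bytecode, &errors);
    if (errors)
    {
        OutputDebugStringA((const char*)errors->GetBufferPointer());
        errors->Release();
    }
    if (FAILED(hr))
    {
        if (bytecode) bytecode->Release();
        return NULL;
    }
    return bytecode;
}

// Usage sketch: pick the profile from the device caps once, then compile everything with it:
//   const char* profile = (caps.PixelShaderVersion >= D3DPS_VERSION(3, 0)) ? "ps_3_0" : "ps_2_0";
//   ID3DBlob* ps = CompilePixelShader(src, strlen(src), profile);

As long as the shader fits within the ps_2_0 instruction and register limits, the C++ side really doesn't need to change at all.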

This also depends on your framework and asset pipeline; I don't know how many people also need extra effort on the C++ side to support certain things.

Peace and love, now I understand really what it means! Guardian Angels exist! Thanks!

SM2 support is nice if you want to sell your game in India, China, etc, where there actually is a huge market of old PCs.
I've been working on a game for the Indian market that requires SM3+, and there's a huge number of fans who are using SM3-CPU-emulators in order to run the game!!

Are those actual customers who pay for the game, too?

I always struggle to understand when companies target China, because it's fairly well known that customers there are only willing to pay a fraction of what everybody else pays, if they don't just pirate right away. Development and maintenance costs are the same, however. So I always wonder how that math can work out (but apparently it does).

Now India isn't China, so that might again be different, of course...

EDIT:

About SM2, I would drop it without wasting a thought; I would rather consider whether or not to drop SM3 support as well. With SM4 you are deep in the comfort zone: you have everything you reasonably need, it's available without weird twists and quirks, and the guaranteed minimums are workable (under SM3 the guaranteed minimums are ridiculous). You really don't need SM5 (nice to have, but who cares otherwise).

Basically, for SM2 you pretty much need to rethink and reimplement everything for a rather low-quality result in return, and even in SM3 you already need to rethink and write special paths and workarounds for half of the stuff. A card that only supports SM2 not only can't do certain shader tricks, it likely can't cope with your triangle count either (so you need to LOD much more aggressively, and probably create special low-poly models on top so it still looks reasonably good).

If nothing else, in SM4 you have guaranteed working MRT support ("working" as in "working on every vendor, and no bogus stuff like multi = 1") and guaranteed vertex texturing (again, no "bogus support"), and guaranteed, working float textures and hardware sRGB. Plus, no obscure limits that you are likely to run into if you do somewhat reasonable stuff.
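To show what I mean by "bogus support": on D3D9 you have to query every one of those features per-format before you can rely on it. A rough sketch of the kind of startup checks involved (the SupportedFeatures struct and QueryFeatures are made up, untested):

#include <d3d9.h>

// Hypothetical feature table, filled once at startup; on SM3-class hardware none of this may be assumed.
struct SupportedFeatures
{
    DWORD mrtCount;          // the caps can legally report 1 even on an "SM3" card
    bool  vertexTextures;    // vertex texture fetch for a float format
    bool  floatRenderTarget; // 16F render target support
    bool  srgbReadWrite;     // hardware sRGB read and write
};

SupportedFeatures QueryFeatures(IDirect3D9* d3d, UINT adapter, D3DFORMAT adapterFmt)
{
    SupportedFeatures f = {};

    D3DCAPS9 caps = {};
    d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps);
    f.mrtCount = caps.NumSimultaneousRTs;

    // Vertex texture fetch is only usable if the specific format passes this query.
    f.vertexTextures = SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFmt,
        D3DUSAGE_QUERY_VERTEXTEXTURE, D3DRTYPE_TEXTURE, D3DFMT_A32B32G32R32F));

    // Float render targets are also a per-format, per-vendor question.
    f.floatRenderTarget = SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFmt,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_TEXTURE, D3DFMT_A16B16G16R16F));

    // Hardware sRGB read and write each have their own usage query.
    f.srgbReadWrite =
        SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFmt,
            D3DUSAGE_QUERY_SRGBREAD, D3DRTYPE_TEXTURE, D3DFMT_A8R8G8B8)) &&
        SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFmt,
            D3DUSAGE_RENDERTARGET | D3DUSAGE_QUERY_SRGBWRITE, D3DRTYPE_TEXTURE, D3DFMT_A8R8G8B8));

    return f;
}

On D3D10/SM4 hardware all of this is simply guaranteed by the spec.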

The fact that you also get geometry shaders and transform feedback is really only sugar on top.

On the other hand, you can get entry-level SM5 cards for around 25-30 currency now, so if you expect your customers to pay 25-30 currency for your game, there is really not much of an excuse why they couldn't also have at least an SM4 card (which was already available for the same price years ago).

Well, using SM4 directly makes the programmer's life very easy, that is true.

PC stats:

stats from unity webplayer: http://stats.unity3d.com/web/gpu.html

stats from wikipedia: http://en.wikipedia.org/wiki/Usage_share_of_operating_systems

14% of people have SM2

10% of people have SM3

7% of people have OS X

Purely looking at those stats, it makes more sense to support SM2 than SM3 or OS X.

If you are going to make a mobile game you probably want to use some tool/SDK that targets mobile instead of making your own native port; in that case you do most stuff at SM2 level anyway, because mobile is a topic apart.

A lot of people with expensive hardware just play top AAA next-gen games and don't even bother with indies, so as long as there's a good portion of the audience on SM2 I'll probably stick with it. If you start pulling all sorts of tricks to get a particular graphics effect because you are more focused on graphics than on gameplay, then it's a good idea to just start with SM4 and stop having troubles. (There's still a lot of amazing stuff that can be done on SM2; by the way, I'm more of a shader writer than a game maker, so don't misunderstand me, I love SM4.)

I only drop support for something if there are good reasons to do so (performance hit, maintenance hell, personal skill limits).

Most people are not able to change their video card (even a cheap one), and laptops are even harder to upgrade, so I don't count on the idea that people "will probably spend 30 dollars on a cheap card to play my 10 dollar game"; it's more likely that people will "spend 100 dollars on a decent card to play a recently released AAA title".


Are those actual customers who pay for the game, too?

I always struggle to understand when companies target China, because it's fairly well known that customers there are only willing to pay a fraction of what everybody else pays, if they don't just pirate right away. Development and maintenance costs are the same, however. So I always wonder how that math can work out (but apparently it does).

You're allowing prejudice to blind you to reality.

China's games industry revenue last year was $13.8 billion (38% YoY growth). The US in 2012 was $14.8 billion and shrinking. I haven't seen 2013 US numbers yet, but it should be a growth year (not 38%, of course). In any case, China spends almost as much as the US on games, and they'll spend more than us within a year or two, especially with bans being lifted. They also spend more on lower-end games than the US, since they have lower-end hardware. That's a really good thing for indie devs.

That said, it's probably not worth targeting SM2. You'd probably launch in the US before localizing to other regions, and only localize if you made money. If you make enough money, you can always add SM2 support later if you deem it necessary. When developing, try to delay as much complexity as possible until after you make money; if you do make money, people will tell you to add SM2 support if they want it anyway.

Are those actual customers who pay for the game, too?

I'm not sure. There's only a small/restricted free demo of the game out so far, which is actually usable at low framerates... however when the actual game is released, I don't think it will be really playable via these emulators, so all these non-SM3 customers will be lost either way.

Then yes, you've got to make the assumption that if they can't afford to upgrade from an old SM2 PC, then they probably can't afford very much software either...

However, I'm in Australia, which is over 50x smaller than India, so even if piracy rates there are 50x greater, overall revenue from each market would be about the same.
Even if you assume there will be 10x more piracy and you also drop the price by 10x, there are still 50x more potential customers (50 / (10 × 10) = 0.5x our domestic revenue, again), so it's still a huge amount of extra revenue on top of our domestic market.

That said, I've not yet approached this company and asked them if they'd like me to port their game from SM3 to SM2... maybe I should do that.

There's very little that can only be done in SM4; rewriting a simple frag/vert shader for both SM2 and SM3 is a matter of minutes (most of the time) and doesn't require any changes to the C++ part.

That's true only if you assume that you don't require any changes to the C++ part.
If the SM4 version uses stream-out, then porting to SM3 will require completely changing the algorithms used in the C++ part.
If the SM3 version uses VTF or MRT, then porting to SM2 will require completely changing the algorithms too.
As mentioned by others, the main difference between SM2/3/4/5 GPUs is in the feature sets that are supported.

SM2 is actually pretty nice because you know that all the fancy features aren't supported, so you can only write simple shaders :D
SM4 is nice too, because the D3D10 specification for SM4 cards is very strict about all standard features being supported.
SM3 is really annoying, because most of its features are completely optional. You can even access some SM4-level features via driver hacks, but only on some cards.
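To make the "changes to the C++ part" point concrete with the MRT case: a G-buffer fill that uses real MRT where the caps allow it, and otherwise redraws the scene once per target. Rough sketch only; DrawScene, the surfaces and the technique names are all made up:

#include <d3d9.h>

// Assumed to exist elsewhere in the engine: draws all geometry with the named shader technique.
void DrawScene(IDirect3DDevice9* device, const char* technique);

void FillGBuffer(IDirect3DDevice9* device, const D3DCAPS9& caps,
                 IDirect3DSurface9* albedoRT, IDirect3DSurface9* normalRT, IDirect3DSurface9* depthRT)
{
    if (caps.NumSimultaneousRTs >= 3)
    {
        // SM3/SM4-style path: one geometry pass, the pixel shader writes COLOR0..COLOR2.
        device->SetRenderTarget(0, albedoRT);
        device->SetRenderTarget(1, normalRT);
        device->SetRenderTarget(2, depthRT);
        DrawScene(device, "GBufferMRT");
        device->SetRenderTarget(1, NULL); // unbind the extra targets again
        device->SetRenderTarget(2, NULL);
    }
    else
    {
        // SM2 (or bogus "MRT count = 1") path: the whole scene is drawn once per target.
        // This is a different algorithm on the C++ side, not just a different shader.
        IDirect3DSurface9* targets[3]    = { albedoRT, normalRT, depthRT };
        const char*        techniques[3] = { "GBufferAlbedo", "GBufferNormal", "GBufferDepth" };
        for (int i = 0; i < 3; ++i)
        {
            device->SetRenderTarget(0, targets[i]);
            DrawScene(device, techniques[i]);
        }
    }
}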

