#### Archived

This topic is now archived and is closed to further replies.

# Should I support a 3DFX path in my DX8.1 renderer?


## Recommended Posts

I noticed just today that there are new DirectX 8.1 drivers for 3DFX cards here. While I haven't actually tried them to see if they work at all, I assume they might. So should I make a separate render path for my Avenger game, now that the game is playable on a 266 MHz Celeron? I remember that 3DFX cards used to ship in machines with around 200 MHz CPUs, right? Has anybody here been through this already? Is it worth the effort? Were there nasty problems where some DX feature should have worked but actually didn't? Of course I'm only talking about basic features like multitexturing and alpha blending & testing, no shader stuff.

And how many 3DFX cards are actually out there? I'm looking at it strictly from the potential-market point of view, i.e. if there are currently just a few active users, there's no point in spending a week or two (or even more, who knows) making the game run on already obsolete hardware. What do you think?

VladR
Avenger game

##### Share on other sites
Well, people still have these cards, so you'd be expanding your user base a bit. If you're under a tight schedule or something, then maybe it's not worth it. However, if you have the time, then go ahead and make it run on as wide a variety of cards as possible.

##### Share on other sites
quote:
well, people still have these cards, so you'd be expanding your user base a bit
If only I knew how big that "a bit" actually is...

Well, there is no strict schedule since it's a free-time-after-work project, so we're trying to make everything work rather than deciding which features make it into the final build depending on the time available. Generally it would be good experience, but since this hardware is dead, the time might be better spent on shader stuff, for example.
But if there is still a big active community, it might be worth the try. I'm still googling, trying to find some info.

The best thing would be to have hardware user-base stats from a publisher - they have pretty accurate data. But that isn't public info, is it?

Avenger game

##### Share on other sites
I'm not sure what the numbers are, but I'll put it this way: I play computer games a lot and I just recently replaced my Voodoo with a new card, so I suspect there are still a lot of gamers who use these regularly. After all, these used to be the most popular cards out there.

However, if you want to put all kinds of crazy gfx effects in your game and target it at people with the most current systems, then maybe your time is better spent on that.

Who are you targeting this game at?

##### Share on other sites
quote:
However, if you want to put all kinds of crazy gfx effects in your game and target it at people with the most current systems, then maybe your time is better spent on that.
Take a look at the screenshots of the game and see for yourself - it's no fancy DOOM3/HL2 gfx. It's my first engine, but I think the graphics are acceptable today - what do you think?

quote:
Who are you targeting this game for?
Casual players who don't like gory games, those who like logic games, and also users with now-slow hardware like 266 MHz PCs (it's the slowest of the 3 PCs I have at home), nVidia TNT1 cards and such.

Avenger game

##### Share on other sites
I'll make the decision easy for you: DON'T DO IT. It's hard enough just finishing something in your free time, let alone having to worry about supporting graphics cards from 5 years ago.

##### Share on other sites
"Casual players not liking blood games, those that like logical games and also users with now-the-slow-HW like 266 MHz pcs, (it is the slowest of 3 pcs that i have at home), nVidia TNT1 cards and such."

OK, in that case you might want to support it... however, I'd do it when the game is practically done.

[edited by - Drythe on May 29, 2003 5:19:32 AM]

##### Share on other sites
quote:
however, I'd do it when the game is practically done
Well, it would be more suitable to do it now, before I declare the render function finished. If I only bolted it on after the game has been tested, I might have to start testing all over again just because of 3DFX. And I think it will be pretty hard to find 3DFX beta testers - I've been browsing several 3DFX forums, but they all seem pretty dead at present. Or maybe there simply are no problems running DX 8.1 games on 3DFX hardware.
quote:
and I just recently replaced my Voodoo with a new card
What is your CPU, and which current games were playable? It would be interesting to know the scalability of a Voodoo3 (is that what you had?) on the current crop of CPUs.

Avenger game

##### Share on other sites
Mainly assuming Voodoo2 and TNT1/2 level stuff here.

1) The market:

a. The Voodoo2 in particular was almost exclusively bought by "gamers".

b. "gamers" are the kind of people who keep their machines reasonably up to date whenever they can afford to. "hardcore gamers" are the ones with the latest stuff (2GHz+ CPUs, shader capable cards etc).

c. Most of the people who had V2s will now be at about Matrox G400 or even GeForce256 level hardware now.

d. Some of the people who still have V2s can't afford newer hardware (though you can get a GeForce4 or Radeon 7000 for less than the price of a single PC game at the moment).

e. Some of the people who still have V2s are using second hand machines and/or second hand V2s (handed down from family/friends who were upgrading).

f. Many non-gamers will have seriously under-spec machines, though you're less likely to find Voodoo cards in them, because those are the machines that were advertised and built as home office/student machines and were cheaper than the "gamer" machines. This is the kind of machine your parents are likely to own. If you find anything more than a plain VGA card in one of these (i.e. something slightly newer), it'll likely be an nVidia Riva 128, nVidia TNT or S3 ViRGE based thing.

g. ISTR reading a statistic that the most common graphics chip out there is an nVidia TNT.

h. Laptops are the big killer - although there have been decent laptop graphics chips available for quite a few years, some pretty recent (i.e. within the past 3 years) machines have been manufactured using old graphics hardware (between S3 ViRGE and TNT level stuff).

2) Technical issues you might come up against

a. V2 and TNT level chips were the first generation to support single-pass multitexturing. So in terms of single-texture operations they're actually fairly good - just not so amazing by today's standards.

b. If you think modern drivers have some bad bugs, some of the ones back then were truly hideous - meaning that on some old Voodoo drivers in particular, many device caps are almost meaningless.

c. As mentioned above, TNT and V2 can multitexture - they'll do 2 textures in a single pass. However, they severely restrict what you can do in the second stage. Basically you're limited to: lightmapping, additive envmapping and emboss bump mapping. Anything else just won't work. Don't expect the more complex blends (DOT3, BLEND etc.) to work in either stage, either.

d. Textures on the V2 are limited to 16-bit only, must be square, must be power-of-two and have a maximum size of 256x256. 1555 and 4444 are supported, and (memory fading) I think a few unusual and 8-bit formats (luminance only, maybe even RGB332).

e. The TNT could do 16- and 32-bit textures, with dimensions limited to 1024x1024 (or was that the TNT2?); they had to be power-of-two, and I think square was preferred.

f. Multitexturing on the V2 split each D3D texture stage onto a dedicated TMU (Texture Mapping Unit); this is what allowed the SLI configuration (Scan Line Interleaving - two V2s connected together for more power). It did, however, introduce a restriction specific to Voodoos: when you load a texture, it has to be "bound" to a specific TMU (TMU1 or TMU2) at load time. That texture can then ONLY be used with the TMU it was bound to.
For example, if you had "prettygrass.tga" loaded and created as a D3D texture for TMU1 and wanted to use that texture again on TMU2, you'd have to create a copy of that texture (i.e. load it twice).

g. The V2 had its texture memory separate from its frame buffer memory, and IIRC the amount the drivers reported for video memory was actually framebuffer + texture, but it wasn't always all usable (can't remember exactly - too long ago ;o)

h. Frame buffer blends are limited on the older chips; expect ONE, ZERO, SRCALPHA, INVSRCALPHA, SRCCOLOR, and not much else. Some are limited even more than that - for example PowerVR PCX1 and PCX2 can't do ONE:ONE (which is why, when games were only tested on V2s, PVR owners weren't happy when they got black outlines round their glowy stuff).

i. Don't expect render to texture on V2 (see point g), and it'll be flaky on most old chips anyway. Search Gamasutra for "Kim Pallister" (Intel DRG); he did an article on support for render targets and workarounds etc.

j. Likewise, don't expect most modern pixel processing features to be supported (clip planes etc.). For vertex processing you'll be using software, so you can do all of that just fine (texgen for envmapping etc.), CPU speed permitting of course.

k. Remember that old hardware is most likely to be in old machines, and you'll be using software vertex processing. Combined with the fillrate limits of old chips, this means you'll have to be much more aggressive about your poly counts (i.e. stay at, say, 10,000 per frame and keep everything else under control if you want to maintain 60Hz).
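To make the texture restrictions in points (d) and (e) concrete, here's a minimal load-time check in C++. The struct and function names are hypothetical, not part of D3D or any engine - in real DX8 code you'd read `MaxTextureWidth`/`MaxTextureHeight` and the `D3DPTEXTURECAPS_*` flags out of `D3DCAPS8` instead. The limits used are the ones listed above: square, power-of-two, 256x256 max for the V2.

```cpp
#include <cstdint>

// Hypothetical per-chip texture limits, taken from points (d) and (e) above.
// Not a real D3D structure - just a sketch of the check you'd do at load time.
struct TextureLimits {
    uint32_t maxSide;       // 256 for Voodoo2, 1024 for TNT
    bool     mustBeSquare;  // true on Voodoo2
};

bool IsPowerOfTwo(uint32_t n) {
    return n != 0 && (n & (n - 1)) == 0;  // exactly one bit set
}

// Returns true if a width x height texture satisfies the given limits.
bool TextureFits(const TextureLimits& lim, uint32_t w, uint32_t h) {
    if (!IsPowerOfTwo(w) || !IsPowerOfTwo(h)) return false;
    if (lim.mustBeSquare && w != h)           return false;
    return w <= lim.maxSide && h <= lim.maxSide;
}
```

With `TextureLimits{256, true}` for a V2, a 128x128 texture passes while 512x512 or 256x128 are rejected.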

If you're planning on supporting old hardware there really isn't any substitute for getting the real hardware and doing your own live tests. Having just the caps, for example, isn't enough, because most states etc. are "combinatorial" - i.e. whether they're supported depends entirely on the settings of other states (e.g. "you can do this, but ONLY if filtering is set to bilinear and FSAA is disabled").
The programmatic way to find out whether those combinations are OK on a particular chip is to combine the caps with ValidateDevice() calls (assuming decent drivers...).
The only other way - and the easier one - is to actually test everything and tinker.
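That caps-plus-validation approach can be sketched roughly like this. The types and the validation callback are hypothetical stand-ins (in real DX8 code you'd inspect `D3DCAPS8` and call `IDirect3DDevice8::ValidateDevice()` against the actual state setup), but the fallback logic is the point: the caps only tell you what might work, and the driver gets the final word.

```cpp
#include <functional>

// Hypothetical stand-ins for the caps bits relevant to a 2-stage setup.
struct Caps {
    int  maxSimultaneousTextures;   // 2 on V2/TNT-class chips
    bool claimsSecondStageModulate; // what the caps *claim* is supported
};

enum class Stage2 { Disabled, Lightmap };

// 'validate' mimics ValidateDevice(): even if the caps claim support,
// the driver decides whether this exact state combination really renders.
Stage2 PickStage2(const Caps& caps, const std::function<bool(Stage2)>& validate) {
    if (caps.maxSimultaneousTextures >= 2 &&
        caps.claimsSecondStageModulate &&
        validate(Stage2::Lightmap)) {
        return Stage2::Lightmap;    // single-pass lightmapping is safe
    }
    return Stage2::Disabled;        // fall back to a multipass path
}
```

Even when the caps claim support, a driver that rejects the combination (the validate callback returning false) drops you back to the multipass path.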

--
Simon O'Connor
Creative Asylum Ltd
www.creative-asylum.com

##### Share on other sites
Wow, pretty massive reply, S1CA - thanks!

RE: The Market:
a)-e): Yes, that's what I'm worried about too - that those who bought 3dfx (at the time) won't still have it in their main rig by now.
f): I can't make it run on the whole universe of gfx cards. nVidia, ATI and 3dfx seem like a pretty big part of the gfx market to me. I know there's also Matrox, but one Matrox user hasn't noticed any problems, so I think it's OK with Matrox. Those with under-spec machines - well, what can I do about them? Just skip them.
g): TNT1? Good to know.
h): Does anybody target their games at laptops if they're full 3D?

RE: technical issues:
a)-j): Well, some of the stuff you mention is pretty scary. Did you have to experience it all yourself? Especially those "combinatorial" issues?
Luckily I've restricted all textures to be square with power-of-two dimensions. Low-detail textures are all 128x128, so no problem there at least. Blending may be a bigger problem, but it can be solved simply by disabling it, and that's it.

k) Ehm, by software vertex processing you mean transforming vertices on the CPU - the work that's only hardware-accelerated by T&L units on GF2-level gfx?

quote:
If you're planning on supporting old hardware there really isn't any substitute for getting the real hardware and doing your own live tests.

I did that for most nVidia cards, but I can't seem to find any 3dfx user in my environment. Hopefully I'll find one soon...

S1CA, thank you very much for the info - you helped me a lot, thanks!

Avenger game
