Should I support a 3DFX path in my DX8.1 renderer?

I noticed just today that there are new DirectX 8.1 drivers for 3DFX cards here. While I haven't actually tried them to see whether they work at all, I assume they might. So should I make a separate render path for my Avenger game, now that the game is playable on a 266 MHz Celeron? I remember that 3DFX cards used to be paired with CPUs around 200 MHz, right? Has anybody here been through this already? Is it worth the effort? Were there any nasty problems with DirectX where some DX feature should have worked but actually didn't? Of course I'm only talking about basic features like multitexturing, alpha blending, and alpha testing; no shader stuff. And how many 3DFX cards are actually still out there? I'm looking at it strictly from a potential-market point of view, i.e. if there are currently just a few active users, there's no point in spending a week or two (or even more, who knows) making the game run on already obsolete hardware. What do you think?

VladR
Avenger game

Well, people still have these cards, so you'd be expanding your user base a bit. If you're under a tight schedule, then maybe it's not worth it. However, if you have the time, go ahead and make it run on as wide a variety of cards as possible.

quote:
well, people still have these cards, so you'd be expanding your user base a bit
If only I knew how big that "a bit" actually is...

Well, there is no strict schedule, since this is a free-time-after-day-job project, so we are trying to make everything work, not just deciding which features make it into the final build depending on the time available. Generally it would be good experience, but since this hardware is dead, the time might be better spent on shader stuff, for example. But if there is still a big active community, it might be worth the try. I'm still googling, trying to find some info.

The best thing would be to have hardware user-base data from a publisher; they have pretty accurate data. But that isn't public info, is it?



VladR
Avenger game

I'm not sure what the numbers are, but I'll put it this way: I play computer games a lot and I only recently replaced my Voodoo with a new card, so I suspect there are still a lot of gamers who use these cards regularly. After all, they used to be the most popular cards out there.

However, if you want to put all kinds of crazy gfx effects in your game and target people with the most current systems, then maybe your time is better spent on that.

Who are you targeting this game at?

quote:
however, if you want to put all kinds of crazy gfx effects in your game and target people with the most current systems, then maybe your time is better spent on that.
Take a look at the screenshots of the game and see for yourself; it's no fancy Doom 3/HL2 gfx. It is my first engine, but I think the graphics are acceptable today. What do you think?


quote:
Who are you targeting this game at?
Casual players who don't like bloody games, people who like logic games, and also users with now-slow hardware like 266 MHz PCs (that is the slowest of the 3 PCs I have at home), nVidia TNT1 cards and such.



VladR
Avenger game

I'll make the decision easy for you: DON'T DO IT. It's hard enough just finishing something in your free time, much less having to worry about supporting graphics cards from 5 years ago.

"Casual players not liking blood games, those that like logical games and also users with now-the-slow-HW like 266 MHz pcs, (it is the slowest of 3 pcs that i have at home), nVidia TNT1 cards and such."

ok, in this case, you might want to support it... however, I'd do it when the game is practically done.





[edited by - Drythe on May 29, 2003 5:19:32 AM]

quote:
however, I'd do it when the game is practically done
Well, it would be more suitable to do it now, before I declare the render function finished. If I only bolted it on after the game had been tested, I might have to start testing all over again just because of 3DFX. And I think it will be pretty hard to find 3dfx beta testers; I've been browsing several 3dfx forums, but the scene seems pretty dead at present. Or maybe there are simply no problems with running DX 8.1 games on 3dfx hardware.
quote:
and I only recently replaced my Voodoo with a new card
What is your CPU, and which current games were playable? It would be interesting to know how a Voodoo3 (is that what you had?) scales on the current crop of CPUs.

VladR
Avenger game

I'm mainly assuming Voodoo2 and TNT1/2 level stuff here.

1) The market:

a. The Voodoo2 in particular was almost exclusively bought by "gamers".

b. "gamers" are the kind of people who keep their machines reasonably up to date whenever they can afford to. "hardcore gamers" are the ones with the latest stuff (2GHz+ CPUs, shader capable cards etc).

c. Most of the people who had V2s will now be at about Matrox G400 or even GeForce256 level hardware.

d. Some of the people who still have V2s can't afford newer hardware (though you can get a GeForce4 or Radeon 7000 for less than the price of a single PC game at the moment).

e. Some of the people who still have V2s are using second hand machines and/or second hand V2s (handed down from family/friends who were upgrading).

f. Many non-gamers will have seriously under-spec machines, though you're less likely to find Voodoo cards in them, because those are the machines that were advertised and built as home office/student machines and were cheaper than the machines built as "gamer" machines. This is the kind of machine your parents are likely to own. If you find anything more than a plain VGA card in one of these (i.e. anything slightly newer), it'll most likely be an nVidia Riva 128, nVidia TNT, or S3 ViRGE based thing.

g. ISTR reading a statistic that the most common graphics chip out there is an nVidia TNT.

h. Laptops are the big killer: although there have been decent laptop graphics chips available for quite a few years, some pretty recent (i.e. past 3 years) machines have been manufactured with old graphics hardware (between S3 ViRGE and TNT level stuff).



2) Technical issues you might come up against

a. V2 and TNT level chips were the first generation to support single-pass multitexturing. So in terms of single-texture operations they're actually fairly good, but not so amazing by today's standards.

b. If you think modern drivers have some bad bugs, some of the ones back then were truly hideous, meaning that with some old Voodoo drivers in particular, many device caps are almost meaningless.

c. As mentioned above, the TNT and V2 can multitexture: they'll do 2 textures in a single pass. However, they severely restrict what you can do in the second stage. Basically you're limited to lightmapping, additive envmapping, and emboss bump mapping; anything else just won't work. Don't expect the more complex blends (DOT3, BLEND, etc.) to work in either stage, either.

d. Textures on the V2 are limited to 16-bit only, must be square, must be power-of-2, and have a maximum size of 256x256. 1555 and 4444 are supported, and (memory fading) I think a few unusual 8-bit formats (luminance only, maybe even RGB332).

e. The TNT could do 16- and 32-bit textures, with dimensions limited to 1024x1024 (or was that the TNT2?); they had to be power-of-2, and I think square was preferred.

f. Multitexturing on the V2 mapped each D3D texture stage onto a dedicated TMU (Texture Mapping Unit); this is what allowed the SLI configuration (Scan Line Interleaving: 2 V2s connected together for more power). It does, however, introduce a restriction specific to Voodoos: when you load a texture, it has to be "bound" to a specific TMU (TMU1 or TMU2) at load time, and that texture can then ONLY be used with the TMU it was bound to.
For example, if you had "prettygrass.tga" loaded and created as a D3D texture for TMU1 and wanted to use that texture again in TMU2, you'd have to create a copy of it (i.e. load it twice).

g. The V2 had its texture memory separate from its frame buffer memory, and IIRC the amount the drivers reported for video memory was actually framebuffer + texture memory, but it wasn't all always usable (can't remember exactly; too long ago ;o)

h. Frame buffer blends are limited on the older chips: expect ONE, ZERO, SRCALPHA, INVSRCALPHA, SRCCOLOR, and not much else. Some chips are limited even more than that; for example the PowerVR PCX1 and PCX2 can't do ONE:ONE (which is why, when games were only tested on V2s, PVR owners weren't happy when they got black outlines round their glowy stuff).

i. Don't expect render-to-texture on the V2 (see point g), and it'll be flaky on most old chips anyway. Search Gamasutra for "Kim Pallister" (Intel DRG); he did an article on the support for render targets, workarounds, etc.

j. Likewise, don't expect most modern pixel processing features to be supported (clip planes etc.). For vertex processing you'll be using software, so you can do all of that just fine (texgen for envmapping etc.), CPU speed permitting of course.

k. Remember that old hardware is most likely to be in old machines, and you'll be using software vertex processing. Combined with the fill-rate limits of the old chips, that means you'll have to be much more aggressive about your poly counts (i.e. stay at, say, 10,000 per frame and keep everything else under control if you want to maintain 60Hz). A rough caps-checking sketch for the restrictions above follows this list.
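To make those caps checks concrete, here is a minimal C++/D3D8 startup sketch for the restrictions listed above. It assumes an already-created IDirect3DDevice8; the function name and the particular flags tested are illustrative, not a complete list:

#include <windows.h>
#include <d3d8.h>

// Minimal sketch: query the caps that matter on V2/TNT class hardware.
void CheckLegacyCaps(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    // Texture restrictions (V2: max 256x256, square, power-of-2 only).
    bool pow2Only   = (caps.TextureCaps & D3DPTEXTURECAPS_POW2) != 0;
    bool squareOnly = (caps.TextureCaps & D3DPTEXTURECAPS_SQUAREONLY) != 0;
    DWORD maxSide   = caps.MaxTextureWidth < caps.MaxTextureHeight
                      ? caps.MaxTextureWidth : caps.MaxTextureHeight;

    // Single-pass multitexture: V2/TNT class reports 2 simultaneous textures.
    DWORD maxStages = caps.MaxSimultaneousTextures;

    // Complex stage ops (DOT3 etc.) simply won't be there on this class of chip.
    bool canDot3 = (caps.TextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3) != 0;

    // Frame buffer blends: ONE:ONE (additive glow) is missing on e.g. PCX1/PCX2.
    bool canAdditive = (caps.SrcBlendCaps  & D3DPBLENDCAPS_ONE) != 0 &&
                       (caps.DestBlendCaps & D3DPBLENDCAPS_ONE) != 0;

    // ...select single-texture / blend-free fallback paths accordingly.
}

Remember that on old Voodoo drivers in particular (point b), these bits can lie, so treat them as a first filter, not the final word.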

If you're planning on supporting old hardware, there really isn't any substitute for getting the real hardware and doing your own live tests. Having just the caps, for example, isn't enough, because most states etc. are "combinatorial", i.e. whether they're supported depends entirely on the settings of other states (e.g. "you can do this, but ONLY if filtering is set to bilinear and FSAA is disabled").
The programmatic way to find out whether those combinations are OK on a particular chip is to combine the caps with ValidateDevice() calls (assuming decent drivers...).
The easier, and only other, way is to actually test everything and tinker.
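In practice the ValidateDevice() pattern looks roughly like this (a sketch; the lightmapping state block is just an example of "the exact states you intend to render with"):

// Set up the state combination you actually intend to draw with...
device->SetTextureStageState(0, D3DTSS_COLOROP, D3DTOP_MODULATE);
device->SetTextureStageState(1, D3DTSS_COLOROP, D3DTOP_MODULATE);  // e.g. a lightmap
device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);

// ...then ask the driver whether it can really do it, and in how many passes.
DWORD numPasses = 0;
HRESULT hr = device->ValidateDevice(&numPasses);
if (FAILED(hr) || numPasses > 1)
{
    // Not supported in one pass, whatever the individual caps bits claimed:
    // fall back to a simpler state block or go multipass.
}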

--
Simon O''Connor
Creative Asylum Ltd
www.creative-asylum.com

Wow, pretty massive reply, S1CA, thanks!

RE: The Market:
a)-e): Yes, that's what I'm worried about too: that those who bought a 3dfx card at the time won't still have it in their main rig by now.
f): I can't make it run on the whole universe of gfx cards. nVidia, ATI, and 3dfx seem to me like a pretty big part of the gfx market. I know there is also Matrox, but one Matrox user hasn't noticed any problems, so I think it's OK on Matrox. Those with under-spec machines... hm, well, what can I do about them? Just skip them.
g): TNT1? Good to know.
h) Does anybody target their games at laptops if they're full 3D?


RE: technical issues:
a)-j): Well, some of the stuff you mention is pretty scary. Did you have to experience it all yourself? Especially those "combinatorial" issues?
Luckily I've restricted all textures to be square with power-of-two dimensions. Low-detail textures are all 128x128, so no problem there at least. Blending may be a bigger problem, but that can be solved simply by disabling it, and that's it.

k) Ehm, software vertex processing: do you mean by that the transformation of vertices, which is only hardware-accelerated by the T&L units on GF2-level hardware?

quote:
If you're planning on supporting old hardware, there really isn't any substitute for getting the real hardware and doing your own live tests.

I did that for most nVidia cards, but I can't seem to find any 3dfx user in my environment. Hopefully I'll find one soon...

S1CA, thank you very much for your info; you've helped me a lot. Thanks!


VladR
Avenger game

1f) If you're using DirectX 8 or above, the cut-off will be whether there are DX6 drivers available for that hardware. Voodoo cards only just about get away with things because fans have produced partially updated drivers. Almost all really old chips won't have had their drivers updated.

If all you're doing is textured triangles, then practically anything will do those. About the only chip I can think of that has hardware-accelerated 3D but can't do textured triangles was the Matrox Millennium 1. The Matrox Mystique can do texturing, but can't do alpha (though it can do stippled alpha).

Even some multitexture operations and things like Gouraud shading aren't quite perfect on really ancient chips. Some really old ATI chips can't modulate alpha. Old PowerVR PCX1 and PCX2 chips do Gouraud shading, but only interpolate based on luminance rather than doing true 3-channel shading.

Assuming there were drivers available, I'd handle the really ancient chips as a special case that turned EVERYTHING apart from textured polygons off, with maybe some user settings to try to enable things. They'll be able to play the game, but with limited quality. I also wouldn't actively "support" those, i.e. if someone has problems running on one of them you can tell them to update their hardware.

After that level, the next tier I'd have would be "limited 2-stage multitexture", i.e. the Voodoo2 and TNTs, where I'd do things like enable light mapping. A sketch of choosing between such tiers follows below.
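Roughly, picking between those tiers at startup might look like this (an illustrative C++/D3D8 sketch; the tier names and the DOT3 cutoff are just one way to slice it, not from any shipped engine):

enum RenderTier
{
    TIER_TEXTURED_ONLY,   // ancient chips: textured polys, everything else off
    TIER_DUAL_STAGE,      // Voodoo2/TNT class: lightmapping, no fancy blends
    TIER_FULL_FIXED_FUNC  // GeForce class and up
};

RenderTier ChooseTier(const D3DCAPS8& caps)
{
    // No single-pass multitexture at all: treat as "ancient".
    if (caps.MaxSimultaneousTextures < 2)
        return TIER_TEXTURED_ONLY;

    // Two stages but no complex ops (DOT3 etc.): the V2/TNT tier.
    if (!(caps.TextureOpCaps & D3DTEXOPCAPS_DOTPRODUCT3))
        return TIER_DUAL_STAGE;

    return TIER_FULL_FIXED_FUNC;
}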


1h) Most don't, but some do:
http://www.nvidia.com/view.asp?IO=feature_entertain
http://www.nvidia.com/view.asp?IO=pacman

[Look carefully at the laptop screen in the first photo, then at the second link, last screenshot. I wrote that engine; it targeted all of the old graphics cards, even back to the Matrox Mystique.]



2 a-j) "Did you have to experience it all yourself?"

- Yep, see above, on Pac-Man: AIT.

1. We had a set of test machines set up with different OSes and different cards. We'd swap between the different cards and drivers to track down problems.

2. Many problems were actually driver-related. When a problem was found, we'd try workarounds, check the code, etc.; if we still had problems, we'd modify one of the DX SDK samples to reproduce the problem and send it off to the chip vendor concerned. Some fixed their drivers, some told us of other workarounds, and some didn't fix their drivers. All bad driver revisions we'd found were listed in the readme.txt for the game.

3. We borrowed some cards we couldn't get hold of from the IHVs for testing.

4. We attended the Meltdown conference, where each of the IHVs had their own testing suite in which we could test on new and future hardware and talk to the driver and hardware guys about any issues one-to-one. At the hotel bar later, some "off the record" information is usually exchanged, stuff like "that chip was crap, don't bother supporting it; you're seriously better off using a software rasteriser".

5. The publisher's test department also performed their own compatibility testing. [Due to our own internal test procedures it was one of the least-bugged PC games they'd ever seen.]


2k) Yep, I was talking about hardware T&L in the GeForce256 and above (and technically the 3DLabs T&L card (the VX1?) that came out before it).
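In DX8 terms that choice shows up at device creation time. A minimal sketch, assuming pD3D, hWnd, and d3dpp are already set up (error handling omitted):

D3DCAPS8 caps;
pD3D->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

// Hardware T&L only exists on GeForce256-class chips and up;
// Voodoo/TNT class cards need software vertex processing.
DWORD vpFlags = (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
                ? D3DCREATE_HARDWARE_VERTEXPROCESSING
                : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

IDirect3DDevice8* device = NULL;
pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                   vpFlags, &d3dpp, &device);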



--
Simon O''Connor
Creative Asylum Ltd
www.creative-asylum.com

My question is: what do you mean by supporting 3dfx? You aren't writing Glide code, are you? It shouldn't be overly difficult simply to make sure your game works with 3dfx cards, unless texture memory turns out to be an issue, but I wouldn't bother with Glide if that's what you're thinking. 3dfx cards ran OpenGL just fine (NOT Direct3D, though!).

Also, the comment about laptops is important. I'm a laptop user and have a 32MB GeForce2, which I think is nice, but many games in the works won't support it. And upgrading means buying a new laptop (or a desktop, but I like/need a laptop).

~CGameProgrammer( );

"3dfx cards ran OpenGL just fine (NOT Direct3D though!)."

In my experience it was actually the other way round! [For DX7 and below]


I have worked on 2 products that run well, using DX8.1, on Voodoo3 chipsets. They're actually surprisingly fast cards for their age, performing better than the ATI Rage128 and nVidia TNT cards (which the products work on as well). Most multitexture-capable 3D hardware (on the PC) supports DX8.

quote:
Original post by CGameProgrammer
Well, in 1999 I had a Voodoo2 which ran Half-Life well in OpenGL but horribly in Direct3D. But I guess the drivers have been fixed by now.




I'd personally prefer not to use a game with a primarily OpenGL-based engine as a benchmark for the quality of D3D drivers [i.e. I strongly suspect the D3D support in it was a bit of an afterthought].

It was 1998-ish when we were first developing our D3D engine, and subsequently testing on the Voodoo (1, 2, Rush, etc.) stuff, which IIRC required a lot less in the way of workarounds than certain other chips back then. (I'm definitely not saying 3Dfx's early D3D drivers were perfect, not at all; I had plenty of reasons to curse them back then.)

[edit: CGameProgrammer's sig in the quote seems to have killed the message text]


[edited by - S1CA on June 4, 2003 6:29:34 AM]

TO ALL: Sorry for the late reply, I haven't been here for a few days. I'll split my reply into a few posts so that it isn't a two-page reply.

To : S1CA

quote:
1f) If you're using DirectX 8 or above, the cut-off will be whether there are DX6 drivers available for that hardware. Voodoo cards only just about get away with things because fans have produced partially updated drivers. Almost all really old chips won't have had their drivers updated.
I still suppose those DX 8.1 drivers work. I have noticed on other 3dfx forums that they managed to run Unreal 2 under DirectX, so that's pretty cool.
quote:
Assuming there were drivers available, I'd handle the really ancient chips as a special case that turned EVERYTHING apart from textured polygons off, with maybe some user settings to try to enable things.
Exactly my original idea. I probably won't mess with finding out which combinations of blending (or other stuff) work and which don't. I will simply turn it all off, so that the game is at least playable.
quote:
Look carefully at the laptop screen in the first photo, then at the second link, last screenshot. I wrote that engine; it targeted all of the old graphics cards, even back to the Matrox Mystique.
That's a pretty cool game. I played it at my friend's, but it was way too hard for us to finish even the first few levels, so we dropped it. Did it seem too easy to you?
Ehm, and how long did it actually take you to make it work on all those cards?
quote:
2 a-j) "Did you have to experience it all yourself?"

- Yep, see above, on Pac-Man: AIT.
Whoa, was that a sort of punishment from the publisher? Or did you just feel it would be cool to expand the possible market?
quote:
stuff like "that chip was crap, don''t bother supporting it, you''re seriously better off using a software rasteriser".
LOL Could you please tell us specific chips ? I`m just curious...
quote:
due to our own internal test procedures it was one of the least-bugged PC games they'd ever seen
...I take my hat off to you...

S1CA, thanks, I've actually printed all your replies and keep them on my desk so they're within reach at any time. Many THANKS!


VladR
Avenger game

To CGameProgrammer:
quote:
My question is: what do you mean by supporting 3dfx? You aren't writing Glide code, are you? It shouldn't be overly difficult simply to make sure your game works with 3dfx cards, unless texture memory turns out to be an issue, but I wouldn't bother with Glide if that's what you're thinking. 3dfx cards ran OpenGL just fine (NOT Direct3D, though!).
Of course I'm not talking about Glide. Texture memory isn't an issue, since at low detail I use textures at 128x128. My original point in this thread was to find out whether it is worth the time to buy a 3dfx card and adapt the render function according to the device caps. Roughly, I mean something like the sketch below.
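Something like this rough C++/D3D8 sketch (the names are made up for illustration):

bool g_singlePassLightmaps = false;

// Decide once at startup which path the render function will take.
void DetectRenderPath(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);
    g_singlePassLightmaps = (caps.MaxSimultaneousTextures >= 2);
}

void RenderWorld(IDirect3DDevice8* device)
{
    if (g_singlePassLightmaps)
    {
        // Base texture in stage 0, lightmap modulated in stage 1.
    }
    else
    {
        // Pass 1: base texture. Pass 2: lightmap with DESTCOLOR:ZERO blend.
    }
}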

As for the laptops issue, how many 3D chips are actually out there for laptops? There's the GeForce2 Go, some ATI Mobility (is it a Rage Pro?), and I think there's a mobile Savage. Anything else for laptops?

VladR
Avenger game

To SuperDeveloper:
quote:
I'm going to Slovakia at the end of June!

We should meet up for some pivo [beer] and some pekne holki [pretty girls]!
Cool, a developer who speaks Slovak AND drinks beer? Damn, you bet I'm looking forward to having a few (a dozen at least) beers with you. BTW, what project are you working on?
quote:
Which town do you live in?
Košice a je tu kopa nadhernych holiek. [Košice, and there are loads of gorgeous girls here.]


VladR
Avenger game

.."Košice a je tu kopa nadhernych holiek."

Damn I can imagine! My brother went to Bratislava and was SHOCKED by the amount of mlade holki, that, if Christina Agulera was your girlfriend, you''d end up cheat on her hahah!

Actually I'm going in the middle of July, not June; I was wrong. Flying to Praha, then driving through Brno, Piestany, and Zeravice, and finally ending up in Poprad; I have most of my family in Poprad. I'm excited.

Anyhow, I am a Star Wars super-fan, and most of my work revolves around producing space-style games. I have only made a few non-playable demos so far, but I'm working on a little space simulator where I can fly a TIE Fighter into a Star Destroyer.

Who knows when that will be done..

www.cppnow.com
