Archived

This topic is now archived and is closed to further replies.

JonnyQuest

Texture blending - hardware specifications?


Hi, I've been experimenting with texture blending (after all, not everyone has pixel shaders, and in theory you can do a lot with blending as well). The problem is that blending support varies a lot between hardware. For example, the Riva TNT apparently supports 8 stages (!), while the ATI Rage 128 supports only 3, the third of which is very limited. Those are the only cards I could find details for. I tried the TNT setup on my GeForce4 440 Go, but it didn't work (DX's ValidateDevice reported a conflicting render state or something) - I can't even get 3 stages to work, only 2.

My question is: are there vendor specifications, white papers, etc. that I can refer to for such things? The enumerated data is pretty useless, as MaxTextureBlendStages in DX reports 8 for my GeForce4 Go - well, it seems I'm too stupid to even get 3 to work. The Rage 128 correctly reports 3, by the way (I had a friend check).

Remember, I'm talking about blending stages here, NOT simultaneous textures. I want to do blending between the various stage registers (texture, temp, diffuse, current, tfactor, etc.). So, are there any online documents available for that purpose? I can't seem to find any. If they're only available to nVidia registered developers, I think I'm going to cry...

EDIT: By the way, I realize I can use at least as many blending stages as the maximum number of textures, but some cards apparently support more, so it'd be nice to be able to use those.

- JQ
Full Speed Games. Coming soon.

[edited by - JonnyQuest on September 3, 2002 9:19:20 AM]

Apparently, the Nvidia hardware allows you to use those 8 stages in a special way in order to enable access to the hardware combiners. I've never used this myself, but there's an old paper on Nvidia's site describing how to do this for the TNT.

I'm not sure if you can find out the number of "normal" texture stages programmatically - in my code I've simply assumed it's the same as the maximum number of simultaneous textures for Nvidia hardware. Better suggestions are welcome.

[edited by - spock on September 3, 2002 9:52:10 AM]

I know that TNT paper - I was referring to it in my post above, but it hasn't been very helpful.

A non-programmatic version would be an improvement already - I can implement graphics card model detection if that's necessary, and then look it up in there.

- JQ
Full Speed Games. Coming soon.

Well, I just wanted to point out that the reason you can't use 3 stages on the GeForce4 440 Go is that the 8 stages are a (very) special case. The number of "normal" stages is not indicated in the D3D caps (unless there's some obscure bit of Nvidia documentation somewhere explaining how we're supposed to do it). That one had me stumped at first; I incorrectly assumed 1-8 stages would be usable.

I'm having a hard time getting these things to work properly myself, so I'm not in a position to help you. Of course, what REALLY gets you mad is when ValidateDevice() reports that everything is fine and it still won't work because of driver issues.

Ouch, at least the ValidateDevice() problem hasn't affected me so far.

And yes, I do realize that the 8 stages are special; unfortunately, they seem to be so special that they are hardly usable. Using triadic functions (LERP, etc.), you can probably do almost the same thing in only 2 stages on modern hardware.

I'm not accusing you of not being able to help, by the way - I'm just pretty pissed off at the vendors for not documenting this, and right now I'm a bit touchy on the subject because I wasted a weekend on it.

- JQ
Full Speed Games. Coming soon.
