This topic is now archived and is closed to further replies.

shaft

GL and Vertex Shader Extension Question

Quick question... If I write a program using NVidia's vertex shader extension, does that mean the program will only run on NVidia cards, or does the extension provide a CPU/software implementation so anyone can run it, regardless of their hardware? I've never used extensions, so I don't know much about 'em. Thanks.

NVidia cards only. Use glGetString to determine the type of card the user has and plan for the worst.
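For example, a minimal sketch (my own names, assuming a current GL context) that logs the vendor and tests the extension string before enabling an NVidia-only path:

#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

/* Returns nonzero if `name` appears as a whole token in GL_EXTENSIONS. */
int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    size_t len = strlen(name);
    while (ext) {
        const char *hit = strstr(ext, name);
        if (!hit)
            return 0;
        /* reject prefix matches, e.g. finding "GL_NV_vertex_program"
           inside the token "GL_NV_vertex_program1_1" */
        if ((hit == ext || hit[-1] == ' ') &&
            (hit[len] == ' ' || hit[len] == '\0'))
            return 1;
        ext = hit + len;
    }
    return 0;
}

void choose_renderer(void)
{
    printf("Vendor: %s\n", (const char *)glGetString(GL_VENDOR));
    if (has_extension("GL_NV_vertex_program")) {
        /* safe to use the NVidia vertex program path */
    } else {
        /* plan for the worst: transform vertices on the CPU */
    }
}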

Then how do you use vertex shaders in OpenGL? I want a solution that will work on any system, even one where the card doesn't support vertex shaders and they have to run on the CPU.

Do I have to switch to DirectX, and further support the monolith that is Microsoft?

quote:

Then how do you use vertex shaders in OpenGL? I want a solution that will work on any system, even one where the card doesn't support vertex shaders and they have to run on the CPU.



OpenGL doesn't work that way. OpenGL is not designed to run CPU emulation if hardware is not present; it is designed for fully 100% hardware implementations. Some manufacturers give you a software fallback in their drivers, but only for their own extensions. If you want to take advantage of a specific card's special features, then you have to write special-case code for every major manufacturer (in practice nVidia, ATi, and a generic fallback). OpenGL won't do it for you; you can't just say "hey, I want vertex shaders on any card!" You'll have to invest that time and do it yourself.
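To illustrate (a sketch of my own, reusing the has_extension() helper from the earlier reply; not a complete list of checks), the special-case code usually amounts to picking a codepath once at startup:

int has_extension(const char *name);  /* the GL_EXTENSIONS check shown earlier */

/* Hypothetical per-manufacturer codepath selection. */
enum render_path {
    PATH_NV_VERTEX_PROGRAM,  /* GL_NV_vertex_program (GeForce) */
    PATH_EXT_VERTEX_SHADER,  /* GL_EXT_vertex_shader (Radeon 8500) */
    PATH_GENERIC             /* fixed function, transform on the CPU */
};

enum render_path pick_path(void)
{
    if (has_extension("GL_NV_vertex_program"))
        return PATH_NV_VERTEX_PROGRAM;
    if (has_extension("GL_EXT_vertex_shader"))
        return PATH_EXT_VERTEX_SHADER;
    return PATH_GENERIC;
}

Every effect you write then needs an implementation for each path you care about.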

Or you could wait for OpenGL 2...

/ Yann

Well, your apps will never run on every card if you want any newer features, since the i81x chips don't run crap.

You can just code for the nVidia and ATI versions for now, and update it when the OpenGL 2.0 spec (maybe 1.3 or 1.4) comes out. Most games in development are scheduled to come out in 9-18 months, and by that time OpenGL should have an implementation for it in the spec.

>>OpenGL doesn't work that way. OpenGL is not designed to run CPU emulation if hardware is not present<<

Actually this is one of the things that differentiates D3D from OpenGL: OpenGL is meant to emulate a feature in software if the hardware can't do it, while D3D just ignores it.
Apart from my usual rants on shaders, I get the feeling a lot of beginner programmers think they have to use them (I'm not knocking anyone personally).
Why do they think they have to use them?
I believe they say to themselves, "my terrain engine (or whatever) looks crap compared to such-and-such game; give me shaders and my engine will look so much better." But folks, it doesn't work that way.
Give Jeff Beck a beat-up acoustic guitar and he'll make good music; give a dude off the street the top-of-the-range guitar plus all the effects and he won't make good music.
I don't want to blow my own horn, but the shots below in Gotterdammerung were all made by me on a Vanta; I think the most advanced thing I used was multitexture.

http://uk.geocities.com/sloppyturds/gotterdammerung.html

quote:

Actually this is one of the things that differentiates D3D from OpenGL: OpenGL is meant to emulate a feature in software if the hardware can't do it


Not really. I have yet to see one single per-fragment feature that is correctly emulated in software while the rest stays in hardware. It simply doesn't work; the hardware does not support that. With 99% of advanced features, it's one or the other: full software or full hardware.

Especially with extensions, there is no general software fallback. If the hardware doesn't support a feature, it simply does not define the extension. Period.

Vertex programs could be an exception, since they are software-emulatable. But as I mentioned, this is not the way it works. With some luck, you'll get one manufacturer that supports a software codepath for his *own* shader extensions (e.g. vertex programs on a GF2), but those are incomplete most of the time and still manufacturer-dependent. There is not, and never will be, intercompatibility between extensions across manufacturers. You'll *never* get a software emulation of NV_vertex_program on your Radeon card (at least, not until nVidia merges with ATi).

Oh well, we're all waiting for OGL 2.0...

/ Yann


>>Not really. I have yet to see one single per-fragment feature that is correctly emulated in software while the rest stays in hardware. It simply doesn't work; the hardware does not support that. With 99% of advanced features, it's one or the other: full software or full hardware.<<

From what I've seen:
DOT3 bumpmapping on my Vanta
some of the ARB_imaging on my GF2 MX

http://uk.geocities.com/sloppyturds/gotterdammerung.html

quote:

some of the ARB_imaging on my GF2 MX


Yes, granted, the imaging extensions are sometimes simulated in software, by reading back the framebuffer and performing the operations on the CPU. But that's because the imaging subset is not an integral part of the fragment pipeline; it can also operate on the framebuffer alone. The accumulation buffer is another such example: it's perfectly software-emulatable (better not to talk about performance, though...)
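As a rough sketch of that readback pattern (my own illustration; a trivial brightness scale stands in for a real imaging operation):

#include <GL/gl.h>
#include <stdlib.h>

/* Emulating a framebuffer-only operation on the CPU: read the pixels
   back, process them, then draw them again. Assumes the raster
   position (0,0) maps to the lower-left of the viewport, e.g. via a
   matching ortho projection. */
void scale_framebuffer(int w, int h, float factor)
{
    unsigned char *buf = (unsigned char *)malloc((size_t)w * h * 4);
    if (!buf)
        return;
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, buf);
    for (int i = 0; i < w * h * 4; ++i) {
        int v = (int)(buf[i] * factor);
        buf[i] = (unsigned char)(v > 255 ? 255 : v);
    }
    glRasterPos2i(0, 0);
    glDrawPixels(w, h, GL_RGBA, GL_UNSIGNED_BYTE, buf);
    free(buf);
}

Note the pipeline stall the readback implies; that's the performance caveat above.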

quote:

DOT3 bumpmapping on my Vanta


Impossible. If a card does not support HW DOT3 in its combiner pipeline (I don't know about the Vanta, but I guess not), then there is *no* way to emulate it in software, other than doing the whole rendering through a total software fallback.

You can't extract pixels from the fragment pipeline, do software stuff on them, and put them back onto the GPU. The hardware does not allow that; there is no GPU-external access point. This is why software emulation of 'advanced' features is so problematic: it's not an API question (D3D has exactly the same problem), it's a hardware limitation. Just as with, e.g., the stencil buffer: if your 3D card doesn't support it in HW, then *everything* falls back to software.
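To make that concrete, a sketch (my own, reusing has_extension() from earlier in the thread) of the only safe approach: test for the combiner extension up front and pick a simpler effect when it's missing:

#include <GL/gl.h>
#include <GL/glext.h>  /* ARB combine / DOT3 tokens */

int has_extension(const char *name);  /* as in the earlier sketch */

/* Set up DOT3 combining on the active texture unit, but only if the
   hardware advertises it; there is no emulated middle ground. */
int try_enable_dot3(void)
{
    if (!has_extension("GL_ARB_texture_env_dot3"))
        return 0;  /* caller falls back, e.g. to plain multitexture */
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB, GL_DOT3_RGB_ARB);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB, GL_TEXTURE);
    glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB, GL_PRIMARY_COLOR_ARB);
    return 1;
}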

/ Yann


In response to "Why would I do this advanced stuff when it's not necessary"... Because I want to.

The demo I'm working on has character animation in it. Linear interpolation is lame; the results you get with a spline curve rather than linear interpolation are amazing. Unfortunately, moving along a spline per vertex, per frame causes a bit of a performance hit. And I just read about shaders.
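For reference, a sketch of the per-component math (my own illustration): linear interpolation versus a Catmull-Rom spline through four neighbouring keyframes, which is what makes the per-vertex, per-frame cost add up:

/* Linear interpolation between two keyframe values; t in [0,1]. */
float lerp(float a, float b, float t)
{
    return a + t * (b - a);
}

/* Catmull-Rom spline between keyframes p1 and p2, using neighbours
   p0 and p3 for the tangents; t in [0,1]. */
float catmull_rom(float p0, float p1, float p2, float p3, float t)
{
    float t2 = t * t, t3 = t2 * t;
    return 0.5f * (2.0f * p1 +
                   (-p0 + p2) * t +
                   (2.0f * p0 - 5.0f * p1 + 4.0f * p2 - p3) * t2 +
                   (-p0 + 3.0f * p1 - 3.0f * p2 + p3) * t3);
}

Run per vertex per frame, the spline costs several times the lerp, which is exactly the kind of work a vertex shader could absorb.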

I'm not trying to create a complete game; I'm just creating a graphical demo to show off the scripting engine I created, and learning graphics at the same time. So I'm doing this stuff because I want to, and for no real critical reason.

Thanks for the input, y'all. I think I'm going to switch to DX (argh... supporting the MS monopoly). There seems to be a lot of documentation on shaders in DX, and DX isn't as bad as I remember it.

>>In response to "Why would I do this advanced stuff when it's not necessary"... Because I want to.<<
>>I'm not trying to create a complete game; I'm just creating a graphical demo to show off the scripting engine I created<<

Hmmm, I get the feeling you think D3D is more advanced than OpenGL (when in fact the opposite is true).
OpenGL will ALWAYS be more advanced UNTIL D3D adds extensions also.
Currently the only way D3D advances is when MS releases a new version; with extensions there's new stuff every month. Have you ever asked yourself WHY 90% of new graphical stuff is accessible in OpenGL before it is in D3D?
Simple: if you want best performance and the bleeding edge, you choose OpenGL (no ifs or buts).

Sorry about the rant, but I do feel people get fed wrong info a lot of the time.

http://uk.geocities.com/sloppyturds/gotterdammerung.html

quote:

Thanks for the input, y'all. I think I'm going to switch to DX


Hmm, giving up as soon as a little challenge arises. Easy, isn't it? Oh well, I wish you good luck with D3D. No, it's not as bad as it used to be. But if you feel that OpenGL is too complex for you, and therefore want to move to D3D, then I'd recommend having a nice wall somewhere around your computer, in case you want to bang your head against it in frustration...

I'm not switching from GL because it's too hard; in fact, it's much easier than DX. But DX meets my requirements better: I want shaders, and I want them to work on any system regardless of the card vendor (it's a demo I'll be sending out to people).

If I were starting this project 4 months from now, I would probably stay with GL, because hopefully 2.0 would be out by then. But I can't wait that long.

Hmmm... ZedZeek: don't be too proud of this OpenGL extension thingy. I was an OpenGL programmer for 4 years, then I switched to D3D because of all the crap extensions I had to deal with. Ever heard the saying "OpenGL: Death by Extensions"? There's no point programming vendor-specific, non-ARB extensions if you want to release your program for the masses.

As for which is more advanced than the other, I'd have to say neither. They both do the exact same thing. I should know: I have now done a lot of programming in both, and I see no difference. The only thing I like about D3D is that I do not have to contend with extensions at all. Which is really sweet.

(Note: this is not to start a flame war or anything, but I was just correcting ZedZeek on this point. And no, I am not one-sided, because I use BOTH APIs on a daily basis.)

Guest Anonymous Poster
Vertex shaders are one of the few things that MS emulates in SW if the card can't do them in HW. Things like DOT3, cubemaps, pixel shaders and so on you still have to check the CAPS for, and checking the CAPS is similar to checking extensions (ARB and EXT at least), so you have the same hassle in DX.
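For what it's worth, here is roughly what that looks like in DX8 (a sketch from memory; verify the names against the SDK headers):

#include <d3d8.h>

/* Check for HW vertex shader 1.1 support; if it's missing, create the
   device with software vertex processing and D3D runs the shaders on
   the CPU - the emulation mentioned above. */
DWORD pick_vertex_processing(IDirect3D8 *d3d)
{
    D3DCAPS8 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    if (caps.VertexShaderVersion >= D3DVS_VERSION(1, 1))
        return D3DCREATE_HARDWARE_VERTEXPROCESSING;
    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;  /* CPU fallback */
}

The returned flag goes into CreateDevice. The pixel-side caps (PixelShaderVersion, TextureOpCaps and friends) have no such software path, which is the point above.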
