Noxoa

OpenGL Shaders, OpenGL 2.0 questions


Hi, I have a few questions about shaders and graphics cards, so here we go:

1) Compatibility question: I have a GeForce3 Ti200. I know it has vertex shaders, but does it have hardware-supported pixel shaders? I would say yes (see the NVIDIA Chameleon demo), but I don't find any GL_ARB_fragment_program in the delphi3d.net hardware specs for my card. So in theory it would not be possible to write the usual pixel shaders. Am I wrong?

2) OpenGL 2.0 question: How do you know if a graphics card is compatible with OpenGL 2.0? Which ones are compatible? Is it a matter of GL_ARB extensions? If so, which extensions say that the card is OpenGL 2.0 compatible? Any URL?

3) Which graphics card would you choose for hardware OpenGL 2.0 shader programming?

Thanks in advance!

1) It probably uses NVIDIA's own shader interface to do the shading; you'll want to look for those extensions (on a GeForce3 that means GL_NV_texture_shader and GL_NV_register_combiners, rather than GL_ARB_fragment_program).
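
Something like this would do the check (a minimal untested sketch in C, assuming a GL context is already current; the plain strstr test is naive and can false-positive on extension names that are prefixes of longer ones):

#include <string.h>
#include <GL/gl.h>

/* Returns 1 if 'name' appears in the driver's extension string. */
static int has_extension(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, name) != NULL;
}

/* GeForce3-era pixel shading is exposed through NVIDIA's own
   extensions rather than GL_ARB_fragment_program: */
int gf3_can_pixel_shade(void)
{
    return has_extension("GL_NV_texture_shader")
        && has_extension("GL_NV_register_combiners");
}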

2) It will probably be compatible with large chunks of OpenGL 2.0, but certainly not the GLSL spec (well, you might get vertex shader support).

3) Probably an NV 6800GT, but that would kinda be dependent on NV sorting out their drivers to support reading of textures in a vertex program and looping/branching in fragment programs.

1/ The GF3 has limited pixel shaders; see here: http://www.anandtech.com/video/showdoc.aspx?i=2195&p=3

Basically (as far as NVIDIA goes) you need at least a GeForce FX to do fragment programs, but even those are limited, e.g. they can't do branching; the new Radeons + GF6800 + 3Dlabs cards can.

The new Radeons can't branch in the pixel shader. They can use the old "compute both results and pick the relevant one" type of branching, but the old Radeons could do that as well.

>>The new Radeons can't branch in the pixel shader.<<
I don't have a Radeon (only a GFFX), but I was under the impression, from a piece on Humus's webpage a few months ago, that the new Radeons could do proper branching in the shaders.

I don't seem to have any problem with branching in GLSL with the 66.81 beta drivers (on a 6800GT), although it doesn't work with the latest "official" drivers on NVIDIA's site. (Side note, branching also worked on my Radeon 9500 Pro with Catalyst 3.7, but of course I don't know what the GLSL compiler was doing under the hood.)

I'm pretty sure vertex texture reads work fine on 6800/66.81 too (but I gotta double-check that when I get home).

(Side question: Anyone know if the Radeon X-series can do vertex texture reads?)

Hi there. Well, I have a Radeon 9600 Pro and it supports OpenGL 2.0. The idea of how to find out is simple: check which extensions have been promoted to core features in the OpenGL 2.0 specification, as new features over OpenGL 1.5. Then check your OpenGL extension string and see if those extensions are in it. For shading there must be extensions like "shader objects", "vertex shader" and "fragment shader" (GL_ARB_shader_objects, GL_ARB_vertex_shader, GL_ARB_fragment_shader); these became core features only in version 2.0 and are meant to be used with GLSL.
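
For example, a minimal sketch of that check in C (assuming a GL context is already current; the plain strstr test can match substrings of longer extension names, so treat it as illustrative only):

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

void report_glsl_support(void)
{
    const char *version = (const char *)glGetString(GL_VERSION);
    const char *exts    = (const char *)glGetString(GL_EXTENSIONS);

    /* A driver that reports version 2.0 has GLSL in the core. */
    printf("GL_VERSION: %s\n", version ? version : "(no context?)");

    /* Otherwise, look for the extensions that were promoted
       to core features in OpenGL 2.0: */
    if (exts != NULL
        && strstr(exts, "GL_ARB_shader_objects") != NULL
        && strstr(exts, "GL_ARB_vertex_shader") != NULL
        && strstr(exts, "GL_ARB_fragment_shader") != NULL)
    {
        printf("GLSL available via the ARB extensions\n");
    }
}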

Quote:
Original post by lancekt
(Side note, branching also worked on my Radeon 9500 Pro with Catalyst 3.7, but of course I don't know what the GLSL compiler was doing under the hood.)

Remember that

bool test;          // result of some condition
float val1 = 1.0;
float val2 = 2.0;
float valout;
if (test) valout = val1;
else      valout = val2;

can be done as

bool test;          // same condition as above
float val1 = 1.0;
float val2 = 2.0;
// branch-free select: blend between the two values
// (note GLSL uses constructor-style casts, float(test), not C-style)
float valout = float(test)*val1 + (1.0 - float(test))*val2;
// equivalently: float valout = mix(val2, val1, float(test));

Along with various other crazy stuff to fake looping and branching.

[Edited by - joanusdmentia on October 7, 2004 7:06:35 PM]
