dnaxx

Using GL_EXT_vertex_shader



Hello, I am currently using an "old" graphics card (an ATI Radeon 9000 Mobility) which does not support GL_ARB_vertex_shader, so I thought that maybe GL_EXT_vertex_shader could be used instead - is this possible? In detail, I want to do the multitexturing in a shader (one pass) instead of using the multitexture extension with multiple passes. [EDIT]Another question: is there a chance to use a shader on both ATI and nVidia cards without using GL_ARB_vertex_shader?[/EDIT] Greetings,

If the card supports it, yes.

NVidia doesn't support this extension as far as I am aware; they have their own GL_NV_vertex_program instead, so you need a code path for each vendor. But all NVidia cards from the GeForce 3 up support GL_ARB_vertex_program.

You could write a parser that translates ARB vertex programs into GL_EXT_vertex_shader setup calls. I have never used the EXT extension, but I am pretty sure you can record the setup in a display list and later call the list to invoke the shader.
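
A minimal sketch of the usual startup check for this, in C (has_ext and the use_*_path functions here are illustrative helpers, not library calls; glGetString needs a current GL context):

#include <GL/gl.h>
#include <string.h>

/* Returns 1 if 'name' appears as a complete token in the extension
   string. A bare strstr() can false-positive when one extension name
   is a prefix of another. */
int has_ext(const char *name)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    const char *p;
    size_t len = strlen(name);

    if (exts == NULL)
        return 0;
    for (p = exts; (p = strstr(p, name)) != NULL; p += len) {
        if ((p == exts || p[-1] == ' ') &&
            (p[len] == ' ' || p[len] == '\0'))
            return 1;
    }
    return 0;
}

/* Hypothetical per-vendor code paths. */
extern void use_arb_path(void), use_ext_path(void),
            use_nv_path(void), use_fixed_function_path(void);

void choose_vertex_path(void)
{
    if (has_ext("GL_ARB_vertex_program"))      use_arb_path();
    else if (has_ext("GL_EXT_vertex_shader"))  use_ext_path();  /* ATI R200 class */
    else if (has_ext("GL_NV_vertex_program"))  use_nv_path();
    else                                       use_fixed_function_path();
}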

Quote:
Original post by dnaxx
...In detail, I want to do the multitexturing in a shader (one pass) instead of using the multitexture extension with multiple passes...
What are you trying to do that would require more than one pass with just the multitexture extension, while only requiring one pass with shaders?

Currently I do the following:

pass 1: RGB texture1
pass 2: RGB texture2, blended in using the alpha of an RGBA texture (texture1a, only its alpha is used).
pass 3: RGB texture3, blended in using the alpha of another RGBA texture (texture2a). Then a lightmap is modulated on top.

I already tried to reduce the number of passes (each extra pass costs a lot of FPS), but I could not get it working. Another issue, I think, is that nVidia cards only support 4 texture units (do they?).
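
For reference, the three passes above can usually be collapsed into a single pass with GL_ARB_multitexture plus GL_ARB_texture_env_combine, and it fits in 4 texture units (GeForce 3/4 do expose only 4; the Radeon 9000 exposes 6). A sketch, under the assumption that the blend masks are packed into the alpha channels of texture1 and texture2 themselves; with the separate texture1a/texture2a alpha textures you would additionally need GL_ARB_texture_env_crossbar and more units:

/* Single-pass version of the three passes above (untested sketch).
   Assumes glActiveTextureARB has been loaded via wglGetProcAddress /
   glXGetProcAddress, tex1/tex2 are RGBA with their blend masks in
   alpha, tex3 is RGB, and every unit gets its own texture coords. */

/* Unit 0: base layer; GL_REPLACE also forwards tex1's alpha to unit 1. */
glActiveTextureARB(GL_TEXTURE0_ARB);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex1);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_REPLACE);

/* Unit 1: result.rgb = tex2.rgb * prev.a + prev.rgb * (1 - prev.a) */
glActiveTextureARB(GL_TEXTURE1_ARB);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex2);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_INTERPOLATE_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_PREVIOUS_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE2_RGB_ARB,  GL_PREVIOUS_ARB);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND2_RGB_ARB, GL_SRC_ALPHA);
/* Forward tex2's own alpha so unit 2 can blend with it. */
glTexEnvi(GL_TEXTURE_ENV, GL_COMBINE_ALPHA_ARB,  GL_REPLACE);
glTexEnvi(GL_TEXTURE_ENV, GL_SOURCE0_ALPHA_ARB,  GL_TEXTURE);
glTexEnvi(GL_TEXTURE_ENV, GL_OPERAND0_ALPHA_ARB, GL_SRC_ALPHA);

/* Unit 2: identical GL_INTERPOLATE_ARB setup, with tex3 bound, so
   tex3 is blended in by the alpha forwarded from unit 1. */
glActiveTextureARB(GL_TEXTURE2_ARB);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, tex3);
/* ...repeat the unit-1 glTexEnvi calls here... */

/* Unit 3: modulate the lightmap onto the accumulated color. */
glActiveTextureARB(GL_TEXTURE3_ARB);
glEnable(GL_TEXTURE_2D);
glBindTexture(GL_TEXTURE_2D, lightmap);
glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);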

Quote:
Original post by Name_Unknown
If the card supports it, yes.

NVidia doesn't support this extension as far as I am aware; they have their own GL_NV_vertex_program instead, so you need a code path for each vendor. But all NVidia cards from the GeForce 3 up support GL_ARB_vertex_program.

You could write a parser that translates ARB vertex programs into GL_EXT_vertex_shader setup calls. I have never used the EXT extension, but I am pretty sure you can record the setup in a display list and later call the list to invoke the shader.

Can I use the same shader programs for both card types, or are big modifications required?
I also want to use GL_ATI_fragment_shader. Is there a similar nVidia extension?

Quote:
Original post by dnaxx
Quote:
Original post by Name_Unknown
If the card supports it, yes.

NVidia doesn't support this extension as far as I am aware; they have their own GL_NV_vertex_program instead, so you need a code path for each vendor. But all NVidia cards from the GeForce 3 up support GL_ARB_vertex_program.

You could write a parser that translates ARB vertex programs into GL_EXT_vertex_shader setup calls. I have never used the EXT extension, but I am pretty sure you can record the setup in a display list and later call the list to invoke the shader.

Can I use the same shader programs for both card types, or are big modifications required?
I also want to use GL_ATI_fragment_shader. Is there a similar nVidia extension?


Unfortunately, no. ATI and NVidia tended to use their own proprietary extensions until GLSL debuted, and both now support GLSL. Both also support the ARB assembly extensions (ARB_vertex_program / ARB_fragment_program), but ATI has apparently stopped developing its support for those and now fully supports only GLSL.

Pre-GLSL cards are notoriously a pain to support, because you have to use the vendor-specific path for each.

You could create something like a DirectX 8 shader parser and translate the result to the vendor extensions. I suspect this is roughly how DirectX 8 shaders actually work on the hardware: NVidia had to translate them to register combiners, and ATI to its own fragment-shader hardware (I don't know how it really works, but the hardware is similar to how it is exposed in OpenGL, so I am guessing).

Or you code separate paths for the cards. You can usually set up your shader state in a display list once and then invoke the display list. nvparse has a limited DX8 shader parser; ATI provides no comparable tools at all. You have discovered what a lot of developers already know: pre-GLSL OpenGL shaders are a pain in the neck.
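
A sketch of that per-vendor dispatch, including the display-list trick mentioned above (the setup_* functions are hypothetical placeholders, the extension names are real, and has_ext() is the helper from the earlier sketch):

#include <GL/gl.h>

int has_ext(const char *name);               /* from the earlier sketch */
extern void setup_arb_fragment_path(void);   /* hypothetical */
extern void setup_ati_fragment_path(void);   /* hypothetical */
extern void setup_fixed_function_path(void); /* hypothetical */

static GLuint combiner_list;

/* The display-list trick: record the verbose combiner state once,
   then replay it with a single glCallList() when rendering. */
static void setup_nv_combiner_path(void)
{
    combiner_list = glGenLists(1);
    glNewList(combiner_list, GL_COMPILE);
    /* ... glCombinerParameteriNV / glCombinerInputNV /
           glCombinerOutputNV calls would go here ... */
    glEndList();
}

void choose_fragment_path(void)
{
    if (has_ext("GL_ARB_fragment_program"))
        setup_arb_fragment_path();   /* Radeon 9500+, GeForce FX and up */
    else if (has_ext("GL_ATI_fragment_shader"))
        setup_ati_fragment_path();   /* Radeon 8500/9000/9200 class */
    else if (has_ext("GL_NV_register_combiners"))
        setup_nv_combiner_path();    /* GeForce 256 through GeForce 4 */
    else
        setup_fixed_function_path(); /* plain multitexture fallback */
}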

Thanks, I think I get it now ;) - I'll buy a new GeForce 6 card and use GLSL on my desktop PC instead of on my notebook.
