OpenGL and Vertex and Pixel Shaders

I was checking the nvidia site to get some info on how to run vertex and pixel shaders with the OpenGL SDK (great stuff! a must-download for any developer!) BUT... I currently have an nvidia Riva TNT2 (don't worry, I already ordered my GeForce 2, it should arrive in a month or two), and it doesn't say anything about how to use vertex shaders on a card without hardware support for them. Is there a way to use these shaders on my Riva, at least for testing?
Correct me if I am wrong, but I don't believe that OpenGL has "vertex and pixel shaders" like you would have in Direct3D. So that could be your problem right there.

------------------------------
Trent (ShiningKnight)
E-mail me
OpenGL Game Programming Tutorials
I think there is something similar, but the reason you can't get it in software is that it is not standard OpenGL; rather, it is an extension which will only work if your current video card driver supports it. You will need to test for hardware support and, if it is there, use the shader; if not, use another method.

Anything that is standard OpenGL will automatically fall back to software rendering if the card does not support it.
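For example, here is a rough way to test for an extension at runtime (just a sketch; HasExtension is a name I made up, and it assumes you already have a current GL context):

#include <GL/gl.h>
#include <cstring>

// Returns true if `name` appears as a complete token in the driver's
// extension string.
bool HasExtension(const char* name)
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    if (!ext)
        return false;

    const std::size_t len = std::strlen(name);
    for (const char* p = ext; (p = std::strstr(p, name)) != 0; p += len)
    {
        // Make sure we matched a whole token, not a prefix of a
        // longer extension name.
        if ((p == ext || p[-1] == ' ') && (p[len] == ' ' || p[len] == '\0'))
            return true;
    }
    return false;
}

// Usage:
//   if (HasExtension("GL_NV_vertex_program"))
//       ; // use vertex programs
//   else
//       ; // fall back to another method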

Seeya
Krippy

Edited by - krippy2k on June 28, 2001 1:36:35 AM
OpenGL does have vertex and pixel shaders; they are accessible via extensions. Pixel shaders aren't accessible unless your hardware has support for them, but vertex programs (the OpenGL equivalent of vertex shaders) are emulated by the drivers for older GeForce cards (not sure about TNT cards).
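For example, a minimal NV_vertex_program setup looks roughly like this (sketch only: the !!VP1.0 program just does the standard transform and passes the colour through, and on Windows the *NV entry points have to be fetched with wglGetProcAddress, which I've left out):

#define GL_GLEXT_PROTOTYPES   // so glext.h declares the NV entry points
#include <GL/gl.h>
#include <GL/glext.h>
#include <cstring>

// !!VP1.0 program: transform the position by the tracked
// modelview-projection matrix in c[0]..c[3], pass the colour through.
static const char kProg[] =
    "!!VP1.0\n"
    "DP4 o[HPOS].x, c[0], v[OPOS];\n"
    "DP4 o[HPOS].y, c[1], v[OPOS];\n"
    "DP4 o[HPOS].z, c[2], v[OPOS];\n"
    "DP4 o[HPOS].w, c[3], v[OPOS];\n"
    "MOV o[COL0], v[COL0];\n"
    "END\n";

void SetupVertexProgram()
{
    GLuint id;
    glGenProgramsNV(1, &id);
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, id,
                    (GLsizei)std::strlen(kProg),
                    (const GLubyte*)kProg);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, id);

    // Keep c[0]..c[3] tracking the current modelview-projection matrix.
    glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                    GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);

    glEnable(GL_VERTEX_PROGRAM_NV); // fixed function is used while disabled
}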
-----------------------"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault" - John Carmack
DX8 has some software support for this. Vertex shaders are pretty OK in software, but I think you can get some big slowdowns compared to the fixed function pipeline. Pixel shaders are hopeless without the proper hardware.
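For example, in DX8 you ask for software vertex processing when you create the device (a trimmed-down sketch; CreateSwVpDevice is my name for it, and real code should check every HRESULT):

#include <d3d8.h>

// Create a windowed device that runs vertex shaders on the CPU.
IDirect3DDevice8* CreateSwVpDevice(IDirect3D8* d3d, HWND hWnd)
{
    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed         = TRUE;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN; // use the current display format

    IDirect3DDevice8* dev = 0;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, // CPU vertex shaders
                      &pp, &dev);
    return dev;
}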

In OpenGL, vertex programs are the same thing as vertex shaders for nvidia cards. I think "texture shaders" is used instead of "pixel shaders". ATI and others are working on an extension with functions instead of the assembly language used in the original vertex shaders.

You need a GF3 for full hardware support. Any GeForce can also be used with OpenGL but some features will be emulated.
Correct me if I'm wrong, but can't anything be software emulated? Pixel shading (I'm assuming this means bump maps) would be very easy in software. You would have a little equation for the angle the pixel should be shaded at, based on its surrounding pixels' heights. I can't describe it all right here, not without some diagrams, but I'm going to write an article about it now so I can work out all the details. This will be a nice project. If you want to read it, email me and I'll tell you when it's done. I'll do it tomorrow.

(http://www.ironfroggy.com/)(http://www.ironfroggy.com/pinch)
Ok here is what I got:

If the values of the map at the eight pixels around the current pixel are A through H, starting from the top-left and going clockwise from there, the angles on the X and Y axes can be found with these equations:

Y = (H - D) + (C - G)/2 + (A - E)/2
X = (B - F) + (C - G)/2 + (A - E)/2

Either I am horribly wrong or a genius, cause I just scribbled a few things down and got that in about a minute.

You can use the light angles on this data to get the bumps.
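In code it would look something like this (a sketch; `h` is a hypothetical greyscale height map, and it only works for pixels that aren't on the border):

// Heights of the 8 neighbours, labelled A..H clockwise from the
// top-left, as above. `h` is w pixels wide, indexed h[y * w + x].
void SlopeAt(const float* h, int w, int x, int y, float& X, float& Y)
{
    float A = h[(y - 1) * w + (x - 1)];
    float B = h[(y - 1) * w + x];
    float C = h[(y - 1) * w + (x + 1)];
    float D = h[y * w + (x + 1)];
    float E = h[(y + 1) * w + (x + 1)];
    float F = h[(y + 1) * w + x];
    float G = h[(y + 1) * w + (x - 1)];
    float H = h[y * w + (x - 1)];

    // The equations above: straight neighbours at full weight,
    // diagonals at half weight.
    Y = (H - D) + (C - G) / 2 + (A - E) / 2;
    X = (B - F) + (C - G) / 2 + (A - E) / 2;
}

// You can then dot (X, Y, 1) against the light direction to shade
// each pixel.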

(http://www.ironfroggy.com/)(http://www.ironfroggy.com/pinch)
Yes, well, you can emulate just about anything you want in software if you write the code. Doesn't necessarily mean it will be fast enough to be useful though.

Seeya
Krippy
Pixel shaders != pixel shading
I suggest you go read up on pixel shaders @ nVidia's developer site for a proper explanation of what they are. They are a damn lot better than bump mapping alone, that's for sure.
-----------------------"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault" - John Carmack
