athile

Current support for GLSL vs. ARB_fragment_program

This topic is 4836 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.


Recommended Posts

Does anyone have any practical experience with how well cards support the ARB_fragment_program extension versus GLSL? I realize that, in theory, since GLSL and assembly fragment programs both eventually reach the hardware in the same card-specific form, it shouldn't matter whether I write my shaders in GLSL or in ARB assembly. However, I tend to find that with video drivers, what should be true and what actually is true don't always match.

To restate my question: if I write my shaders using the ARB_fragment_program and ARB_vertex_program extensions, are more cards likely to support my shaders (and support them correctly) than if I use the newer GLSL language? There must be more GLSL driver bugs out there since it's newer, but I'm wondering whether the difference is enough to justify the extra work of writing in ARB assembly rather than high-level GLSL. I'd like to support as many older cards as possible for my engine effects.

[Also, I'm aware that cgc can convert GLSL to ARB assembly, so most likely I would code my shaders in GLSL and then use the automatically converted assembly as a starting point for the assembly versions of the shaders.]

AFAIK, development has stopped on ARB_fragment_program. It's also not as powerful as GLSL: it doesn't support loops, etc.
In other words, high-level shader languages are the future.

GLSL drivers on Nvidia and ATI have matured greatly by now, so they are usable in commercial products.

Cards that can do ARB_vp and ARB_fp also support GLSL, except perhaps the Volari, but who owns a Volari?

The exception is the Realizm (3Dlabs), which only supports GLSL.

Cards that can do ARB_vp only, such as the GF3/GF4, support GLSL vertex shaders.
I think the old ATI cards don't support GLSL at all, although they could, just like the GF3/GF4.

ARB_fragment_shader was completed by the ARB around 8-10 months after ARB_fragment_program, so I would guess that all cards that support ARB_fp support ARB_fs as well.

GLSL is used by ARB_fs as well as by OpenGL 2.0, so they are essentially the same. Your best bet overall (since OpenGL 2.0 support is still lacking in the drivers) is to use ARB_fs.

But please inform me if I am way off here.

GLSL is the future.
Support is good and widespread, so you should be using it.
As zedzeek pointed out, the ARB_fp and _vp extensions are no longer being updated and never even made it into the core. As such, they will quickly become outdated, and for someone new to shader programming it isn't worth learning them.

Thank you for the feedback. I'm thinking ARB_fragment_shader is the way to go. Also, using the "OpenGL Hardware Registry" (http://www.delphi3d.net/hardware/allexts.php), it looks like I could probably use ARB_vertex_program to squeeze some vertex-only effects onto older cards [but, as hitman200ca said, it's probably not worth using ARB_fragment_program, since most cards that support it also support ARB_fragment_shader].

As for the "GLSL support is good" comment, I would like to put an asterisk on that statement. As I said, I'm rather new to shader programming, but I've already run into the fact that my Radeon 9800 supports the ARB_fragment_shader extension - but then complains that shadow2DProj is an unknown keyword when compiling a shader that uses it. That same shader works fine on a new nVidia card (I'm running the latest drivers on both). And the gl_ClipVertex keyword doesn't seem to be supported by nVidia (yet, anyway; I didn't bother trying on ATI). I eventually found some documentation about nVidia's lack of support for gl_ClipVertex, but good luck finding anything about why shadow2DProj isn't supported on ATI cards. There are workarounds for both of these issues, so it's not a big deal. The only point I'm trying to make is that, even with my limited experience, I'd say GLSL support has a ways to go before it's mature.

Thanks for the replies everyone.

Quote:
Original post by athile
As for the "GLSL support is good" comment, I would like to put an asterisk on that statement. As I said, I'm rather new to shader programming, but I've already run into the fact that my Radeon 9800 supports the ARB_fragment_shader extension - but then complains that shadow2DProj is an unknown keyword when compiling a shader that uses it. ... The only point I'm trying to make is that, even with my limited experience, I'd say GLSL support has a ways to go before it's mature.


I don't know why it doesn't work for you, but I just tested an old shader of mine which uses shadow2DProj(), and it compiles and works on my X800XT, which is basically the same core as the 9800XT. As such, you either (a) did something wrong when using it or (b) need to update your drivers [smile]

The fact that many, many software vendors are using GLSL in their products is proof that it's ready for use.


I agree. Make sure you check the version strings at the top of your program, just in case. Maybe, phantom, you could post your shader that uses shadow2DProj().

Here is the code to check your card's driver information:


#include <stdio.h>
#include <GL/gl.h>   // GL_SHADING_LANGUAGE_VERSION may also need <GL/glext.h>

// Print OpenGL driver info (requires a current GL context)
printf("GL_VENDOR: %s\n", (const char *)glGetString(GL_VENDOR));
printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));
printf("GL_SL_VERSION: %s\n", (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION));
printf("GL_RENDERER: %s\n", (const char *)glGetString(GL_RENDERER));
printf("GL_EXTENSIONS: %s\n", (const char *)glGetString(GL_EXTENSIONS));
printf("\n\n");


Make sure the shading language version is 1.10.
This is what it should look like if you have the latest ATI driver set as of August 2005:

GL_VENDOR: ATI Technologies Inc.
GL_VERSION: 2.0.5220 WinXP Release
GL_SL_VERSION: 1.10
GL_RENDERER: MOBILITY RADEON 9600 x86/SSE2
GL_EXTENSIONS: GL_ARB_multitexture GL_EXT_texture_env_add GL_EXT_compiled_vertex_array
GL_S3_s3tc GL_ARB_depth_texture GL_ARB_fragment_program GL_ARB_fragment_program_shadow
GL_ARB_fragment_shader GL_ARB_multisample GL_ARB_occlusion_query GL_ARB_point_parameters
GL_ARB_point_sprite GL_ARB_shader_objects GL_ARB_shading_language_100 GL_ARB_shadow
GL_ARB_shadow_ambient GL_ARB_texture_border_clamp GL_ARB_texture_compression
GL_ARB_texture_cube_map GL_ARB_texture_env_add GL_ARB_texture_env_combine
GL_ARB_texture_env_crossbar GL_ARB_texture_env_dot3 GL_ARB_texture_mirrored_repeat
GL_ARB_transpose_matrix GL_ARB_vertex_blend GL_ARB_vertex_buffer_object
GL_ARB_vertex_program GL_ARB_vertex_shader GL_ARB_window_pos GL_ATI_draw_buffers
GL_ATI_element_array GL_ATI_envmap_bumpmap GL_ATI_fragment_shader
GL_ATI_map_object_buffer GL_ATI_separate_stencil GL_ATI_texture_env_combine3
GL_ATI_texture_float GL_ATI_texture_mirror_once GL_ATI_vertex_array_object
GL_ATI_vertex_attrib_array_object GL_ATI_vertex_streams GL_ATIX_texture_env_combine3
GL_ATIX_texture_env_route GL_ATIX_vertex_shader_output_point_size GL_EXT_abgr
GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_func_separate GL_EXT_blend_minmax
GL_EXT_blend_subtract GL_EXT_clip_volume_hint GL_EXT_draw_range_elements
GL_EXT_fog_coord GL_EXT_framebuffer_object GL_EXT_multi_draw_arrays
GL_EXT_packed_pixels GL_EXT_point_parameters GL_EXT_rescale_normal
GL_EXT_secondary_color GL_EXT_separate_specular_color GL_EXT_shadow_funcs
GL_EXT_stencil_wrap GL_EXT_texgen_reflection GL_EXT_texture3D
GL_EXT_texture_compression_s3tc GL_EXT_texture_cube_map GL_EXT_texture_edge_clamp
GL_EXT_texture_env_combine GL_EXT_texture_env_dot3 GL_EXT_texture_filter_anisotropic
GL_EXT_texture_lod_bias GL_EXT_texture_mirror_clamp GL_EXT_texture_object
GL_EXT_texture_rectangle GL_EXT_vertex_array GL_EXT_vertex_shader
GL_HP_occlusion_test GL_NV_blend_square GL_NV_occlusion_query GL_NV_texgen_reflection
GL_SGI_color_matrix GL_SGIS_generate_mipmap GL_SGIS_multitexture
GL_SGIS_texture_border_clamp GL_SGIS_texture_edge_clamp GL_SGIS_texture_lod
GL_SUN_multi_draw_arrays GL_WIN_swap_hint WGL_EXT_extensions_string WGL_EXT_swap_control

It's a pretty simple shader:

uniform sampler2DShadow shadowMap;
uniform sampler2D lightTex;

varying vec4 projCoord;

const vec4 ambient = vec4(0.13), boost = vec4(1.06);

void main()
{
    vec4 lightValue = texture2DProj(lightTex, projCoord);
    vec4 shadowValue = shadow2DProj(shadowMap, projCoord);
    gl_FragColor = boost * gl_Color * lightValue * shadowValue + ambient;
}



Works perfectly (if slowly on my X800, as there is a problem with depth readback which causes the framerate to die).

Thanks, phantom. I figured out my problem and, as is usually the case, it was embarrassingly simple. I had declared my sampler as a "sampler2D", not a "sampler2DShadow". It works on both my ATI and my nVidia cards now. I appreciate the help.
