GLSL shaders on old cards

7 comments, last by rewolfer 14 years, 9 months ago
I just tried my program on a friend's PC, which has a GeForce 6800, and it threw one of those "send / don't send" Windows error dialogs. I'm using GLEW to handle extensions, and for shaders I'm just using functions such as glUseProgram, glUniform, glShaderSource etc. Am I supposed to use other functions in order to support older architectures? According to GLEW his card only supports OpenGL 1.5, and I'm not sure when GLSL programs started becoming supported. I googled a bit, and it seems people do use shaders on 6800s, so is it a programming fault, or should he update his drivers? (He's getting back to me on the version number.)

Thanks,
rewolfer

Edit: he apparently has driver version 6.6.7.4 (September 2004), so I told him to get 190.38 now.
AFAIK GLSL was introduced in 1.5 but as an extension at first. So you might use the ...ARB versions in that case.

Edit: according to this Wikipedia article it was introduced as a 1.4 extension and promoted to core in 2.0 (which means that pre-2.0 the functions would probably carry the ...ARB suffix).
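For what it's worth, a minimal sketch of the ARB path might look like the following, assuming GLEW is already initialised and GL_ARB_shader_objects / GL_ARB_vertex_shader / GL_ARB_fragment_shader are present (buildProgramARB and the source-string parameter names are just placeholders):

#include <GL/glew.h>

/* Compile and link a GLSL program through the ARB entry points. */
GLhandleARB buildProgramARB(const char *vertSrc, const char *fragSrc)
{
    GLhandleARB vs = glCreateShaderObjectARB(GL_VERTEX_SHADER_ARB);
    glShaderSourceARB(vs, 1, &vertSrc, NULL);
    glCompileShaderARB(vs);

    GLhandleARB fs = glCreateShaderObjectARB(GL_FRAGMENT_SHADER_ARB);
    glShaderSourceARB(fs, 1, &fragSrc, NULL);
    glCompileShaderARB(fs);

    GLhandleARB prog = glCreateProgramObjectARB();
    glAttachObjectARB(prog, vs);
    glAttachObjectARB(prog, fs);
    glLinkProgramARB(prog);
    return prog;
}

/* At draw time, use the ARB equivalents of glUseProgram / glUniform*: */
/* glUseProgramObjectARB(prog);                                        */
/* glUniform3fARB(glGetUniformLocationARB(prog, "lightDir"), x, y, z); */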
Thanks, the ARB extension works. :)
If I change my app to use the old extension-provided functions to support older cards, will it affect performance on newer cards? Or should I rather bind the appropriate functions at run time after checking the OpenGL version?
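One common approach to the second option is to look at GLEW's version and extension flags once at start-up and route calls through whichever path is available. A rough sketch, assuming glewInit() has already succeeded (useCoreShaderAPI is a hypothetical flag in your own code):

#include <GL/glew.h>

int useCoreShaderAPI = 0;

void chooseShaderPath(void)
{
    if (GLEW_VERSION_2_0) {
        /* core entry points: glUseProgram, glUniform*, glShaderSource, ... */
        useCoreShaderAPI = 1;
    } else if (GLEW_ARB_shader_objects &&
               GLEW_ARB_vertex_shader &&
               GLEW_ARB_fragment_shader) {
        /* extension entry points: glUseProgramObjectARB, glUniform*ARB, ... */
        useCoreShaderAPI = 0;
    } else {
        /* no GLSL support at all: fall back to the fixed-function pipeline */
    }
}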
I just tested the new version on my laptop (which is also OpenGL 1.5) and it is so slow. As soon as I turn off the shaders it's fine, though. Does it revert to software emulation, or is the Radeon X700 just too weak?
I'm not sure. How complex are your shaders?
Not complicated at all: it just does the simple lighting models (Blinn, Phong, diffuse) per pixel for a directional light, and there's only one light, so it's a single pass. No reason it should be slow, in my opinion.
Well, I'm not entirely sure, but if you use instructions that aren't compatible with the GPU's supported shader model (which is 2.0 for the X700), it might fall back to software emulation, at least for vertex shaders.

Also, if you read many textures in the shaders, the data bus might be the bottleneck.

You could try really simple shaders (transformation only in the VS, and just outputting the vertex color in the PS, as in the sketch below) to check whether the shaders might be too complex for the card.
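Something like this, written as plain C string literals, is about the simplest pair you could test with. It's GLSL 1.10-style and only uses the built-in attributes, so it should compile on those older cards; feed the strings to whichever compile path (core or ARB) your app uses:

static const char *testVertSrc =
    "void main()\n"
    "{\n"
    "    gl_FrontColor = gl_Color;      /* pass the vertex color through */\n"
    "    gl_Position   = ftransform();  /* fixed-function transform      */\n"
    "}\n";

static const char *testFragSrc =
    "void main()\n"
    "{\n"
    "    gl_FragColor = gl_Color;       /* interpolated vertex color */\n"
    "}\n";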
The Radeon X700 should have GL 2.1 drivers.
My old Radeon 9500 even has GL 2.1 support.
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
/* build a model-view matrix with glhlib, then upload it and its inverse */
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Well, it's a Mobility Radeon, though that shouldn't really make a difference. I went to the ATI site, but none of the drivers they supply actually recognise my card, so I went to the Acer (laptop brand) site for drivers, but apparently their latest video driver is the one I've already got. Strange. Maybe GLEW is just bad at identifying the OpenGL version, because it says my GTX 295 is only OpenGL 2.0; GLEW_VERSION_2_1 is 0.
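One way to cross-check what GLEW reports is to print what the driver itself returns once a context is current (GLEW's version flags are only meaningful if glewInit() runs after context creation, as far as I know). A quick sketch; printGLInfo is just a placeholder name:

#include <stdio.h>
#include <GL/glew.h>

void printGLInfo(void)
{
    /* all of these require a current GL context */
    printf("GL_VENDOR   : %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER : %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION  : %s\n", (const char *)glGetString(GL_VERSION));

    const GLubyte *glsl = glGetString(GL_SHADING_LANGUAGE_VERSION);
    printf("GLSL version: %s\n", glsl ? (const char *)glsl : "not reported");
}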
The most complicated thing in the shader is the if-statement; maybe that's the problem... Nope, just tried removing the branch, and it's still slow.

AND it draws all the mesh's edges for some reason.

Anyways, I'm not going to waste time fiddling.
Thanks for the help

rewolfer

This topic is closed to new replies.
