GLSL & ATI

6 comments, last by aerkis 12 years, 8 months ago
Hello,
I have a strange crash with a simple GLSL shader when I try to use it on an ATI card (it works perfectly on NVidia cards), but I don't know why. Is there something particular to do for ATI cards?
Here is my GLSL code, which tries to sample the current texture in the vertex shader.

Vertex shader

#extension GL_ATI_shader_texture_lod : enable

uniform sampler2D tex;
varying vec4 texColor;
void main()
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
    texColor = texture2DLod(tex, gl_TexCoord[0].st, 0.0);
}


Fragment shader

varying vec4 texColor;
void main()
{
    gl_FragColor = texColor;
}


With this simple test code, the crash seems to occur inside the ATI driver (atioglxx.dll), even though I have the latest drivers.
Does anyone have an idea how to fix this problem?
Thank you in advance.
Perhaps because there is no "GL_ATI_shader_texture_lod" anymore, and trying to enable it makes the driver crash? Just delete the #extension line.
No, if I remove "GL_ATI_shader_texture_lod", the code still crashes. The strange thing is that it seems to work at first (the texture is drawn), but after a refresh my application crashes.
I've encountered a similar problem using DirectX and ATI. There are device caps for whether you can fetch from textures in the vertex shader, and also for the maximum number of texture units available there. Some ATI cards report themselves as being able to fetch from textures in the vertex shader but report the maximum number of units as 0. This way they don't have to support vertex texture fetch but still comply with the appropriate shader model.
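The OpenGL counterpart of that cap is GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS: a value of 0 means the driver exposes GLSL but cannot actually sample textures in the vertex shader. A minimal sketch of the check at startup (the fallback function name is made up for illustration):

GLint maxVertexTextureUnits = 0;
glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS, &maxVertexTextureUnits);
if (maxVertexTextureUnits == 0)
{
    /* Vertex texture fetch is not usable on this card:
       switch to shaders that sample in the fragment stage instead. */
    useFragmentSamplingFallback(); /* hypothetical fallback hook */
}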
So the only solution seems to be waiting for new ATI drivers?
In my case it was a hardware problem, since the card simply didn't support it, so a driver update would not be able to fix it. ATI's alternative to VTF was R2VB (render to vertex buffer), but I don't know if that is also available in OpenGL or if it suits your needs. Newer ATI cards should support VTF, so you'll need to find out which cards don't and let those use a fallback (see the sketch below).
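For a shader like the one above, the simplest fallback is to drop the vertex-stage fetch entirely and sample in the fragment shader, which every card supports. A sketch, assuming texColor is only used the way shown:

Fallback vertex shader

void main()
{
    gl_Position = ftransform();
    gl_TexCoord[0] = gl_MultiTexCoord0;
}

Fallback fragment shader

uniform sampler2D tex;
void main()
{
    gl_FragColor = texture2D(tex, gl_TexCoord[0].st);
}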
Where does the crash occur in the host code? Is it inside a GL call? Which one?

What's the format of the texture bound to that slot?
In fact, the crash is in the ATI drivers (in the host code it happens in a safe method, not in a GL call). The texture format is RGBA 32 bit, uploaded with glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data), and it works perfectly on NVidia cards.
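For reference, the texture setup looks roughly like this (a sketch reconstructed from the description above; the filtering calls are assumptions, not the exact code). Note that the default minification filter (GL_NEAREST_MIPMAP_LINEAR) expects mipmaps, and without mipmap levels the texture is incomplete, which is worth ruling out when texture2DLod misbehaves:

GLuint texId = 0;
glGenTextures(1, &texId);
glBindTexture(GL_TEXTURE_2D, texId);

/* Assumed filtering state: non-mipmapped filters so the texture
   is complete even when only level 0 is uploaded. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* RGBA 32-bit upload as described above. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, data);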

This topic is closed to new replies.
