DUDVim

glsl gl_LightSource array must be redeclared


Hello, I have some problems compiling this vertex program on my ATI X1600.
vec4 GetLight(int iLightID)
{
	return gl_LightSource[iLightID].ambient; // built-in array indexed with a variable
}

void main()
{
	//gl_FrontColor = gl_LightSource[0].ambient; // constant index compiles fine
	gl_FrontColor = GetLight(0);
	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}


The error I get is:

ERROR: 0:3: '[' : array must be redeclared with a size before being indexed with a variable
ERROR: 1 compilation errors. No code generated.

As gl_LightSource is a built-in uniform array, one would think it should work anyway, since the compiler already knows its size. If I try to redeclare it with a size, it tells me 'gl_LightSource' : reserved built-in name :(. I also tried the code with the 3DLabs GLSL Validator, and it says the code is valid. Does anyone have any idea why my ATI card hates this code? :(
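[Editor's note: for illustration only, not part of the original post, a minimal sketch of one way to sidestep the variable index: keep the loop bound a literal so the compiler can unroll the loop, leaving only constant indices into gl_LightSource. NUM_LIGHTS is an assumed constant and this is untested on the X1600 driver in question.]

// Illustrative workaround sketch: accumulate ambient terms with a literal
// loop bound so that, after unrolling, every index into the built-in array
// is a compile-time constant. NUM_LIGHTS is an assumption, not from the thread.
#define NUM_LIGHTS 2

void main()
{
	vec4 ambient = vec4(0.0);
	for (int i = 0; i < NUM_LIGHTS; ++i)
		ambient += gl_LightSource[i].ambient; // constant after unrolling
	gl_FrontColor = ambient;
	gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}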

I am having exactly the same problem with my Radeon X600. Does anyone know what's wrong? I'm assuming it is a hardware-specific GLSL implementation issue. Is there anywhere you can find information on this? I can't find much on the ATI developer site, but perhaps you need to be a registered ATI developer or something...

If you access the gl_LightSource array with a constant index, it works, e.g. gl_LightSource[5].linearAttenuation compiles fine, but gl_LightSource[x].linearAttenuation (where x is a variable) does not.
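[Editor's note: building on that observation, a hedged sketch, not from the thread, of a workaround that only ever indexes the built-in array with literals. GetLightAmbient is a hypothetical helper and the number of cases is an assumption; extend the chain for as many lights as the application actually enables.]

// Illustrative only: map a runtime light index onto constant indices so the
// built-in array is never indexed with a variable.
vec4 GetLightAmbient(int iLightID)
{
	if (iLightID == 0) return gl_LightSource[0].ambient;
	if (iLightID == 1) return gl_LightSource[1].ambient;
	return gl_LightSource[2].ambient;
}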

