glUniform*f seems to... not work.

I'm having the weirdest problem. It seems like glUniform*f just isn't working right. This is reproducible on multiple systems, all using Quadros and the latest nVidia drivers, so I feel like it must be something I'm doing wrong. I just can't figure out what that something is. To make matters more fun, the same code works for int uniforms with only the slight modifications you'd expect. Here's the relevant code:

char c[1024] = { 0 };
char * source[1] = { "uniform float mine; void main() { gl_FragColor = vec4(mine, 0.0, 0.0, 1.0); }" };

int p = glCreateProgram();
int s = glCreateShader(GL_FRAGMENT_SHADER);
printf("glerror=%d\n", glGetError());
glShaderSource(s, 1, source, NULL);
glCompileShader(s);
glAttachShader(p, s);
glLinkProgram(p);

int len;
glGetProgramInfoLog(p, 1023, &len, c);
printf("program info:\n%s\nend of program info.\n", c);
printf("isProgram(%d)=%s\n", p, glIsProgram(p) ? "true" : "false");
glUseProgram(p);

int cp = -1;
glGetIntegerv(GL_CURRENT_PROGRAM, &cp);
printf("current program=%d\n", cp);

printf("program handle=%d\n", p);
printf("shader handle=%d\n", s);

int loc = glGetUniformLocation(p, "mine");
printf("uniform location=%d\n", loc);
glUniform1f(loc, 0.7f);

float retval;
glGetUniformfv(p, loc, &retval);
printf("getUniformfv(%d, %d)=%f\n", p, loc, retval);

// Draw something...
Here's what that code prints:

glerror=0
program info:

end of program info.
isProgram(1)=true
current program=1
program handle=1
shader handle=2
uniform location=0
getUniformfv(1, 0)=36893488147419103232.000000
That... isn't right. 36893488147419103232 is exactly 2^65 (0x2 0000 0000 0000 0000 in hex), which might be coincidental, or might be indicative of some deep problem. If I change the glUniform1f(loc, 0.7f) to 0.5f, everything's the same, except the glGetUniformfv call now prints 0.000000. Which is still not equal to 0.5.

Relevantly, I've run this in glsldevil, and its trace shows glUniform1f executing with the messed-up values. I'm not sure where it injects itself into the GL pipeline, though, so I'm not sure what that indicates. And, again, when I use integer uniforms, all of this works fine. Which is fine, if I want to pass all my parameters as shorts.

Help? The actual complete test code follows.
#define GL_GLEXT_PROTOTYPES 1 /* needed so GL/gl.h and GL/glext.h declare the GL 2.0 functions */
#include <GL/gl.h>
#include <GL/glext.h>
#include <GL/glut.h>
#include <stdio.h>

int p;

void draw() {
  glUseProgram(p);
  glClear(GL_COLOR_BUFFER_BIT);
  glBegin(GL_TRIANGLE_STRIP); {
    glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f, 0.0f);
    glTexCoord2f(1.0f, 0.0f); glVertex2f(1.0f, 0.0f);
    glTexCoord2f(0.0f, 1.0f); glVertex2f(0.0f, 1.0f);
    glTexCoord2f(1.0f, 1.0f); glVertex2f(1.0f, 0.8f);
  } glEnd();
  glutSwapBuffers();
}

char * shaders[1] = {
  "uniform float mine; uniform int imine; void main() { gl_FragColor = vec4(mine, 0.0, 0.0, 1.0); }"
};

int main( int argc, char *argv[] ) {
  char c[1024];
  c[0] = 0;
  
  glutInitWindowSize( 400, 400 );
  glutInit( &argc, argv );
  glutInitDisplayMode( GLUT_RGB | GLUT_DOUBLE | GLUT_DEPTH );
  glutCreateWindow( "OpenGL Demonstration 12" );
  
  p = glCreateProgram();
  int s = glCreateShader(GL_FRAGMENT_SHADER);
  printf("glerror=%d\n", glGetError());
  glShaderSource(s, 1, shaders, NULL);
  glCompileShader(s);
  glAttachShader(p, s);
  glLinkProgram(p);
  int len;
  glGetProgramInfoLog(p, 1023, &len, c);
  printf("program info:\n%s\nend of program info.\n", c);
  printf("isProgram(%d)=%s\n", p, glIsProgram(p) ? "true" : "false");
  glUseProgram(p);
  int cp = -1;
  glGetIntegerv(GL_CURRENT_PROGRAM, &cp);
  printf("current program=%d\n", cp);
   
  printf("program handle=%d\n", p);
  printf("shader handle=%d\n", s);

  int loc = glGetUniformLocation(p, "mine");
  //int loc = glGetUniformLocation(p, "imine");
  printf("uniform location=%d\n", loc);
  glUniform1f(loc, 0.7f);
  //glUniform1i(loc, 100);

  float retval;
  glGetUniformfv(p, loc, &retval);
  printf("getUniformfv(%d, %d)=%f\n", p, loc, retval);
  
  glMatrixMode(GL_PROJECTION);
  glLoadIdentity();
  glOrtho(0.0, 1.0, 0.0, 1.0, 0.0, 1.0);
  
  glMatrixMode(GL_MODELVIEW);
  glLoadIdentity();
  glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
  
  glutDisplayFunc(draw);  
  glutMainLoop();
  return 0;
}
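Edit: one thing the code above never does is check compile or link status explicitly, so here's the extra sanity check I'd bolt on after glCompileShader / glLinkProgram, just to rule out a silent shader failure (plain GL 2.0 calls, nothing exotic):

  GLint ok = 0;
  glGetShaderiv(s, GL_COMPILE_STATUS, &ok);
  if (!ok) {
    char slog[1024];
    GLsizei n = 0;
    glGetShaderInfoLog(s, sizeof(slog), &n, slog);
    printf("shader compile failed:\n%s\n", slog);
  }
  glGetProgramiv(p, GL_LINK_STATUS, &ok);
  printf("link status=%d\n", ok);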
I've just tried your example program under Linux with the 64-bit 100.14.03 nVidia drivers and an 8600 GTS card, and it works OK. (I had to change the type of the shaders global to const GLchar *.)

The output was:

glerror=0
program info:

end of program info.
isProgram(1)=true
current program=1
program handle=1
shader handle=2
uniform location=0
getUniformfv(1, 0)=0.700000


It looks like it might be a driver problem. It might be worth trying an earlier version just in case that makes any difference.
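If you do end up swapping driver versions, it might also be worth printing the GL strings right after glutCreateWindow, so you can confirm which driver the program is actually picking up; something like:

  printf("GL_VENDOR   = %s\n", (const char *) glGetString(GL_VENDOR));
  printf("GL_RENDERER = %s\n", (const char *) glGetString(GL_RENDERER));
  printf("GL_VERSION  = %s\n", (const char *) glGetString(GL_VERSION));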
That's comforting and also spooky. I'd expect a lot more people to be suffering from a driver issue like this. (I'm on Linux64 too, nVidia driver 100.14.23. I was using the stable 100.24.11 before, though, and that exhibited the same behavior.)

I'll try some older drivers, and if they work, I'll file a bug with nVidia.
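In the meantime, the integer path does work, so I can fake fixed-point floats through it as a stopgap; roughly this (the scale factor of 1000 is arbitrary):

  /* CPU side: push the float through the working glUniform1i path */
  int iloc = glGetUniformLocation(p, "imine");
  glUniform1i(iloc, (int)(0.7f * 1000.0f));

with the shader reading it back as:

  uniform int imine;
  void main() { gl_FragColor = vec4(float(imine) / 1000.0, 0.0, 0.0, 1.0); }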

This topic is closed to new replies.
