glGetUniformLocation not working

4 comments, last by Maxjen 10 years, 7 months ago

Hi,

I am trying to use uniform variables in my shader but for some reason glGetUniformLocation always returns -1. I am mostly following Lazy Foo's tutorial. Here is the relevant code:


PointLineShader::PointLineShader() {
    GLuint vertexShaderId = loadShader("Shader/pointLine.vert", GL_VERTEX_SHADER);
    GLuint fragmentShaderId = loadShader("Shader/pointLine.frag", GL_FRAGMENT_SHADER);

    programId = glCreateProgram();
    glAttachShader(programId, vertexShaderId);
    glAttachShader(programId, fragmentShaderId);
    glLinkProgram(programId);

    glDetachShader(programId, vertexShaderId);
    glDetachShader(programId, fragmentShaderId);

    glDeleteShader(vertexShaderId);
    glDeleteShader(fragmentShaderId);

    positionLocation = glGetAttribLocation(programId, "in_Position");
    if (positionLocation == -1) {
        printf("%s is not a valid glsl program variable!\n", "in_Position");
    }

    colorLocation = glGetAttribLocation(programId, "in_Color");
    if (colorLocation == -1) {
        printf("%s is not a valid glsl program variable!\n", "in_Position");
    }

    // this doesn't work
    projectionMatrixLocation = glGetUniformLocation(programId, "projectionMatrix");
    if (projectionMatrixLocation == -1) {
        printf("%s is not a valid glsl program variable!\n", "projectionMatrix");
    }

    modelViewMatrixLocation = glGetUniformLocation(programId, "modelViewMatrix");
    if (modelViewMatrixLocation == -1) {
        printf("%s is not a valid glsl program variable!\n", "modelViewMatrix");
    }
}
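
Incidentally, glGetUniformLocation also returns -1 if the link failed, so checking the link status would rule that out. A minimal sketch, assuming the same programId as above (needs #include <vector>):

    GLint linked = GL_FALSE;
    glGetProgramiv(programId, GL_LINK_STATUS, &linked);
    if (linked != GL_TRUE) {
        GLint logLength = 0;
        glGetProgramiv(programId, GL_INFO_LOG_LENGTH, &logLength);
        std::vector<GLchar> log(logLength > 0 ? logLength : 1);
        // The info log explains why the link failed.
        glGetProgramInfoLog(programId, (GLsizei)log.size(), NULL, log.data());
        printf("Program link failed: %s\n", log.data());
    }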

And the vertex shader:


#version 400

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;

in vec2 in_Position;
in vec4 in_Color;
out vec4 ex_Color;
 
void main(void)
{
	gl_Position = gl_ProjectionMatrix * gl_ModelViewMatrix * vec4(in_Position.x, in_Position.y, 0.0, 1.0);
	//gl_Position = projectionMatrix * modelViewMatrix * vec4(in_Position.x, in_Position.y, 0.0, 1.0);
	ex_Color = in_Color;
}

To initialize OpenGL I am using SDL2 and GLEW. I noticed that I set the OpenGL version to 3.2 in SDL, but in the shaders I have "#version 400". If I change or remove that directive, the shader won't compile.

I am not sure if it is related, but textures also don't work right now, even though I am using the same code as in my other program, which uses SFML, except for the image-loading functions.


Call glUseProgram before getting the attributes. Also, "projectionMatrix", "modelViewMatrix", and possibly "in_Color" won't exist since they are being optimized out.
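
A quick way to see what the linker actually kept is to enumerate the active uniforms; a rough sketch, assuming the programId from your constructor:

    GLint count = 0;
    glGetProgramiv(programId, GL_ACTIVE_UNIFORMS, &count);
    for (GLint i = 0; i < count; ++i) {
        GLchar name[256];
        GLint size = 0;
        GLenum type = 0;
        // Writes the uniform's name into name; size/type describe it.
        glGetActiveUniform(programId, (GLuint)i, sizeof(name), NULL, &size, &type, name);
        printf("active uniform %d: %s\n", i, name);
    }

If "projectionMatrix" and "modelViewMatrix" don't show up in that list, the compiler optimized them out.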

[size="1"]And a Unix user said rm -rf *.* and all was null and void...|There's no place like 127.0.0.1|The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.
[size="2"]

Call glUseProgram before getting the attributes. Also, "projectionMatrix", "modelViewMatrix", and possibly "in_Color" won't exist since they are being optimized out.

That didn't work. :(

Also, I think the variables that are being optimized out are gl_ProjectionMatrix, etc. That is why I am trying to use my own variables.

Maybe I am doing something wrong with the initialization? This is how I create my OpenGL context:


    if (SDL_Init(SDL_INIT_VIDEO) < 0) // Initialize SDL's Video subsystem
        sdldie("Unable to initialize SDL"); // Or die on error

    // Request an OpenGL 3.2 context.
    // SDL doesn't have the ability to choose which profile at this time of writing,
    // but it should default to the core profile
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);

    // Turn on double buffering with a 24bit Z buffer.
    // You may need to change this to 16 or 32 for your system
    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

    // Create our window centered at 512x512 resolution
    mainwindow = SDL_CreateWindow(PROGRAM_NAME, SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        screenWidth, screenHeight, SDL_WINDOW_OPENGL | SDL_WINDOW_SHOWN);
    if (!mainwindow) // Die if creation failed
        sdldie("Unable to create window");

    checkSDLError(__LINE__);

    // Create our OpenGL context and attach it to our window
    maincontext = SDL_GL_CreateContext(mainwindow);
    checkSDLError(__LINE__);


    // This makes our buffer swap synchronized with the monitor's vertical refresh
    SDL_GL_SetSwapInterval(1);

    GLenum status = glewInit();
    if(status != GLEW_OK) {
        fprintf(stderr, "INFO: glew couldn't be initialized. Exit\nGLEW Error: %s", glewGetErrorString(status));
        close();
    }
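
Worth noting: with a core profile context, GLEW typically needs glewExperimental set before glewInit, or some core entry points can stay null. A sketch of that pattern, under the assumption you end up with a core profile:

    glewExperimental = GL_TRUE; // must be set before glewInit with core profiles
    GLenum status = glewInit();
    if (status != GLEW_OK) {
        fprintf(stderr, "GLEW Error: %s\n", glewGetErrorString(status));
    }
    glGetError(); // glewInit can leave a spurious GL_INVALID_ENUM; discard it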

In your shader, using your own uniforms is correct; don't use gl_*. Use #version 400 core, and the gl_* matrix built-ins should give compiler errors, since they were removed from the core language.

Anyway your problem seems to be this:

glDetachShader(programId, vertexShaderId);
glDetachShader(programId, fragmentShaderId);

Remove that code.


Anyway your problem seems to be this:


    glDetachShader(programId, vertexShaderId);
    glDetachShader(programId, fragmentShaderId);

Remove that code.

Actually, isn't it OK to detach and delete the shaders after the program is linked? I think it's fine, since the program already has everything it needs.


In your shader, using your own uniforms is correct; don't use gl_*. Use #version 400 core, and the gl_* matrix built-ins should give compiler errors, since they were removed from the core language.

Anyway your problem seems to be this:


    glDetachShader(programId, vertexShaderId);
    glDetachShader(programId, fragmentShaderId);

Remove that code.

OK, I figured it out. The problem was that I didn't use the variables, so they were optimized out. Maybe that is what Geometrian meant. Detaching and deleting the shaders after linking should be okay: http://stackoverflow.com/questions/9113154/proper-way-to-delete-glsl-shader
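
In other words, the fix is just to use the declared uniforms in main() (the commented-out line from the original shader), so they stay active; roughly:

#version 400 core

uniform mat4 projectionMatrix;
uniform mat4 modelViewMatrix;

in vec2 in_Position;
in vec4 in_Color;
out vec4 ex_Color;

void main(void)
{
	// Referencing the uniforms keeps the compiler from optimizing them out.
	gl_Position = projectionMatrix * modelViewMatrix * vec4(in_Position.x, in_Position.y, 0.0, 1.0);
	ex_Color = in_Color;
}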

But thanks for the #version 400 core tip. Now I only have to solve the texture problem.

