OpenGL GLSL Basics


Hello. I'm trying to incorporate VBOs and shaders into my OpenGL application, but when I run it, all I get is a black screen. The shader program links fine, and according to glGetProgramiv() everything looks right: the shaders are attached, the uniforms are active, and so on. I also checked with glGetError() and no errors are reported. I also know that my projection matrix is set up correctly; I've used it before. I can't figure out what the problem is, so I would greatly appreciate it if someone could look at my shaders and the code that creates the VBO. Thanks.

Vertex Shader
#version 130

uniform mat4 projectionMatrix;
uniform mat4 modelviewMatrix;

in vec4 in_vertex;

void main() {
    gl_Position = projectionMatrix * modelviewMatrix * in_vertex;
}

Fragment Shader (yes I realize gl_FragColor is deprecated)
#version 130

void main() {
    gl_FragColor = vec4(0.8, 0.0, 0.6, 1.0);
}

VBO Setup
void drawTriangle() {
    const unsigned int numVertices = 3;
    const unsigned int numIndices = 3;

    //Vertex Data
    Vertex vertices[numVertices];
    vertices[0].pos = Vector3(0.0, 32.0, -128.0);
    vertices[1].pos = Vector3(-32.0, -32.0, -128.0);
    vertices[2].pos = Vector3(32.0, -32.0, -128.0);

    //Index Data
    unsigned int indices[numIndices];
    indices[0] = 0;
    indices[1] = 1;
    indices[2] = 2;

    //Create VAO
    unsigned int vaoID = 0;
    glGenVertexArrays(1, &vaoID);
    glBindVertexArray(vaoID);

    //Create a VBO to hold the vertex data
    unsigned int vboID = 0;
    glGenBuffers(1, &vboID);
    glBindBuffer(GL_ARRAY_BUFFER, vboID);
    glBufferData(GL_ARRAY_BUFFER, numVertices * sizeof(Vertex), vertices, GL_STATIC_DRAW);

    //Setup attributes
    //vertex is 3 doubles for pos, 3 doubles for norm, 2 doubles for tex, 4 doubles for col
    glVertexAttribPointer(0, 3, GL_DOUBLE, GL_FALSE, sizeof(Vertex), (void*)0);
    glVertexAttribPointer(1, 3, GL_DOUBLE, GL_FALSE, sizeof(Vertex), (void*)24);
    glVertexAttribPointer(2, 2, GL_DOUBLE, GL_FALSE, sizeof(Vertex), (void*)48);
    glVertexAttribPointer(3, 4, GL_DOUBLE, GL_FALSE, sizeof(Vertex), (void*)64);

    glEnableVertexAttribArray(0);
    //glEnableVertexAttribArray(1);
    //glEnableVertexAttribArray(2);
    //glEnableVertexAttribArray(3);

    //Create a VBO to hold the index data
    unsigned int vboIndicesID = 0;
    glGenBuffers(1, &vboIndicesID);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIndicesID);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER, numIndices * sizeof(unsigned int), indices, GL_STATIC_DRAW);

    //Unbind the VAO
    glBindVertexArray(0);

    //Draw
    glBindVertexArray(vaoID);
    glDrawElements(GL_TRIANGLES, numIndices, GL_UNSIGNED_INT, (void*)0);
}
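
For reference, the checks mentioned above (link status via glGetProgramiv, plus glGetError) might look roughly like the following sketch; shaderProgramID is a placeholder name for the linked program handle, not code from my project:

GLint linked = GL_FALSE;
glGetProgramiv(shaderProgramID, GL_LINK_STATUS, &linked);
if (linked != GL_TRUE) {
    char infoLog[1024];
    GLsizei logLength = 0;
    glGetProgramInfoLog(shaderProgramID, sizeof(infoLog), &logLength, infoLog);
    printf("Program link failed: %s\n", infoLog);   // requires <cstdio>
}

GLenum err = glGetError();
if (err != GL_NO_ERROR) {
    printf("OpenGL error: 0x%X\n", err);
}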

Sorry, I don't know what your problem is at the moment, but I'd suggest that when debugging you add one thing at a time. If you don't know whether the problem is in your VBO or your shader, try the VBO with the fixed-function pipeline, or try the shader with immediate mode, to rule out one or the other. You could also try the VBO without a VAO to remove a layer of complexity (it might be really simple, but I haven't used VAOs, so I can't really debug that part).
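
To make the first suggestion concrete, a rough fixed-function draw of the buffers from the original post might look something like this (reusing the vboID, vboIndicesID, and Vertex names from above; only a sketch, not tested code):

glUseProgram(0);                                     // back to the fixed-function pipeline
glBindBuffer(GL_ARRAY_BUFFER, vboID);                // position VBO from the original post
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, vboIndicesID); // index VBO from the original post
glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(3, GL_DOUBLE, sizeof(Vertex), (void*)0);
glDrawElements(GL_TRIANGLES, 3, GL_UNSIGNED_INT, (void*)0);
glDisableClientState(GL_VERTEX_ARRAY);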

Best of luck.

Heya :)

I'm not sure where the problem lies specifically, but here are two things you should check regarding GLSL:

- Check that you call glUseProgram with the right program handle, and that you call it before glBegin for the vertices you want shaded.

- Set your uniforms after glUseProgram and before glBegin (see the sketch just below).
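
A rough sketch of that ordering, with placeholder handle and location names:

glUseProgram(shaderProgramID);                                   // activate the program first
glUniformMatrix4fv(projectionLoc, 1, GL_FALSE, projectionData);  // then set its uniforms
glUniformMatrix4fv(modelviewLoc, 1, GL_FALSE, modelviewData);
glBegin(GL_TRIANGLES);                                           // then submit the geometry
    // ... glVertex*/glVertexAttrib* calls for the vertices to be shaded ...
glEnd();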

And here's another suggestion: you could replace that vertex-shader code with something slightly simpler that I think does the same thing:


void main(void)
{
    gl_Position = ftransform();
}



Just do what karwosts said about debugging it part by part, and when it comes to GLSL, it really comes down to doing things in the right order to get it working.

Hope that helps :)

I'd suggest doing this:
1) Only draw a triangle and check whether it's visible.
2) If it is, activate your shader and check whether the triangle is still visible and correctly modified by your shader.
3) If that still works, turn off the shader, use only the VBO, and check whether that is visible.

That's pretty much guaranteed to tell you where the problem is. Poor man's debugger.

Thanks a lot for the debugging techniques, everyone. I was actually unaware that you could use shaders and just specify the vertices without using a VBO. So I stripped away the VBO, loaded my matrices into GL_PROJECTION and GL_MODELVIEW, and used ftransform() -- it worked. But as soon as I try to use my matrix uniforms and multiply the vertices by the matrices instead of using ftransform(), I just get a black screen. If anyone has an idea as to why this is happening, please let me know.

I load my shaders, link the shader program, and run it (glUseProgram) before specifying my uniforms. Below is the function I use to set uniforms in my shader program.


bool ShaderManager::addUniform(enum UniformType uniformType, const string& uniformName, const void* uniformValue) {
    unsigned int uniformLocation = glGetUniformLocation(_shaderProgramID, uniformName.c_str());

    switch (uniformType) {
        case MATRIX4:
            glUniformMatrix4fv(uniformLocation, 1, false, (float*)uniformValue);
            break;
        default:
            return(false);
    }

    return(true);
}



This is what I'm using as function arguments:

_shaderManager.addUniform(MATRIX4, "projectionMatrix", _projectionMatrix.getArray());
_shaderManager.addUniform(MATRIX4, "modelviewMatrix", _modelviewMatrix.getArray());

When you stopped using the VBO, did you remember to replace your "in_vertex" with the "gl_Vertex" value? If you're not using a vertex buffer, then you cannot specify arbitrary attributes like "in_vertex" anymore.

Your matrix code looks fine, but you could swap them for the predefined "gl_ProjectionMatrix" and "gl_ModelViewMatrix" if you want to try without using uniforms; these take their values from the built-in OpenGL matrices, similar to ftransform(). Just one more way to isolate the issue.
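
If you go that route, the matrices have to be loaded into the fixed-function stacks so the built-ins (and ftransform) can see them. A rough sketch using the matrix objects from the earlier post, assuming getArray() returns a column-major array of doubles (use glLoadMatrixf instead for float data):

glMatrixMode(GL_PROJECTION);
glLoadMatrixd(_projectionMatrix.getArray());   // feeds gl_ProjectionMatrix / ftransform
glMatrixMode(GL_MODELVIEW);
glLoadMatrixd(_modelviewMatrix.getArray());    // feeds gl_ModelViewMatrix / ftransform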

fragment shader:


#version 130

out vec4 fragColor;

void main() {
    fragColor = vec4(0.8, 0.0, 0.6, 1.0);
}



This shouldn't be causing any problem, because gl_FragColor is only deprecated, not removed.

But how are you passing a vec3 into a vec4? :S That must be causing the problem.



#version 130

uniform mat4 projectionMatrix;
uniform mat4 modelviewMatrix;

in vec3 in_vertex; // vec "3" here, not vec4

void main() {
    gl_Position = projectionMatrix * modelviewMatrix * vec4(in_vertex, 1.0);
}



Quote:
Original post by karwosts
When you stopped using the VBO, did you remember to replace your "in_vertex" with the "gl_Vertex" value? If you're not using a vertex buffer, then you cannot specify arbitrary attributes like "in_vertex" anymore.

Your matrix code looks fine, but you could swap them for the predefined "gl_ProjectionMatrix" and "gl_ModelViewMatrix" if you want to try without using uniforms; these take their values from the built-in OpenGL matrices, similar to ftransform(). Just one more way to isolate the issue.




Are you sure? Because everything worked fine when I removed the VBOs but used ftransform(). This is the code I used to replace my VBOs:


glBegin(GL_TRIANGLES);
    glVertexAttrib3d(0, 0.0, 32.0, -128.0);
    glVertexAttrib3d(0, -32.0, -32.0, -128.0);
    glVertexAttrib3d(0, 32.0, -32.0, -128.0);
glEnd();

glEnableVertexAttribArray(0);




I'm going to replace the doubles with floats and report back.

OK! Thank you, everyone! The problem turned out to be that my matrices were all defined using doubles instead of floats. I assumed they would get converted to floats, but I guess the pointer cast doesn't actually convert the values.
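
For anyone hitting the same thing: glUniformMatrix4fv takes GLfloat data, and casting a double* to float* only reinterprets the bytes rather than converting the values, so a rough fix is an element-wise conversion along these lines (names reused from the earlier posts, only a sketch):

const double* src = _projectionMatrix.getArray();   // double-precision source matrix
float dst[16];
for (int i = 0; i < 16; ++i) {
    dst[i] = static_cast<float>(src[i]);            // actual value conversion
}
glUniformMatrix4fv(uniformLocation, 1, GL_FALSE, dst);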

Kasya: I read that defining the vertex as a vec4 like that is fine; the 4th component automatically gets a value of 1.0.
