#5182836 Help: GL_QUADS Explanation

Posted by on 25 September 2014 - 02:49 AM

Do you have backface culling activated? That would look like:
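A minimal sketch of the culling setup meant here (the exact lines are an assumption, since only the calls are referenced below):

```c
glEnable(GL_CULL_FACE);   /* turn backface culling on */
glCullFace(GL_BACK);      /* cull back faces; GL_BACK is the default anyway */
```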


GL_BACK is the initial value, so it does not have to appear in your code, but it would be good to know whether you call glEnable(GL_CULL_FACE).
I am asking because even the second vertex order should not give you the right result if backface culling is enabled and

glFrontFace(GL_CW);

is not used. glFrontFace sets the winding order of vertices in OpenGL, and its default is GL_CCW, so by default the order of vertices should be counter-clockwise.
In your second example they are clockwise, which would mean the quad is oriented away from the screen (viewer); with backface culling enabled you should not see anything. (Backface culling means that only the front face is rendered and visible.)
I think the right order should be, for example:
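For instance (the coordinates are my assumption; only the counter-clockwise order of the four vertices matters):

```c
glBegin(GL_QUADS);
    glVertex3f(-1.0f, -1.0f, 0.0f); /* 0: bottom-left  */
    glVertex3f( 1.0f, -1.0f, 0.0f); /* 1: bottom-right */
    glVertex3f( 1.0f,  1.0f, 0.0f); /* 2: top-right    */
    glVertex3f(-1.0f,  1.0f, 0.0f); /* 3: top-left     */
glEnd();
```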


Every other counter-clockwise order is fine too.
Even if the quad is divided into two triangles (which is the only way in modern OpenGL, where GL_QUADS is deprecated and removed), both of them, (0,1,2) and (2,3,0), have the right winding.

I have no clue why the first result looks like this, and I could not find a specification of how exactly GL_QUADS works.

What I would do is enable backface culling and then try it, because that way you learn it the right way and you do not have to struggle again later when you actually need backface culling. And like DiegoSLTS suggests, I would use triangles, because that is the way to go in OpenGL 3.0 and higher.
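For the triangle route, a hedged sketch (it assumes a VBO holding the four quad vertices is already bound and its attributes are set up):

```c
/* Draw the quad as two triangles; the index order (0,1,2) and (2,3,0)
   keeps the counter-clockwise winding discussed above. */
static const GLuint indices[6] = { 0, 1, 2,  2, 3, 0 };
glDrawElements(GL_TRIANGLES, 6, GL_UNSIGNED_INT, indices);
```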

I know that was perhaps a lot of information and I did not explain everything in depth, so please feel free to ask.

#5150870 Major Rendering Problem - OpenGL and the Oculus Rift

Posted by on 02 May 2014 - 01:19 AM

I just took a glimpse at the code; the first thing that caught my eye was

int l_Major = glfwGetWindowAttrib(l_Window, GLFW_CONTEXT_VERSION_MAJOR);
int l_Minor = glfwGetWindowAttrib(l_Window, GLFW_CONTEXT_VERSION_MINOR);
int l_Profile = glfwGetWindowAttrib(l_Window, GLFW_OPENGL_PROFILE);
printf("OpenGL: %d.%d ", l_Major, l_Minor);

followed by:


According to the screenshot, you are initializing GLFW with OpenGL 4.4 (or GLFW does that for you; look into glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR / GLFW_CONTEXT_VERSION_MINOR, x) to set it manually), but glBegin(GL_QUADS) has been removed from the core profile since OpenGL 3.1. I am not completely sure why it displays anything at all. I would either target an OpenGL version where glBegin is not deprecated or removed, or alter the program so that it is written in an OpenGL 3.3 (or newer) way.
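Setting the version manually would look roughly like this (2.1 is chosen here only as an example of a version where glBegin is still allowed; the window parameters are placeholders):

```c
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 2);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 1);
GLFWwindow* l_Window = glfwCreateWindow(1280, 800, "Rift", NULL, NULL);
```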

Another thing is


I think there might be one ";" too many: the else case is never evaluated, otherwise you would see one of those prints in your console. It also shows that the profile is not GLFW_OPENGL_COMPAT_PROFILE.

I am not sure this is the main problem, but it is worth a try.
I hope this helps a little; otherwise, please ask.

#5090355 Confused about OpenGL 3.0+ shaders

Posted by on 30 August 2013 - 06:01 AM


Drawing is done by VBOs only (my hardware doesn't support VAOs AFAIK), in the usual way as far as I can see.


Every frame, after drawing:

        int projectionMatrixLocation = glGetUniformLocation(programShaderID, "projectionMatrix"); // Get the location of our projection matrix in the shader  
        int viewMatrixLocation = glGetUniformLocation(programShaderID, "viewMatrix"); // Get the location of our view matrix in the shader  
        int modelMatrixLocation = glGetUniformLocation(programShaderID, "modelMatrix"); // Get the location of our model matrix in the shader

        glUniformMatrix4fv(projectionMatrixLocation, 1, GL_FALSE, &projectionMatrix[0][0]); // Send our projection matrix to the shader  
        glUniformMatrix4fv(viewMatrixLocation, 1, GL_FALSE, &viewMatrix[0][0]); // Send our view matrix to the shader  
        glUniformMatrix4fv(modelMatrixLocation, 1, GL_FALSE, &modelMatrix[0][0]); // Send our model matrix to the shader  


Here is one problem:

I guess you first do something like glUseProgram(programShaderID), and at the end of your drawing glUseProgram(0)? Or do you use just one shader? Then setting it once during initialisation is fine.
When you set the uniforms after drawing, the shaders do not have the matrices while they draw and process everything; or, if the shader is active the whole time, you draw everything with the matrices from the previous draw step.
You also do not need to call glGetUniformLocation() every frame, because the locations are constant for a given shader; just get them once after loadShaders("vertexShader.txt", "fragmentShader.txt");
The right sequence would be:

Initialisation:

  1. Load shaders
  2. glGetUniformLocation

Each frame:

  1. Set uniforms
  2. Draw your VBO
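In code, with the names from your snippets above, that sequence might look like:

```c
/* --- initialisation, done once ---------------------------------- */
GLuint programShaderID = loadShaders("vertexShader.txt", "fragmentShader.txt");
GLint projectionMatrixLocation = glGetUniformLocation(programShaderID, "projectionMatrix");
GLint viewMatrixLocation       = glGetUniformLocation(programShaderID, "viewMatrix");
GLint modelMatrixLocation      = glGetUniformLocation(programShaderID, "modelMatrix");

/* --- every frame, before the draw calls ------------------------- */
glUseProgram(programShaderID);
glUniformMatrix4fv(projectionMatrixLocation, 1, GL_FALSE, &projectionMatrix[0][0]);
glUniformMatrix4fv(viewMatrixLocation,       1, GL_FALSE, &viewMatrix[0][0]);
glUniformMatrix4fv(modelMatrixLocation,      1, GL_FALSE, &modelMatrix[0][0]);
/* ... bind and draw the VBOs here ... */
```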

The other stuff you are doing should be right, so this sequence only considers the uniform stuff.

A good resource for me is http://www.opengl-tutorial.org/
Feel free to ask if it is still not working or if there are other questions.

Best regards