OpenGL Buffer / Shader Issues [C# OpenTK]


Preface: I was happily working in OpenGL 3.3 until I found that too few of the laptops at my workplace could support it. So I wanted to convert my code to OpenGL 2.0, which most of them do support.

So I basically created a "Capabilities" class with a bunch of static flags to indicate what the context can and can't do, then went through and added branching on those capabilities (whether or not VAOs work, instancing via glDrawElementsInstanced versus pseudo-instancing, glGenerateMipmap, etc.).
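The post doesn't show the Capabilities class itself, but a minimal sketch of what such a probe might look like (assuming OpenTK's GL bindings; all names here are my own, not the poster's actual code):

```csharp
using OpenTK.Graphics.OpenGL;

// Minimal sketch of a capabilities probe. VAOs need either GL 3.0+
// or the GL_ARB_vertex_array_object extension.
public static class Capabilities
{
    public static bool CanUseVAO { get; private set; }

    // Call once, after the context has been created.
    public static void Probe()
    {
        string version = GL.GetString(StringName.Version);       // e.g. "2.1.0 - Build ..."
        string extensions = GL.GetString(StringName.Extensions) ?? "";
        int major = (int)char.GetNumericValue(version[0]);
        CanUseVAO = major >= 3
                 || extensions.Contains("GL_ARB_vertex_array_object");
    }
}
```
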

When I ran my monstrosity it turned all my geometry into soup.

So I took everything out except for the simplest shader / mesh I had: a really thin quad used like a line to help me test ray picking.

It too doesn't look like a line.


It does that weird thing where it changes shape as I move the camera around, which I've seen before, so I suspect the problem is somewhere in the shader.

Just in case, I read the vertices back out of the VBO and displayed them; they were nice and line-like.

VBO Dump:
<4.960452, 4.932934, 4.933001>
<1.110205, -1.596217, -1.589712>
<4.960452, 4.933934, 4.933001>
<1.110205, -1.595217, -1.589712>

So I also output the shader program information:

Shader Program 51
Uniform 0: Color @ 0
Uniform 1: Projection @ 1
Uniform 2: View @ 2
Uniform 3: World @ 3
Vertex Attributes:
0: FloatVec4 InPosition

The uniforms are there and the attributes are there, so the shader builds fine.

I render said line like this.

Shaders.SolidShader.Begin(); //Just calls GL.UseProgram
Shaders.SolidShader["View"].Set(cam.View); //Calls GL.UniformMatrix4(); on the correct uniform
Shaders.SolidShader["Color"].Set(color); //Really just a vec4
LineBuffer.RenderGeometry(); //Where I think the culprit is.
Shaders.SolidShader.End(); //uses program 0

I wrapped it up like XNA's Effect class to make shaders a bit more usable.
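For context, a wrapper like that presumably ends up calling straight through to the GL uniform functions. A minimal sketch, assuming OpenTK, with all names hypothetical since the real class isn't shown in the post:

```csharp
using OpenTK;
using OpenTK.Graphics.OpenGL;

// Hypothetical sketch of an XNA-Effect-style uniform wrapper.
// 'location' would come from GL.GetUniformLocation after linking.
public sealed class ShaderUniform
{
    private readonly int location;

    public ShaderUniform(int location) { this.location = location; }

    public void Set(Matrix4 value) => GL.UniformMatrix4(location, false, ref value);
    public void Set(Vector4 value) => GL.Uniform4(location, ref value);
}
```
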

RenderGeometry looks like this...

public void RenderGeometry()
{
    if (!Capabilities.CanUseVAO)
    {
        BindVBO();          // GL.BindBuffer(BufferTarget.ArrayBuffer, ID);
        BindIBO();          // GL.BindBuffer(BufferTarget.ElementArrayBuffer, ID);
        Format.UseFormat(); // See below
    }
    else
    {
        Format.Bind();      // Binds the VAO
    }

    GL.DrawElements(BeginMode.Triangles, TriangleCount * 3, DrawElementsType.UnsignedShort, 0);

    if (!Capabilities.CanUseVAO)
    {
        // (attribute-array cleanup elided)
    }
}

I'd swear I only needed "TriangleCount" and not "TriangleCount*3", but it never rendered the whole geometry without the *3. (The count parameter of glDrawElements is the number of indices to draw, so three per triangle is correct for indexed triangles.)

And finally UseFormat, kind of a monster. I basically cloned the FVF bitfield idea from DirectX, since it was a quick way to express the very few vertex types I use. I've stripped it down to the important part: the actual OpenGL calls.

int Offset = 0; // Our data is interleaved, so we keep track of the running byte offset
if ((format & VertexFormat.Position) == VertexFormat.Position)
{
    GL.VertexAttribPointer(0, 3, VertexAttribPointerType.Float, false, LastVertexSize, 0);
    GL.EnableVertexAttribArray(0); // Position is attrib 0 (shader usage maps "InPosition" to this attrib)
    Offset += 12; // 3 x 4 bytes
}
if ((format & VertexFormat.Texture) == VertexFormat.Texture)
{
    GL.VertexAttribPointer(1, 2, VertexAttribPointerType.Float, false, LastVertexSize, Offset);
    GL.EnableVertexAttribArray(1); // Texture coordinates are attrib 1
    Offset += 8; // 2 x 4 bytes
}
if ((format & VertexFormat.Normal) == VertexFormat.Normal)
{
    GL.VertexAttribPointer(2, 3, VertexAttribPointerType.Float, false, LastVertexSize, Offset);
    GL.EnableVertexAttribArray(2); // Normals are attrib 2
    Offset += 12; // 3 x 4 bytes
}
if ((format & VertexFormat.Tangent) == VertexFormat.Tangent) // For normal mapping, but I don't use them yet
{
    GL.VertexAttribPointer(3, 3, VertexAttribPointerType.Float, false, LastVertexSize, Offset);
    Offset += 12;
}
if ((format & VertexFormat.BiTangent) == VertexFormat.BiTangent)
{
    GL.VertexAttribPointer(4, 3, VertexAttribPointerType.Float, false, LastVertexSize, Offset);
    Offset += 12;
}
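One GL 2.0 pitfall worth checking here (a guess on my part, not something visible in the post): GLSL 1.10 has no layout qualifiers, so the linker is free to assign InPosition, InTextureCoordinate, and InNormal to any slots it likes, while UseFormat hard-codes attribs 0/1/2. Pinning the locations before linking removes that assumption:

```csharp
using OpenTK.Graphics.OpenGL;

// Sketch: bind the attribute names to the slots UseFormat expects,
// *before* GL.LinkProgram. ('program' is an assumed shader handle.)
GL.BindAttribLocation(program, 0, "InPosition");
GL.BindAttribLocation(program, 1, "InTextureCoordinate");
GL.BindAttribLocation(program, 2, "InNormal");
GL.LinkProgram(program);
```
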

If you didn't TL;DR and still have your retinas intact from looking at my [s]code[/s] spaghetti, I'd love some help, because I'm stumped.

I have a feeling that your problem lies in the VBO coordinates. Quad vertices must lie on a single plane and should form a convex polygon when taken in order, i.e.

1 2
4 3

and not

1 2
3 4

Otherwise, the results are undefined.

(Btw, quads are deprecated in GL 3.x.)
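To make the winding point concrete, a quad split into two triangles usually shares one diagonal; a sketch of the index layout (my own example, not the poster's data):

```csharp
// Two CCW triangles covering quad corners 0-1-2-3 laid out as
//   0 1
//   3 2
// Both triangles share the 0-2 diagonal.
ushort[] indices = { 0, 1, 2,   0, 2, 3 };
```
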

Hmm, good thought; however, I mistyped. In my head I think of it as a quad, but it's actually rendered as two triangles.

It also occurred to me I forgot to post the actual shader code, the error could be in there.

I basically went through and replaced "in" and "out" with "attribute" and "varying" as needed, and used gl_FragColor instead of my custom FragColor variable.

Vertex Shader:

uniform mat4 World;
uniform mat4 View;
uniform mat4 Projection;

attribute vec4 InPosition;
attribute vec2 InTextureCoordinate;
attribute vec3 InNormal;

varying vec3 Normal;
varying vec2 TextureCoordinate;
varying vec4 FragPos;

void main()
{
    TextureCoordinate = InTextureCoordinate;
    gl_Position = (Projection * View * World) * InPosition;
    FragPos = World * InPosition;
    Normal = mat3(World) * InNormal;
}

Fragment Shader:

uniform vec4 Color;

void main()
{
    gl_FragColor = Color;
}

The extra stuff in the vertex shader is used for various other fragment shaders.
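One thing to double-check in the converted vertex shader (an assumption about the target drivers, not something stated in the thread): the mat3-from-mat4 constructor used in `Normal = mat3(World) * InNormal;` only exists from GLSL 1.20 onward. If a GL 2.0 driver ships only GLSL 1.10, either declare `#version 120` or build the matrix from column vectors, which 1.10 does support:

```glsl
// GLSL 1.10-compatible equivalent of mat3(World):
// construct the upper-left 3x3 from the mat4's first three columns.
Normal = mat3(World[0].xyz, World[1].xyz, World[2].xyz) * InNormal;
```
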

The working OpenGL 3.x shaders are at these codepad links:
