
OpenGL [SOLVED] Generic Vertex Attribute



EDIT: Problem partially resolved, see post #8.
Hello,
I'm beginning OpenGL 4.2 in C# using the OpenTK binding (I'm aware that OpenTK only binds up to 3.2, so deprecated functions are still exposed; I'm simply avoiding them for the sake of forward compatibility). Anyway, here is my basic pass-through vertex shader (also embedded in the program below):
[source lang="c"]
#version 410
#pragma debug(on)

uniform mat4 projection;
uniform mat4 view;

vec3 inVertexPos;

void main()
{
    gl_Position = vec4(inVertexPos, 1.0) * projection * view;
}
[/source]
The problem I am having is that the generic vertex attribute location comes back as -1 (marked with a comment in the code below).

I can't seem to understand the reason for the error (OpenGL doesn't actually raise one); [s]the OpenGL reference suggests it has to do with the maximum number of generic vertex attribs... but it clearly doesn't.[/s]
Can anyone shed some light on the problem? I'd be most thankful.
(The entire code file is attached to the post as well.)
The main program (this is a class constructor; after it returns, the main loop runs and calls the render function below):

[source lang="csharp"]
Matrix4 Projection;
Matrix4 View;

int[] ShaderIDs;
int[] ProgramIDs;
uint[] VAOs;
uint[] VBOs;[font="arial, verdana, tahoma, sans-serif"]
ushort[] Indices; // Cube indices
Vector3[] Vertices; // Cube vertices

public GameCtor()
{
[font="arial, verdana, tahoma, sans-serif"] [/font]// Our uniform matrices
[font="arial, verdana, tahoma, sans-serif"] [/font]Projection = Matrix4.CreatePerspectiveOffCenter(-Width / 2, Width / 2, -Height / 2, Height / 2, 0.01f, 500); // Perspective with 0,0 in the center and same as window width and height
[font="arial, verdana, tahoma, sans-serif"] [/font]View = Matrix4.LookAt(0, 0, -100, 0, 0, 0, 0, 1, 0); // camera at (z = -100) looking at 0,0,0

[/font][font="arial, verdana, tahoma, sans-serif"]
VAOs = new uint[1];
VBOs = new uint[2];
ShaderIDs = new int[2]; // Reserve space for 2 shaders, here we use one vertex shader
ProgramIDs = new int[1]; // Our shader program[/font]
[font="arial, verdana, tahoma, sans-serif"]
GL.GenVertexArrays(VAOs.Length, VAOs);[/font]
GL.GenBuffers(VBOs.Length, VBOs);
InitVAOsVBOs();
InitVertexShader(); // Initialize our vertex shader

// Create a program and attach our first shader
ProgramIDs[0] = GL.CreateProgram();
GL.AttachShader(ProgramIDs[0], ShaderIDs[0]);

// Link and print result log
GL.LinkProgram(ProgramIDs[0]);
Console.WriteLine(GL.GetProgramInfoLog(ProgramIDs[0]));

// Start using our shader program
// Required before setting Uniforms,
// otherwise I get an error: InvalidValue
GL.UseProgram(ProgramIDs[0]);

// Set uniform projection and view matrices
int uProjIndex = GL.GetUniformLocation(ProgramIDs[0], "projection");
int uViewIndex = GL.GetUniformLocation(ProgramIDs[0], "view");
GL.UniformMatrix4(uProjIndex, true, ref Projection);
GL.UniformMatrix4(uViewIndex, true, ref View);

// Vertex Position attribute: vec3 inVertexPos;
int inVertexPosIndex = GL.GetAttribLocation(ProgramIDs[0], "inVertexPos"); // returns -1
GL.BindVertexArray(VAOs[0]);
GL.VertexAttribPointer(inVertexPosIndex, 3, VertexAttribPointerType.Float, false, 0, IntPtr.Zero);
GL.EnableVertexAttribArray(inVertexPosIndex);}
[/source]
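For reference, the lookup can be guarded so a -1 location is caught before it ever reaches VertexAttribPointer. This is only a sketch using the names from the code above (the exact enum names, e.g. ProgramParameter, vary a bit between OpenTK versions):

[source lang="csharp"]
// Sketch: check the link status, then refuse to set up the pointer on a -1 location.
int linkStatus;
GL.GetProgram(ProgramIDs[0], ProgramParameter.LinkStatus, out linkStatus);
if (linkStatus == 0)
    Console.WriteLine("Link failed: " + GL.GetProgramInfoLog(ProgramIDs[0]));

int inVertexPosIndex = GL.GetAttribLocation(ProgramIDs[0], "inVertexPos");
if (inVertexPosIndex == -1)
{
    // Not an active attribute: either not declared as an "in", or optimized away.
    Console.WriteLine("inVertexPos is not an active vertex attribute");
}
else
{
    GL.BindVertexArray(VAOs[0]);
    GL.BindBuffer(BufferTarget.ArrayBuffer, VBOs[0]); // VertexAttribPointer reads from the buffer bound here
    GL.VertexAttribPointer(inVertexPosIndex, 3, VertexAttribPointerType.Float, false, 0, IntPtr.Zero);
    GL.EnableVertexAttribArray(inVertexPosIndex);
}
[/source]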
Buffer initialization:
[source lang="csharp"]

private void InitVAOsVBOs()
{
Vertices = new Vector3[8] {
new Vector3(-80, 80, -80), //0 Top, Left, Front
new Vector3(-80, 80, 80), //1 Top, Left, Back
new Vector3(-80, -80, -80), //2 Top, Right, Front
new Vector3(-80, -80, 80), //3 Top, Right, Back
new Vector3( 80, 80, 80), //4 Bottom, Left, Front
new Vector3( 80, 80, -80), //5 Bottom, Left, Back
new Vector3( 80, -80, -80), //6 Bottom, Right, Front
new Vector3( 80, -80, 80) //7 Bottom, Right, Back
};
GL.BindBuffer(BufferTarget.ArrayBuffer, VBOs[0]);
GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)(Vertices.Length * Vector3.SizeInBytes), Vertices, BufferUsageHint.StaticDraw);
GL.BindVertexArray(VAOs[0]);

Indices = new ushort[24] {
0, 1, 3, 2, // Top face
0, 1, 5, 4, // Left face
1, 3, 7, 5, // Back face
2, 3, 7, 6, // Right face
4, 5, 7, 6, // Bottom face
0, 2, 6, 4 // Front face
};
GL.BindBuffer(BufferTarget.ElementArrayBuffer, VBOs[0]);
GL.BufferData(BufferTarget.ElementArrayBuffer, (IntPtr)(Indices.Length * sizeof(ushort)), Indices, BufferUsageHint.StaticDraw);
}[/source]
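For reference, a common pattern is to give the index data a buffer object of its own and to upload it while the VAO is bound, so the element array binding gets recorded in the VAO. A minimal sketch, assuming the otherwise unused VBOs[1] is reserved for the indices:

[source lang="csharp"]
// Sketch: separate index buffer, bound while the VAO is bound.
GL.BindVertexArray(VAOs[0]);
GL.BindBuffer(BufferTarget.ElementArrayBuffer, VBOs[1]);
GL.BufferData(BufferTarget.ElementArrayBuffer, (IntPtr)(Indices.Length * sizeof(ushort)), Indices, BufferUsageHint.StaticDraw);
[/source]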
Vertex Shader initialization:
[source lang="csharp"]
private void InitVertexShader()
{
string VertexShaderSource = // Yes, this is inline shader source
@"#version 410
#pragma debug(on)

uniform mat4 projection;
uniform mat4 view;

vec3 inVertexPos;

void main()
{
gl_Position = vec4(inVertexPos, 1.0) * projection * view;
}";
// Create our vertex shader, upload our source, and compile
ShaderIDs[0] = GL.CreateShader(ShaderType.VertexShader);
GL.ShaderSource(ShaderIDs[0], VertexShaderSource);
GL.CompileShader(ShaderIDs[0]);
Console.WriteLine(GL.GetShaderInfoLog(ShaderIDs[0])); // check the log to make sure it succeeds
}
[/source]
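Besides printing the info log, the compile result can be queried directly, which makes failures harder to miss. A minimal sketch (enum names may differ slightly between OpenTK versions):

[source lang="csharp"]
// Sketch: query the compile status explicitly instead of relying on the log alone.
int compileStatus;
GL.GetShader(ShaderIDs[0], ShaderParameter.CompileStatus, out compileStatus);
if (compileStatus == 0)
    Console.WriteLine("Vertex shader failed to compile: " + GL.GetShaderInfoLog(ShaderIDs[0]));
[/source]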
Render code:
[source lang="csharp"]
protected override void OnRenderFrame(FrameEventArgs e)
{
GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

// Draw using the indices buffer
GL.DrawElements(BeginMode.Quads, Indices.Length, DrawElementsType.UnsignedShort, 0);

this.SwapBuffers();
}
[/source]
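For completeness, a sketch of a render frame that re-binds the program and the VAO explicitly each frame (with a single program and VAO the bindings persist anyway, so this is purely illustrative; the names are the fields from the constructor above):

[source lang="csharp"]
protected override void OnRenderFrame(FrameEventArgs e)
{
    GL.Clear(ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit);

    // Make sure the program and the VAO holding the attribute/index state are current
    GL.UseProgram(ProgramIDs[0]);
    GL.BindVertexArray(VAOs[0]);

    GL.DrawElements(BeginMode.Quads, Indices.Length, DrawElementsType.UnsignedShort, 0);

    this.SwapBuffers();
}
[/source]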

Have you verified that inVertexPosIndex returns something other than -1? If it comes back as -1 and you pass it to VertexAttribPointer, that call will probably fail.

I think you're missing the "in" keyword in your shader for this variable (unless this was removed in a more recent version than I'm familiar with).

"vec3 inVertexPos;"

[s]Yes, I checked; the location returned for the variable is always 1.
The 4.1 reference sheet says that if "in" is omitted then it is implied.[/s]

EDIT: The location returned is -1.
EDIT2: Fixed by adding the in keyword, as the OpenTK binding does not support GLSL versions higher than 1.50. Edited by wiz3kid


Do you have a VAO bound?

[s]See the InitBuffers() function. AFAIK, VAOs are deprecated since 3.3... and removed since 4.0. VBOs are the way to go.[/s]

EDIT: Now I do. Edited by wiz3kid

VAO is in glspec33.core.20100311.pdf
glspec40.core.20100311.pdf
glspec41.core.20100725.pdf

You are forced to use them.


VAO is in glspec33.core.20100311.pdf
glspec40.core.20100311.pdf
glspec41.core.20100725.pdf

You are forced to use them.


[s]Oh, I thought VAOs were the gl*Pointer family of functions...
So, I'm supposed to use glGenVertexArrays() instead of glGenBuffers()?[/s]

EDIT: Read up on it a bit, and I don't understand where the VAO gets its data from. I understood that I have to use both buffers and VAOs to achieve what I want, right? Edited by wiz3kid
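The VAO itself holds no vertex data; it records which buffers the attribute pointers and the element array binding refer to, as set up while it is bound. A rough sketch of the usual order, reusing the names from the code above:

[source lang="csharp"]
// Sketch: the VAO captures the attribute/index setup made while it is bound.
GL.BindVertexArray(VAOs[0]);                              // state below is recorded into this VAO

GL.BindBuffer(BufferTarget.ArrayBuffer, VBOs[0]);         // source buffer for the next pointer call
GL.VertexAttribPointer(inVertexPosIndex, 3, VertexAttribPointerType.Float, false, 0, IntPtr.Zero);
GL.EnableVertexAttribArray(inVertexPosIndex);

GL.BindBuffer(BufferTarget.ElementArrayBuffer, VBOs[1]);  // element array binding is stored in the VAO too

GL.BindVertexArray(0);                                    // binding the VAO later restores all of the above
[/source]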

Problem partially resolved! No more errors, and with gDEBugger I can see that it's rendering 24 vertices each frame.
Now the next problem: I don't see anything on the screen. New code attached below.
I added a fragment shader which sets every fragment to white.
I've also attached a zip containing a binary for anyone interested.

[attachment=5009:Game1 - Copy.cs.txt]

[attachment=5010:ShadersAndPrograms.zip]
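A minimal all-white fragment shader of the kind described would look something like this (a sketch, not the attached file):

[source lang="c"]
#version 410

out vec4 fragColor;

void main()
{
    fragColor = vec4(1.0, 1.0, 1.0, 1.0); // every fragment white
}
[/source]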

I think
"gl_Position = vec4(inVertexPos, 1.0) * projection * view;"

should be
"gl_Position = projection * view * vec4(inVertexPos, 1.0);"

As you want the view matrix to transform the vertex position into eye space first,
and then the projection matrix to project it into clip space.
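Putting the two fixes together (the in keyword from earlier in the thread plus the column-vector multiplication order), the vertex shader would look roughly like this:

[source lang="c"]
#version 410

uniform mat4 projection;
uniform mat4 view;

in vec3 inVertexPos;

void main()
{
    // view first (world -> eye), then projection (eye -> clip)
    gl_Position = projection * view * vec4(inVertexPos, 1.0);
}
[/source]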


I think
"gl_Position = vec4(inVertexPos, 1.0) * projection * view;"

should be
"gl_Position = projection * view * vec4(inVertexPos, 1.0);"

As you want to apply the view matrix to transform the vertex position into eye space,
and then project it into clipping space by multiplying with the projection matrix.



Still not working....
