GLSL 1.5 and Sending Vertices

Hello.

In my old game, I used GLSL 1.2 in my shaders, and I could still call glColor*(), glTexCoord*(), etc. to pass info about vertices, and my vertex shader was still able to read it.

Now I'm trying to use GLSL 1.5, and I know that I cannot use glColor*() etc to pass such information any more, but it doesn't matter since I can use vertex attributes. However, gl_Vertex was also removed. This means that calling glVertex*() does not make sense anymore, so how do I send the vertices to the GPU?

Do I have to use VBOs? If so, is there any good tutorial with a nice example on how to do this?

Thank you.

I'm not up to date with what is in which version of GLSL, but from a modern perspective, everything has moved to generic vertex attributes, which replace the specific ones. Specific functions like glVertex, glColor and glTexCoord, and in particular their vertex array equivalents glVertexPointer, glColorPointer and so on, have all been replaced by the generic glVertexAttribPointer.
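
For example, a position array that used to be set up with glVertexPointer is now set up roughly like this (a minimal sketch; attribute index 0 is just an example, it is whatever index your shader's position input is bound to):

    // Old, fixed-function style:
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, positions);

    // New, generic style:
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, 0, positions);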

The corresponding built-in attributes have been removed from GLSL as well and replaced by attributes you declare yourself. You no longer call glVertex to feed the gl_Vertex variable; instead, you create your own input names in the vertex shader and bind the generic attributes to them.
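
For example, instead of reading gl_Vertex, the vertex shader declares its own input (the name in_position is made up; you bind it to an attribute index with glBindAttribLocation before linking, or query it with glGetAttribLocation afterwards):

#version 150

in vec4 in_position; // your own replacement for gl_Vertex

void main()
{
    gl_Position = in_position;
}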

I suggest you go through some of the links on the forum front page to study the new approach in the modern API. Even if you don't want to migrate to the new API entirely, you will at least learn about the generic attributes. The main idea is that you no longer use specific attributes, only generic ones. Note that the modern API requires you to use vertex arrays with VBOs, but if you're not migrating entirely, you can still use vertex arrays without VBOs.

Thank you, but I cannot seem to find any tutorial that shows how to use VBOs (yes, I'm trying to migrate completely to the new API). I did see somewhere on this forum that someone made a struct for holding each vertex, and I believe this struct was treated as the only attribute. If I want to use this approach, can someone please post some example code showing how to create a VBO with this data and then draw it? Thank you.

A quick search for "vbo tutorial" gives plenty of results. The link collection I mentioned also contains information on how to use VBOs.
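
But in short, the struct approach you mention looks roughly like this (an untested sketch; the Vertex struct, the attribute indices 0 and 1, and the values are all just illustrative; offsetof comes from <stddef.h>):

    typedef struct
    {
        float x, y;    // position
        float r, g, b; // color
    } Vertex;

    Vertex vertices[3] =
    {
        {100.0f, 50.0f, 1.0f, 0.0f, 0.0f},
        { 40.0f, 70.0f, 1.0f, 0.0f, 0.0f},
        {140.0f, 70.0f, 1.0f, 0.0f, 0.0f},
    };

    GLuint vao, vbo;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);

    // One interleaved buffer, two attributes; the stride is the struct size.
    // (Bind or query indices 0 and 1 in your shader program.)
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (void *)offsetof(Vertex, x));
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex),
                          (void *)offsetof(Vertex, r));
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);

    // Drawing (with your shader program bound):
    glDrawArrays(GL_TRIANGLES, 0, 3);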

You can still use glBegin/glVertexAttrib/glEnd if you wish, with "glVertexAttrib(0, ..." being the "moral equivalent" of glVertex. glBegin/glEnd are deprecated in GL3.x, but as long as you create a compatibility context (which you'll get by default) you'll be OK.

Of course in the long run you should switch over to VBOs, but for quick prototyping and easing the transition there's nothing wrong with this approach.
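
For example, something along these lines (untested; per the spec, setting generic attribute 0 provokes the vertex, just like glVertex, so set any other attributes before it):

    glBegin(GL_TRIANGLES);
    glVertexAttrib3f(1, 1.0f, 0.0f, 0.0f); // e.g. a color attribute at index 1
    glVertexAttrib2f(0, 100.0f, 50.0f);    // attribute 0 provokes the vertex
    glVertexAttrib3f(1, 1.0f, 0.0f, 0.0f);
    glVertexAttrib2f(0, 40.0f, 70.0f);
    glVertexAttrib3f(1, 1.0f, 0.0f, 0.0f);
    glVertexAttrib2f(0, 140.0f, 70.0f);
    glEnd();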

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Thank you for helping, guys. I attempted to use VBOs, but something doesn't work. For now, I'm trying to draw a red triangle in orthographic mode on the screen. Here's the buffer creation code:


    int xValues[3] = {100, 40, 140};
    int yValues[3] = {50, 70, 70};

    GLuint vao;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);

    GLuint vbo[2];
    glGenBuffers(2, vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
    glBufferData(GL_ARRAY_BUFFER, 3 * sizeof(int), xValues, GL_STATIC_DRAW);
    glVertexAttribPointer((GLuint)0, 1, GL_INT, GL_FALSE, 0, 0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
    glBufferData(GL_ARRAY_BUFFER, 3 * sizeof(int), yValues, GL_STATIC_DRAW);
    glVertexAttribPointer((GLuint)1, 1, GL_INT, GL_FALSE, 0, 0);
    glEnableVertexAttribArray(0);
    glEnableVertexAttribArray(1);
    glBindVertexArray(0);

Then I draw as follows:


        glBindVertexArray(vao);
        glDrawArrays(GL_TRIANGLES, 0, 1);
        glBindVertexArray(0);

My vertex shader:


// gui.vert
// The GUI vertex shader.
// This software, including the source code, is released under the terms of
// the License.

#version 150

// vertex input
in int in_x;
in int in_y;

void main()
{
	float x = (float(in_x) / 320) - 1;
	float y = (float(in_y) / 240) - 1;
	gl_Position = vec4(x, y, 0.0, 1.0);
}

My fragment shader:


// gui.frag
// The GUI fragment shader.
// This software, including the source code, is released under the terms of
// the License.

#version 150

void main()
{
	gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
}

But the only thing I get is a black screen. Does anyone know what I'm doing wrong here?

The last parameter to glDrawArrays is the number of vertices to pull from the arrays; to draw a triangle you need three vertices.

Another comment, though not an error: you don't have to keep your coordinates in separate per-component arrays like that. Make one array of coordinates, instead of two arrays with the X and Y coordinates individually, and bind it to a vec2 (or ivec2 if you want integer vectors) in the shader instead.
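
Something like this, for example (an untested sketch, where vbo is a single buffer object and the shader input would be declared as "in vec2 in_pos;"):

    float coords[6] = {100.0f, 50.0f, 40.0f, 70.0f, 140.0f, 70.0f};

    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(coords), coords, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 0, 0); // one vec2 per vertex
    glEnableVertexAttribArray(0);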

I know about the coordinates. I just wanted to see if I'm sending more than one attribute properly.

I changed the last argument of glDrawArrays() to 3, but it still draws nothing :(

You should do error checking then, to see if something is wrong. Also query the shader for the actual attribute locations, so you don't hard-code the wrong ones. One more thing that just occurred to me: you may have to set the pointer for integer-type attributes with glVertexAttribIPointer instead.
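
For example (a rough sketch; "program" here stands for whatever your linked shader program object is, and in_x/in_y are the input names from your shader):

    // Query the locations instead of assuming 0 and 1
    // (check that they are not -1):
    GLint locX = glGetAttribLocation(program, "in_x");
    GLint locY = glGetAttribLocation(program, "in_y");

    // For "in int" inputs, use the I variant, which keeps the values
    // as integers instead of converting them to floats:
    glBindBuffer(GL_ARRAY_BUFFER, vbo[0]);
    glVertexAttribIPointer(locX, 1, GL_INT, 0, 0);
    glBindBuffer(GL_ARRAY_BUFFER, vbo[1]);
    glVertexAttribIPointer(locY, 1, GL_INT, 0, 0);

    // And check for errors somewhere each frame:
    GLenum err;
    while ((err = glGetError()) != GL_NO_ERROR)
        printf("GL error: 0x%x\n", err); // printf from <stdio.h>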

Pretty sure gl_FragColor is deprecated in GLSL version 150.
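
If that's the problem, declaring your own output instead should fix it (the name fragColor is arbitrary):

#version 150

out vec4 fragColor; // replaces gl_FragColor

void main()
{
    fragColor = vec4(1.0, 0.0, 0.0, 1.0);
}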
