
Trying to get rid of all the deprecated stuff


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

10 replies to this topic

#1 japro   Members   -  Reputation: 887

Posted 25 August 2011 - 05:40 AM

I haven't done that much "modern OpenGL" stuff so far, so now I'm trying to familiarize myself with the proper way to do things. But this is getting somewhat frustrating, since searching only seems to give me lots of tutorials that use all the stuff that is deprecated in OpenGL 3, and lots of forum posts telling me that it is deprecated... Searching for examples (you know, REALLY basic ones) is also no help, since everyone who writes OpenGL 3 examples seems to wrap all the functionality away in their own classes and whatnot. I don't want to know how you wrapped the stuff and dissect three helper classes, I just want the smallest working example, ARGH.

Well, anyway. So I figured I'd just take an example I had that works and replace stuff step by step. The first thing I wanted to do was switch to generic attributes for vertex data, normals and texture coordinates.

So I put my vertex data for the VBO in a "std::vector<Vertex> vertices":
[source lang="cpp"]
struct Vertex {
    Vertex(const Vector<float, 3> &p, const Vector<float, 3> &n, const Vector<float, 2> &t)
        : pos(p), normal(n), texcoord(t) { }
    Vector<float, 3> pos;
    Vector<float, 3> normal;
    Vector<float, 2> texcoord;
};
[/source]
"build" it with
[source lang="cpp"]
GLuint buffer;
glGenBuffers(1, &buffer);
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glBufferData(GL_ARRAY_BUFFER, vertices.size()*sizeof(Vertex), &(vertices[0]), GL_STATIC_DRAW);
glVertexAttribPointer(0, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (char *)0);
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, sizeof(Vertex), (char *)0 + 12);
glVertexAttribPointer(2, 2, GL_FLOAT, GL_FALSE, sizeof(Vertex), (char *)0 + 24);
[/source]
and draw it with:
[source lang="cpp"]
glBindBuffer(GL_ARRAY_BUFFER, buffer);
glDrawArrays(GL_QUADS, 0, vertices.size());
[/source]
And here is the whole setup for the shaders and such in compressed form:
[source lang="cpp"]
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glFrustum(-4.f, 4.f, -3.f, 3.f, 10.f, 1536.f);
glClearColor(0.f, 0.f, 0.f, 0.f);

GLuint vertex_shader = glCreateShader(GL_VERTEX_SHADER);
GLuint fragment_shader = glCreateShader(GL_FRAGMENT_SHADER);

std::string vertex_source =
    "#version 130\n"
    "in vec3 vertex;\n"
    "in vec3 normal;\n"
    "in vec2 tcoords;\n"
    "out vec4 vcolor;\n"
    "void main(void)\n"
    "{\n"
    "    gl_Position = gl_ProjectionMatrix * gl_ModelViewMatrix * vec4(vertex,0);\n"
    "    vcolor = gl_Color;\n"
    "}\n";
int vertex_length = vertex_source.size();
const char *vs = vertex_source.c_str();
glShaderSource(vertex_shader, 1, &vs, &vertex_length);

std::string fragment_source =
    "#version 130\n"
    "in vec4 vcolor;\n"
    "void main(void)\n"
    "{\n"
    "    gl_FragColor = vcolor;\n"
    "}\n";
int fragment_length = fragment_source.size();
const char *fs = fragment_source.c_str();
glShaderSource(fragment_shader, 1, &fs, &fragment_length);

std::cout << "compiling vertex shader" << std::endl;
glCompileShader(vertex_shader);
print_log(vertex_shader);
std::cout << "compiling fragment shader" << std::endl;
glCompileShader(fragment_shader);
print_log(fragment_shader);

GLuint program = glCreateProgram();
glAttachShader(program, vertex_shader);
glAttachShader(program, fragment_shader);
glBindAttribLocation(program, 0, "vertex");
glBindAttribLocation(program, 1, "normal");
glBindAttribLocation(program, 2, "tcoords");
glLinkProgram(program);
glUseProgram(program);

glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(0.f, 0.f, -180.f);
[/source]

This only gives me a black screen. As far as I can tell, all I did was replace the glVertexPointer etc. calls with glVertexAttribPointer, remove the "glEnableClientState" calls (or do I need something equivalent for my generic attributes?) and rename the variables in the shaders...
Am I binding the attributes wrong? Did I forget something? Is the way I convert "vertex" to a vec4 inside the shader correct?


#2 Shintah   Members   -  Reputation: 109

Posted 25 August 2011 - 06:06 AM

Just took a quick glance at the code but one thing I saw is that you don't enable the vertex attribs with glEnableVertexAttribArray.
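For reference, a minimal sketch of the missing calls, placed next to the glVertexAttribPointer calls from the first post (attribute indices 0-2 as bound there; this assumes a current GL context and is not a complete program):

```cpp
// Generic vertex attribute arrays are disabled by default; each index
// passed to glVertexAttribPointer has to be enabled explicitly. This is
// the generic-attribute counterpart of the old glEnableClientState calls.
glEnableVertexAttribArray(0); // position ("vertex")
glEnableVertexAttribArray(1); // normal
glEnableVertexAttribArray(2); // texcoord ("tcoords")
```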

#3 japro   Members   -  Reputation: 887

Posted 25 August 2011 - 06:36 AM

Ok, I inserted those and apparently now have a problem with the way the color gets passed?

I feel like I have some grave misunderstanding of how the colors work.

I accidentally uncommented a line from an earlier version in the fragment shader, so it read:

#version 130
in vec4 vcolor;
void main (void)
{   
   gl_FragColor = color*vcolor;  
}

This resulted in the whole geometry being painted white. But why doesn't the use of "color", which isn't declared anywhere, produce an error from the compiler? Also, changing the line to "gl_FragColor = vec4(1,1,1,1);" gives me a black screen again? WTF

Edit: forget it, apparently I'm just not seeing the error and the fragment shader isn't being used at all or something.

Edit2: I'm now completely confused. Not using the shader program at all still paints the geometry white. How does the pipeline know how to use my "custom" attributes? Or does attribute 0 just get interpreted as vertex data anyway?

#4 japro   Members   -  Reputation: 887

Posted 25 August 2011 - 06:53 AM

Ok, so I "figured it out", and by that I mean that it works now and I don't know what was actually wrong...

#5 japro   Members   -  Reputation: 887

Posted 25 August 2011 - 07:48 AM

Ok, one last question. gl_FragColor is deprecated and I'm supposed to declare my own "out" variable. So I just put "out vec4 FragColor;" into the shader, used it, and it magically works. But this confuses me, shouldn't I have to do something like glBindFragDataLocation(...) and then somehow tell OpenGL what it is supposed to do with "FragColor"? What happens if I declare multiple such outputs?

#6 Shintah   Members   -  Reputation: 109

Posted 25 August 2011 - 08:22 AM

The GLSL spec just says:

Both gl_FragColor and gl_FragData are deprecated; the preferred usage is to explicitly declare these outputs in the fragment shader using the out storage qualifier.


so I guess it's just magic. :P

Not sure what happens if you declare multiple outputs without using them. In the cases where you actually want more than one color output you have to specify this with glDrawBuffers and/or with layout qualifiers in the GLSL shader. I'm not entirely sure about all the details, so check the GL documentation or the GLSL spec.
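To make the multiple-output case concrete, a hedged sketch (the output names and the render-target setup are hypothetical, and the layout qualifiers need #version 330 or the ARB_explicit_attrib_location extension rather than the #version 130 used above):

```cpp
// In the fragment shader, each out variable gets an explicit location:
//     layout(location = 0) out vec4 scene_color;   // hypothetical name
//     layout(location = 1) out vec4 normal_buffer; // hypothetical name
// On the application side, map those locations to FBO color attachments:
GLenum buffers[2] = { GL_COLOR_ATTACHMENT0, GL_COLOR_ATTACHMENT1 };
glDrawBuffers(2, buffers);
```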

#7 karwosts   Members   -  Reputation: 840

Posted 25 August 2011 - 09:16 AM

But this confuses me, shouldn't I have to do something like glBindFragDataLocation(...) and then somehow tell OpenGL what it is supposed to do with "FragColor"? What happens if I declare multiple such outputs?


If it works anything like attributes, my guess is that it just randomly assigns your N outputs to locations 0..(N-1), with 0 being the one that's picked up by glDrawBuffers by default. If you only have one output it automatically goes to channel 0, so you don't have to call glBindFragDataLocation. Just a guess though.
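With #version 130 there are no layout qualifiers, so the mapping can also be made explicit from the application side. A sketch, assuming the "FragColor" name from the post above; note the call only takes effect at link time:

```cpp
// Explicitly assign the fragment shader output "FragColor" to color number 0.
// With a single output this matches what the implementation picks by default.
glBindFragDataLocation(program, 0, "FragColor");
glLinkProgram(program); // frag data bindings take effect when the program links
```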

Regarding all your other problems, it seems like you have either no or very poor error checking. I can't comment on what's in 'print_log' because I can't see it, but I can see that you're not checking the program link infolog at all, so you could be missing some problems there.

Brush up on the info here, it may help you.

http://www.lighthouse3d.com/tutorials/glsl-tutorial/troubleshooting-the-infolog/

#8 Jacob Jingle   Members   -  Reputation: 223

Posted 26 August 2011 - 12:47 AM

I haven't done that much "modern OpenGL" stuff so far, so now I'm trying to familiarize myself with the proper way to do things. But this is getting somewhat frustrating, since searching only seems to give me lots of tutorials that use all the stuff that is deprecated in OpenGL 3, and lots of forum posts telling me that it is deprecated... Searching for examples (you know, REALLY basic ones) is also no help, since everyone who writes OpenGL 3 examples seems to wrap all the functionality away in their own classes and whatnot.

IMHO, the best samples on the web:
http://www.g-truc.net/

Great site:
http://www.opengl-tutorial.org/

Good online book:
Learning Modern 3D Graphics Programming

#9 foobarbazqux   Members   -  Reputation: 197

Posted 26 August 2011 - 05:59 AM

GL_QUADS is deprecated, you probably want to use pairs of triangles instead.
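A minimal sketch of that conversion as a plain helper (the function name is made up for illustration): each quad (v0, v1, v2, v3) becomes the triangles (v0, v1, v2) and (v0, v2, v3), so an existing quad vertex buffer can be drawn unchanged via glDrawElements(GL_TRIANGLES, ...) with a small index buffer:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Build a GL_TRIANGLES index buffer for a vertex array laid out as
// consecutive quads (4 vertices per quad). Each quad (v0,v1,v2,v3) is
// split into the triangles (v0,v1,v2) and (v0,v2,v3).
std::vector<unsigned> quad_to_triangle_indices(std::size_t vertex_count)
{
    std::vector<unsigned> indices;
    indices.reserve(vertex_count / 4 * 6);
    for (std::size_t q = 0; q + 3 < vertex_count; q += 4) {
        unsigned v0 = static_cast<unsigned>(q);
        // first triangle of the quad
        indices.push_back(v0);
        indices.push_back(v0 + 1);
        indices.push_back(v0 + 2);
        // second triangle of the quad
        indices.push_back(v0);
        indices.push_back(v0 + 2);
        indices.push_back(v0 + 3);
    }
    return indices;
}
```

The winding order of both triangles matches the quad's, so back-face culling behaves the same as before.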

#10 stalef   Members   -  Reputation: 433

Posted 26 August 2011 - 06:15 AM

This site helped me a lot when I also wanted to do more modern OpenGL:

http://nopper.tv/opengl.html

#11 Eric Lengyel   Crossbones+   -  Reputation: 2157

Posted 27 August 2011 - 01:49 AM

GL_QUADS is deprecated, you probably want to use pairs of triangles instead.


This is one of the biggest mistakes ever made by the ARB. Quads are supported directly by the hardware, and there are good reasons to use them instead of pairs of triangles for various things.



