Shaders causing problems.

So I wanted to move my OpenGL version up from what it was before continuing my project, and I'm now creating a 3.1 context. This meant changing "varying" and "attribute" to "in" and "out", removing those ugly-looking gl_FragData[]s from the shaders, and generally changing a lot of things. But it's something I wanted to do...
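
For reference, the kind of change I mean looks roughly like this (the names are just for illustration):

//Old style (GLSL 1.20):
attribute vec4 InPosition; //vertex shader input
varying vec2 TextureCoordinate; //passed from vertex to fragment shader
//...and the fragment shader wrote to gl_FragData[0]

//New style (GLSL 1.30+):
in vec4 InPosition; //vertex shader input
out vec2 TextureCoordinate; //vertex shader output, "in" on the fragment side
out vec4 FragColor; //user-defined fragment shader output instead of gl_FragData[0]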

Anyhow, upgrading caused nearly everything to stop working. I've gradually fixed things, but I still can't get my shaders to work. To test with the fewest possible outside influences, I'm trying to render a fullscreen quad in a solid color, but with a shader active, so I can make sure the shaders are functioning properly.
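
The quad is just the clip-space corners, so no matrix is needed. Roughly (this is a sketch; the actual data is in FullScreenQuad.cpp, linked below):

const float QuadPositions[4][3] = {
    {-1.0f, -1.0f, 0.0f},
    { 1.0f, -1.0f, 0.0f},
    { 1.0f,  1.0f, 0.0f},
    {-1.0f,  1.0f, 0.0f},
};
const unsigned short QuadIndices[6] = { 0, 1, 2,  0, 2, 3 }; //TriangleCount == 2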

I'm rendering the quad with a VBO/IBO/VAO combo. The vertices each have a position, a normal (used later for position reconstruction from depth, which is what the quad is really for), and a texture coordinate. Position is vertex attribute 0, texture coordinates are vertex attribute 1, and normals are vertex attribute 2.
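
The attribute setup looks roughly like this (MyVertex is a stand-in for my actual vertex struct):

struct MyVertex { float Position[3]; float TexCoord[2]; float Normal[3]; };

glBindVertexArray (VAOName);
glBindBuffer (GL_ARRAY_BUFFER, VBName);
glEnableVertexAttribArray (0); //positions
glVertexAttribPointer (0, 3, GL_FLOAT, GL_FALSE, sizeof (MyVertex), (void*)offsetof (MyVertex, Position)); //offsetof from <cstddef>
glEnableVertexAttribArray (1); //texture coordinates
glVertexAttribPointer (1, 2, GL_FLOAT, GL_FALSE, sizeof (MyVertex), (void*)offsetof (MyVertex, TexCoord));
glEnableVertexAttribArray (2); //normals (frustum rays)
glVertexAttribPointer (2, 3, GL_FLOAT, GL_FALSE, sizeof (MyVertex), (void*)offsetof (MyVertex, Normal));
glBindVertexArray (0);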

When I render the quad without the shader active, a white fullscreen quad appears, which suggests to me that the geometry is in the correct place. But just in case, here's how I render the geometry:



glBindVertexArray (VAOName); //Bind the VAO
glBindBuffer (GL_ARRAY_BUFFER, VBName); //Bind the VBO
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER, IBName); //Bind the IBO
glDrawElements (GL_TRIANGLES, TriangleCount*3, GL_UNSIGNED_SHORT, 0); //DRAW IT ALL!
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER, 0); //Unbind the IBO
glBindBuffer (GL_ARRAY_BUFFER, 0); //Unbind the VBO
glBindVertexArray (0); //Unbind the VAO




So I'm guessing it's either the GLSL itself or how I'm using the shaders.

Here's the fragment shader:



#version 130
precision highp float;

uniform sampler2D Texture;

in vec4 Position;
in vec2 TextureCoordinate;

out vec4 FragColor;

void main ()
{
    FragColor = vec4(1,1,1,1); //texture2D (Texture, TextureCoordinate);
}



Here's the vertex shader:



#version 130
precision highp float;

in vec4 InPosition;
in vec2 InTextureCoordinate;
in vec3 InNormal;


out vec4 Position;
out vec3 FrustrumRay;
out vec2 TextureCoordinate;

void main()
{
    TextureCoordinate = InTextureCoordinate;
    Position = InPosition; //It's a fullscreen quad, so there's no reason to project; I don't multiply by anything
    FrustrumRay = InNormal;
}



My shaders compile fine, but I'm guessing the problem is in the shader program...

Before linking I explicitly bind the FragColor output to 0 with glBindFragDataLocation(), even though that happens automatically.
Also before linking I set my attribute locations with glBindAttribLocation(), even though they should be indexed appropriately because of the order I declare them in the shader.
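
Concretely, it's something like this (ProgramName stands in for my actual program handle):

//Before linking: pin the fragment output and the attribute locations explicitly
glBindFragDataLocation (ProgramName, 0, "FragColor");
glBindAttribLocation (ProgramName, 0, "InPosition");
glBindAttribLocation (ProgramName, 1, "InTextureCoordinate");
glBindAttribLocation (ProgramName, 2, "InNormal");
glLinkProgram (ProgramName);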

The program links without errors, yet the quad is not displayed. I've been stuck on this for several days and have checked wikis, forum posts, and various other sources. I feel like I'm missing something simple but can't find it. Any help would be appreciated.

I've included pastebin links to the actual source files in case the problem is hidden somewhere in my stupid code; feel free to insult its horribleness.

ShaderProgram.cpp http://pastebin.com/XAHWeW9K
ShaderCode.cpp http://pastebin.com/kMMBNV0g
ObjectBuffer.cpp http://pastebin.com/Yshez9DV
FullScreenQuad.cpp http://pastebin.com/saZAkzMA


*GLErrorCheck is a macro that checks glGetError () and outputs the file, line number and error type.
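
Roughly, it amounts to this (simplified; the real macro also names the error type):

#define GLErrorCheck() \
    do { \
        GLenum err = glGetError (); \
        if (err != GL_NO_ERROR) \
            fprintf (stderr, "GL error 0x%X at %s:%d\n", err, __FILE__, __LINE__); \
    } while (0)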


//Actual usage in the render loop

FullScreenProgram->BeginUse ();
//FullScreenProgram->SetTexture (ScreenTex,"Texture");
ScreenQuad.RenderGeometry ();
FullScreenProgram->EndUse ();
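
(BeginUse/EndUse are thin wrappers; simplified, they amount to something like this, with the real code in ShaderProgram.cpp above:)

void ShaderProgram::BeginUse ()
{
    glUseProgram (ProgramName); //ProgramName is the linked program object
}

void ShaderProgram::EndUse ()
{
    glUseProgram (0); //no program bound
}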



Again, any help would be amazing, because I am officially stumped.
Don't you still have to assign a position to gl_Position in the vertex shader? Otherwise, how does OpenGL know that "Position" is supposed to be your vertex position, as opposed to any other arbitrary varying variable? You're assigning to "Position", but you don't do anything with it.
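
Something like this at the end of your vertex shader should do it (untested, just to illustrate):

void main()
{
    TextureCoordinate = InTextureCoordinate;
    Position = InPosition;
    FrustrumRay = InNormal;
    gl_Position = InPosition; //the fixed-function position output still has to be written
}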

Also:

glBindVertexArray (VAOName);
//glBindBuffer (GL_ARRAY_BUFFER,VBName); <--- USELESS
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER,IBName);
glDrawElements (GL_TRIANGLES,TriangleCount*3,GL_UNSIGNED_SHORT,0); //DRAW IT ALL!
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER,0);
//glBindBuffer (GL_ARRAY_BUFFER,0); <--- USELESS
glBindVertexArray (0);


You don't need to bind the vertex buffer to draw from it; once you set the pointer, OpenGL doesn't care what buffer is bound. This is why you're allowed to draw with attributes from multiple buffers. The two lines I've marked don't do anything for you.
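
For example, this is perfectly legal (PositionBuffer and TexCoordBuffer are just illustrative names):

//glVertexAttribPointer captures whatever GL_ARRAY_BUFFER is bound at the time of the call
glBindBuffer (GL_ARRAY_BUFFER, PositionBuffer);
glVertexAttribPointer (0, 3, GL_FLOAT, GL_FALSE, 0, 0); //attribute 0 now sources from PositionBuffer
glBindBuffer (GL_ARRAY_BUFFER, TexCoordBuffer);
glVertexAttribPointer (1, 2, GL_FLOAT, GL_FALSE, 0, 0); //attribute 1 now sources from TexCoordBuffer
glBindBuffer (GL_ARRAY_BUFFER, 0); //the binding is now irrelevant; the pointers are already set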
Ah!

That worked; it was something stupidly simple. I naively assumed that I was supposed to remove all the gl_* stuff from my shaders, but didn't think about how OpenGL would know which output was the position of each vertex. I find it strange, though, that it uses gl_Position, because with fragment data you can assign user-defined variables for the outputs; I'd think it would be the same across the board.

Thank you so much.
gl_Position is still required because it ties into fixed-function hardware for position data, unlike the fragment shader output, which is flexible (you can reassign the fragment outputs).
