Tocs1001

OpenGL Shaders causing problems.


So I wanted to move my OpenGL version up before continuing my project, and I'm now creating a 3.1 context. This meant changing "varying" and "attribute" to "in" and "out", removing those ugly-looking gl_FragData[]s from the shaders, and generally changing a lot of things. But it's something I wanted to do...

Anyhow, upgrading caused nearly everything to stop working. I gradually fixed things, but I still can't get my shaders to work. To test with as few influences as possible, I'm trying to render a fullscreen quad with a solid color, but through a shader, so I can make sure the shaders are functioning properly.

I'm rendering the quad with a VBO, IBO, VAO combo. The vertices have a position, a normal (used later for position reconstruction from depth, which is what the quad is really for), and a texture coordinate. Position is vertex attribute 0, texture coordinates are vertex attribute 1, and normals are vertex attribute 2.

When I render the quad without the shader activated, a white fullscreen quad appears, which suggests to me the geometry is in the correct place. But just in case, here's how I render the geometry.

[code]
glBindVertexArray (VAOName); //Bind the VAO
glBindBuffer (GL_ARRAY_BUFFER,VBName); //Bind the VBO
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER,IBName); //Bind the IBO
glDrawElements (GL_TRIANGLES,TriangleCount*3,GL_UNSIGNED_SHORT,0); //DRAW IT ALL!
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER,0); //Unbind the IBO
glBindBuffer (GL_ARRAY_BUFFER,0); //Unbind the VBO
glBindVertexArray (0); //Unbind the VAO
[/code]
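For context, here is a sketch of what such a fullscreen quad's buffer contents might look like (names and values are illustrative, not the original code's): the positions are already in normalized device coordinates, so no projection is needed, and the index count matches the glDrawElements call above (TriangleCount*3 with GL_UNSIGNED_SHORT).

```cpp
#include <cstdint>
#include <vector>

// Hypothetical layout for the fullscreen quad described above. Positions are
// already in normalized device coordinates, so the vertex shader can pass
// them through without projecting. Normal values here are placeholders for
// the frustum rays.
struct QuadVertex {
    float position[4]; // vertex attribute 0
    float texCoord[2]; // vertex attribute 1
    float normal[3];   // vertex attribute 2 (frustum ray)
};

std::vector<QuadVertex> fullscreenQuadVertices() {
    return {
        {{-1.f, -1.f, 0.f, 1.f}, {0.f, 0.f}, {0.f, 0.f, 1.f}}, // bottom-left
        {{ 1.f, -1.f, 0.f, 1.f}, {1.f, 0.f}, {0.f, 0.f, 1.f}}, // bottom-right
        {{ 1.f,  1.f, 0.f, 1.f}, {1.f, 1.f}, {0.f, 0.f, 1.f}}, // top-right
        {{-1.f,  1.f, 0.f, 1.f}, {0.f, 1.f}, {0.f, 0.f, 1.f}}, // top-left
    };
}

// Two triangles; the count passed to glDrawElements is TriangleCount*3 == 6,
// and GL_UNSIGNED_SHORT matches uint16_t indices.
std::vector<std::uint16_t> fullscreenQuadIndices() {
    return {0, 1, 2, 0, 2, 3};
}
```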


So I'm guessing it's either the GLSL itself or how I'm using the shaders.

Here's the fragment shader:

[code]
#version 130
precision highp float;

uniform sampler2D Texture;

in vec4 Position;
in vec2 TextureCoordinate;

out vec4 FragColor;

void main ()
{
    FragColor = vec4(1,1,1,1); //texture2D (Texture,TextureCoordinate);
}
[/code]


Here's the vertex shader:

[code]
#version 130
precision highp float;

in vec4 InPosition;
in vec2 InTextureCoordinate;
in vec3 InNormal;

out vec4 Position;
out vec3 FrustrumRay;
out vec2 TextureCoordinate;

void main()
{
    TextureCoordinate = InTextureCoordinate;
    Position = InPosition; //It's a fullscreen quad, so there's no reason to project; I don't multiply by anything
    FrustrumRay = InNormal;
}
[/code]


My shaders compile fine, but I'm guessing the problem is in the shader program...

Before linking, I explicitly bind the FragColor output to 0 with glBindFragDataLocation(), even though this happens automatically.
Also before linking, I set my attribute locations with glBindAttribLocation(), even though they should be indexed appropriately because of the order I declare them in the shader.
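(As an aside, a sketch of an alternative, assuming the context were later moved up to 3.3, not the 3.1/#version 130 setup used here: GLSL 330 can pin attribute locations in the shader itself, so the glBindAttribLocation calls never need to be kept in sync.)

```glsl
#version 330
// Explicit locations (GLSL 330+, or 130 with ARB_explicit_attrib_location);
// these make the glBindAttribLocation calls before linking unnecessary.
layout(location = 0) in vec4 InPosition;
layout(location = 1) in vec2 InTextureCoordinate;
layout(location = 2) in vec3 InNormal;
```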

The shader links without errors, yet the quad is not displayed. I've been stuck on this for several days and have checked wikis, forum posts, and various other sources. I feel like I'm missing something simple but can't find it. Any help would be appreciated.

I've included pastebin links to the actual source files in case the problem is hidden somewhere in my stupid code; feel free to insult its horribleness.

ShaderProgram.cpp [url="http://pastebin.com/XAHWeW9K"]http://pastebin.com/XAHWeW9K[/url]
ShaderCode.cpp [url="http://pastebin.com/kMMBNV0g"]http://pastebin.com/kMMBNV0g[/url]
ObjectBuffer.cpp [url="http://pastebin.com/Yshez9DV"]http://pastebin.com/Yshez9DV[/url]
FullScreenQuad.cpp [url="http://pastebin.com/saZAkzMA"]http://pastebin.com/saZAkzMA[/url]


*GLErrorCheck is a macro that checks glGetError () and outputs the file, line number and error type.

[code]
//Actual usage in the render loop

FullScreenProgram->BeginUse ();
//FullScreenProgram->SetTexture (ScreenTex,"Texture");
ScreenQuad.RenderGeometry ();
FullScreenProgram->EndUse ();
[/code]


Again any help would be amazing because I am officially stumped.

Don't you still have to assign a position to gl_Position in the vertex shader? Otherwise, how does OpenGL know that "Position" is supposed to be your vertex position, as opposed to any other arbitrary varying variable? You're assigning to "Position", but you never do anything with it.
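With that fix, using the names from the vertex shader above, main() would become (note the added gl_Position write):

```glsl
void main()
{
    TextureCoordinate = InTextureCoordinate;
    Position = InPosition;
    FrustrumRay = InNormal;
    gl_Position = InPosition; // required: feeds the fixed-function clipping/rasterization stage
}
```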

Also:
[code]
glBindVertexArray (VAOName);
//glBindBuffer (GL_ARRAY_BUFFER,VBName); <--- USELESS
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER,IBName);
glDrawElements (GL_TRIANGLES,TriangleCount*3,GL_UNSIGNED_SHORT,0); //DRAW IT ALL!
glBindBuffer (GL_ELEMENT_ARRAY_BUFFER,0);
//glBindBuffer (GL_ARRAY_BUFFER,0); <--- USELESS
glBindVertexArray (0);
[/code]

You don't need to bind the vertex buffer to draw from it. Once you set the attribute pointer, OpenGL doesn't care which buffer is currently bound; this is why you're allowed to draw with attributes from multiple buffers. The two lines I've marked don't do anything for you.

Ah!

That worked; it was something stupidly simple. I naively assumed I was supposed to remove all the gl_* stuff from my shaders, but I didn't think about how OpenGL would know which output was the position of each vertex. I find it strange, though, that it uses gl_Position, because with fragment data you can assign user-defined variables for the outputs. I'd think it would be the same across the board.

Thank you so much.

gl_Position is still required because it feeds fixed-function hardware (clipping and rasterization) that needs the position data, unlike the fragment shader outputs, which are flexible (you can rebind user-defined fragment outputs as you like).

