blueshogun96

OpenGL Learning GLSL all over again



Sorry to start another thread about GLSL, but right now I'm getting confused by GLSL 150.  It looks like quite a bit has changed since the last time I touched GLSL.

 

What I'm really having a hard time figuring out (from the few, obscure tutorials I can find on the net) is how OpenGL knows what the vertex or fragment colour is.  There's no more gl_FragColor, and I've seen the output colour variable named differently all over the place.  I looked through the OpenGL code and I don't see anything that specifies what the fragment colour is.  This is really confusing, and I haven't found much straightforward information either.  Almost everything uses GLSL 330 or above, which I can't use since I'm confined to OpenGL 3.2 on Mac OS X.  Any help is appreciated.

 

Shogun.

 

EDIT: Let me explain what I'm talking about in greater detail.

 

See this vertex program?

 

#version 150
 
in  vec3 in_Position;
in  vec3 in_Color;
out vec3 ex_Color;
 
void main(void)
{
	ex_Color = in_Color;
	gl_Position = vec4(in_Position, 1.0);
}

 

The vertex colour is set in ex_Color.  I've seen this name differ between other vertex programs.  in_Color is already bound; that much I understand.

 

p = glCreateProgram();

/* These bindings must be made before glLinkProgram() is called */
glBindAttribLocation(p, 0, "in_Position");
glBindAttribLocation(p, 1, "in_Color");

 

But how does OpenGL know that ex_Color is the fragment colour?  Just because it has out vec3 in front of it as its type?  How would it know the difference between this and other vertex attributes then?

 

The same with the fragment program...

 

#version 150
 
in  vec3 ex_Color;
out vec4 out_Color;
 
void main(void)
{
	out_Color = vec4(ex_Color,1.0);
}

 

This time ex_Color is the input colour.  How does OpenGL know the difference?  I'm sorry if this is really obvious, but IMO this doesn't look very straightforward compared to older versions of GLSL.

Edited by blueshogun96


I believe OpenGL handles all of this during glLinkProgram().  It matches the outputs of the vertex shader to the inputs of the fragment shader by name... and complains (i.e. fails to link) if anything doesn't line up.  Older versions of GLSL had varying for this.  I believe it's the same concept... just new keywords.
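For reference, a rough sketch (not taken from the thread's own code) of the same interpolated value declared in both styles:

```glsl
// Old style (GLSL 1.20 and earlier): the same 'varying' declared
// in both the vertex and the fragment shader:
//     varying vec3 ex_Color;

// New style (GLSL 1.50): 'out' in the vertex shader...
out vec3 ex_Color;
// ...matched by name at link time to an 'in' in the fragment shader:
//     in vec3 ex_Color;
```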

 

The only output of a fragment shader is going to be a vec4, since you are writing a color (or value) to a render target.


The only output of a fragment shader is going to be a vec4, since you are writing a color (or value) to a render target.

Also depth.


The only output of a fragment shader is going to be a vec4, since you are writing a color (or value) to a render target.

Also depth.

 

And you can have multiple render targets.

 

But in this case, since there is only one vec4 output, and since (in general) outputting colour is something you will always be doing, GLSL is clever enough to figure out for itself that this single output must be the colour written to draw buffer 0.  Personally I'm apprehensive about any kind of mysterious behind-the-scenes magic, but that's the decision the ARB have made for GLSL.

 

However, and following a quick read through the GLSL 150 spec, gl_FragColor is still listed, but marked as deprecated.  That alone doesn't mean you can't use it - it just means it may be removed in a future version.  Note, though, that deprecated features are removed from the core profile, and OpenGL 3.2 on Mac OS X only comes as a core profile context, so gl_FragColor won't actually be available there.

Edited by mhagain


It seems you are looking for glBindFragDataLocation, which tells GLSL which output variable writes to which color target. By default, if only a single output is specified in the fragment shader, that variable is chosen as the output to the (first) bound color buffer.
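A minimal sketch of how that call might be used with the program object p from earlier in the thread (the colour number 0 and the verification call are just illustrative):

```c
/* Must be called before glLinkProgram() for it to take effect */
glBindFragDataLocation(p, 0, "out_Color");  /* out_Color -> draw buffer 0 */
glLinkProgram(p);

/* Optionally verify the assignment after linking */
GLint colour_number = glGetFragDataLocation(p, "out_Color");
```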

 

Depth is a separate case, since only a single depth buffer can be bound at a time. No separate variable needs to be declared, and you are not required to write the depth manually. If you wish to do so anyway, you can use the built-in gl_FragDepth.
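For example, a fragment shader that does write the depth manually might look like this (the scale factor is purely illustrative):

```glsl
#version 150

in  vec3 ex_Color;
out vec4 out_Color;

void main(void)
{
    out_Color    = vec4(ex_Color, 1.0);
    gl_FragDepth = gl_FragCoord.z * 0.5;  // push fragments toward the near plane
}
```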


Okay, thanks guys, I think this makes a bit more sense now.  Now I can go forth a bit more confident trying to master these newer versions of GLSL.

 

Shogun.
