[Solved] OpenGl Shader Output

Started by
3 comments, last by Capoeirista 6 years, 9 months ago

Hey folks,

Seeing some odd behaviour with a very simple OpenGL shader I'm trying to get working.

Here's the vertex shader:


#version 400

////////////////////
// Input Variables
////////////////////

in vec3 inputPosition;
in vec3 inputColor;


////////////////////
// Output Variables
////////////////////

out vec3 outputColor;


////////////////////
// Uniform Variables
////////////////////

uniform mat4 worldMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;


////////////////////
// Vertex Shader
////////////////////

void main( void )
{
	// calculate the position of the vertex using world/view/projection matrices
	gl_Position = worldMatrix * vec4( inputPosition, 1.0f );
	gl_Position = viewMatrix * gl_Position;
	gl_Position = projectionMatrix * gl_Position;
	
	// store the input colour for the fragment shader
	outputColor = vec3( 1.0f, 1.0f, 1.0f );//inputColor;
}

And this is the fragment shader:


#version 400

////////////////////
// Input Variables
////////////////////

in vec3 inputColor;


////////////////////
// Output Variables
////////////////////

out vec4 outputColor;


////////////////////
// Fragment Shader
////////////////////

void main( void )
{
	outputColor = vec4( inputColor, 1.0f );//vec4( 1.0f, 1.0f, 1.0f, 1.0f);
}

If I try to use 'inputColor' in the fragment shader, everything renders black... however, if I explicitly set the colour (as in the commented-out code) everything renders correctly... so there's a breakdown in communication between my vertex and fragment shaders.

Is there anything obvious I'm missing in the shader code? I'm not seeing any compilation warnings on the C++ side of things, and all of the world/view/projection matrices are being passed in correctly.

Thanks!


I had a quick look at the fragment shader using NSight, and while the vertex shader looks like it has valid input parameters, I couldn't see anything relating to color in the fragment shader.

I've attached a couple of screen-caps from NSight.

 

vertexInputs.png

fragmentInputs.png

The 'out' variables in your vertex shader must match the 'in' variables in your fragment shader.

 

So basically, if you do this in your vs:


out vec3 outputColor;

outputColor = vec3( 1.0f, 1.0f, 1.0f );

In your fs, the input must be called outputColor, not inputColor.

But when you do this in your fs:


in vec3 inputColor;

the linker expects the vs to have an output called inputColor, which it doesn't (the vs does have inputColor, but only as a vertex input, and vertex inputs don't count for this matching).

 

This is called interface matching, and is depicted here.
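Applied to the shaders from the original post, a minimal fix is to give the varying the same name and type in both stages. The name `vertexColor` here is just a choice for this sketch; any name works as long as both stages agree:

```glsl
#version 400

// Vertex shader: the output name must match the fragment shader's input name
in vec3 inputPosition;
in vec3 inputColor;

out vec3 vertexColor; // same name and type as the fragment shader's 'in'

uniform mat4 worldMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;

void main( void )
{
	gl_Position = projectionMatrix * viewMatrix * worldMatrix * vec4( inputPosition, 1.0 );
	vertexColor = inputColor;
}
```

```glsl
#version 400

// Fragment shader: 'in' matches the vertex shader's 'out' by name and type
in vec3 vertexColor;

out vec4 outputColor;

void main( void )
{
	outputColor = vec4( vertexColor, 1.0 );
}
```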

9 hours ago, Capoeirista said:

I had a quick look at the fragment shader using NSight, and while the vertex shader looks like it has valid input parameters, I couldn't see anything relating to color in the fragment shader.

 

I don't know NSight, but I would expect a GPU development tool to at least warn about fragment shader inputs that aren't connected to any output from a previous shader stage.

6 hours ago, _Silence_ said:

Your 'out' variables in your vertex shaders must match the 'in' variables in your fragment shaders.

Oh damn, thanks for that!

I assumed that OpenGL would map the 'out' variable from the vertex shader to the 'in' variable of the fragment shader as long as the types matched; I had no idea the variable names also had to match. Thank you, this was driving me a bit insane!
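For what it's worth, matching by name is not the only option. With explicit location qualifiers on stage interface variables (core in GLSL 4.10, or via ARB_separate_shader_objects) the stages are matched by location instead, so the names are free to differ. A sketch, assuming a GLSL 4.10-capable driver (shown as fragments from the two shader files):

```glsl
// In the vertex shader (GLSL 4.10+): matched by location, not name
layout( location = 0 ) out vec3 outputColor;

// In the fragment shader: a different name is fine,
// as long as the location and type are the same
layout( location = 0 ) in vec3 inputColor;
```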

This topic is closed to new replies.
