too_many_stars

Shader Attribute Help


Hello everyone,

I am working through the tutorials on this site: http://www.opengl-tutorial.org/beginners-tutorials/tutorial-8-basic-shading/ and am having some issues passing my normals to the vertex shader to get diffuse lighting working.

Here's a simple triangle:

Triangle::Triangle( const glm::vec3& a , const glm::vec3& b, const glm::vec3& c ):
		angle(), vert_handle( 0 ), vao_handle(0), normal_handle( 0 ), pos(){

	GLfloat verts[] = { a.x , a.y , a.z ,
						b.x , b.y , b.z , 
						c.x , c.y , c.z};

	//all three vertices share the same face normal, pointing down +z
	GLfloat vert_normals[] = { 0.0f , 0.0f , 1.0f,
								0.0f , 0.0f , 1.0f,
								0.0f , 0.0f , 1.0f};

	//upload the positions into their own buffer
	glGenBuffers( 1 , &vert_handle );
	glBindBuffer( GL_ARRAY_BUFFER , vert_handle );
	glBufferData( GL_ARRAY_BUFFER , sizeof( verts ) , verts , GL_STATIC_DRAW );

	//upload the normals into a second buffer
	glGenBuffers( 1, &normal_handle );
	glBindBuffer( GL_ARRAY_BUFFER , normal_handle );
	glBufferData( GL_ARRAY_BUFFER , sizeof( vert_normals ) , vert_normals , GL_STATIC_DRAW );

	//record both attribute bindings in a VAO:
	//location 0 = position, location 1 = normal
	glGenVertexArrays( 1 , &vao_handle );
	glBindVertexArray( vao_handle );

	glBindBuffer( GL_ARRAY_BUFFER , vert_handle );
	glVertexAttribPointer( 0 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) NULL );

	glBindBuffer( GL_ARRAY_BUFFER , normal_handle );
	glVertexAttribPointer( 1 , 3 , GL_FLOAT , GL_FALSE , 0 , (void*) NULL );
}
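
(Side note: for these attribute pointers to actually feed the shader at draw time, both arrays also need to be enabled while the VAO is bound, either here or in the draw code. A minimal sketch, using the locations 0 and 1 from above:)

	//enable both attribute arrays on the bound VAO; without this,
	//the glVertexAttribPointer state is recorded but the attributes
	//are never fetched when drawing
	glEnableVertexAttribArray( 0 );		//vertex_position
	glEnableVertexAttribArray( 1 );		//vertex_normal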

Creation of my GLSL program:

glsl = GLSLProgram( new GLSL() );
	glsl->compileShaders( "shaders/vert_shader.txt", "shaders/frag_shader.txt" );
	
	glsl->addAttribute("vertex_position");
	glsl->addAttribute("vertex_normal");
	
	glsl->linkShaders();
	glsl->printActiveAttributes();
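
(For context, addAttribute is assumed here to be a thin wrapper over glBindAttribLocation, binding each name to the next free index before the link. A sketch of that assumed implementation; program_id and num_attributes are hypothetical member names of the GLSL wrapper class:)

	//assumed implementation: bind the named attribute to the next
	//unused location; this must happen before glLinkProgram
	void GLSL::addAttribute( const std::string& name ){
		glBindAttribLocation( program_id , num_attributes++ , name.c_str() );
	}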

And the vertex shader:

#version 400

in vec3 vertex_position;	//in local space
in vec3 vertex_normal; 		//in local space

out vec3 frag_color;

uniform mat4 mvp_matrix;		// model * view * projection matrix
uniform mat4 model_matrix;		//local to world space model matrix
uniform mat4 view_matrix;  		//camera matrix
uniform vec3 light_pos_world; 	//light position in world space 


void main(){
	
	
	vec3 world_vert_pos 	= (model_matrix * vec4( vertex_position , 1 )).xyz; 				//get the vertex in world position
	vec3 vert_pos_camera	= (view_matrix * vec4( vertex_position , 1 )).xyz; 					//get the vertex in camera space
	vec3 to_camera 			= vec3( 0 , 0 , 0 ) - vert_pos_camera; 								//vector from vertex in camera space to the eye ( which is located at the origin)
	vec3 light_pos_camera 	= (view_matrix * vec4( light_pos_world , 1 )).xyz; 					//get the light position in camera space
	vec3 light_dir_camera 	= light_pos_camera + to_camera;										
	vec3 normal_in_camera 	= (view_matrix * model_matrix * vec4( vertex_normal , 0 )).xyz;		//get the normal in camera space ( note that w = 0 to signify it's a direction not a position!)
	
	gl_Position = mvp_matrix * vec4( vertex_position , 1 ); 
}

My issue is that the GLSL program is telling me that my normal attribute is not active. However, per my understanding of the documentation, since I am using "vertex_normal" in my vertex shader, that attribute should be active, but it's not.
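
(The "active" check can also be reproduced by hand after linking: glGetAttribLocation returns -1 for any attribute the linker dropped as inactive. A minimal sketch, with program_id standing in for whatever handle the wrapper stores:)

	//query each attribute by name after glLinkProgram;
	//-1 means the attribute is not active in the linked program
	GLint pos_loc  = glGetAttribLocation( program_id , "vertex_position" );
	GLint norm_loc = glGetAttribLocation( program_id , "vertex_normal" );
	printf( "position: %d  normal: %d\n" , pos_loc , norm_loc );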

Of course, without the normals, I can't get the light to work. Could someone please tell me what the issue is?

 

Thanks,

 

Mike

48 minutes ago, too_many_stars said:

However, per my understanding of the documentation, since I am using "vertex_normal" in my vertex shader, that attribute should be active, but it's not.

You are referencing vertex_normal in your vertex shader, but you aren't actually doing anything with the result. The compiler has very helpfully noticed this, and optimised your shader by removing all that unnecessary code.

It should work fine if you start writing the data out to your fragment shader, and use it as part of calculating the fragment colour. 


Thanks for the reply, Swiftcoder. I did as you said, but all I get is a black triangle, and still only the position attribute is reported as active.

Please note that in the vertex shader the normals come in via "in vec3 vertex_normal".

I am using them in the calculation of "normal_in_camera" on the second-to-last line, which goes out to the fragment shader.

New vertex shader:

#version 400

in vec3 vertex_position;	//in local space
in vec3 vertex_normal; 		//in local space

out vec3 light_dir_camera;
out vec3 normal_in_camera;

uniform mat4 mvp_matrix;		// model * view * projection matrix
uniform mat4 model_matrix;		//local to world space model matrix
uniform mat4 view_matrix;  		//camera matrix
uniform vec3 light_pos_world; 	//light position in world space 


void main(){
	
	
	vec3 world_vert_pos 	= (model_matrix * vec4( vertex_position , 1 )).xyz; 				//get the vertex in world position
	vec3 vert_pos_camera	= (view_matrix * vec4( vertex_position , 1 )).xyz; 					//get the vertex in camera space
	vec3 to_camera 			= vec3( 0 , 0 , 0 ) - vert_pos_camera; 								//vector from vertex in camera space to the eye ( which is located at the origin)
	vec3 light_pos_camera 	= (view_matrix * vec4( light_pos_world , 1 )).xyz; 					//get the light position in camera space
	vec3 light_dir_camera 	= light_pos_camera + to_camera;										//not sure about this
	vec3 normal_in_camera 	= (view_matrix * model_matrix * vec4( vertex_normal , 0 )).xyz;		//get the normal in camera space ( note that w = 0 to signify it's a direction not a position!)
	
	gl_Position = mvp_matrix * vec4( vertex_position , 1 ); 
}

Fragment shader:

#version 400

in vec3 normal_in_camera;
in vec3 light_dir_camera;

out vec3 frag_color;

void main(){
	
	vec3 light_color = vec3( 1 , 1 , 1 );
	vec3 material_color = vec3( 0.2 , 0 , 0.2 );
	
	vec3 n = normalize( normal_in_camera );
	vec3 l = normalize( light_dir_camera );
	float angle = clamp( dot( n , l ) , 0 , 1 );

	frag_color = material_color * light_color * angle;
	
}

Mike


You aren't actually assigning the output variable normal_in_camera in your vertex shader. You are declaring a new local variable with the same name, and assigning to that (which is immediately thrown away at the end of the function, and hence optimised out).

Are you printing out the shader compilation/linking log? The compiler ought to be warning about these things.
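
Concretely, dropping the vec3 from the last two assignments makes them write to the out variables instead, something like:

	//assign to the 'out' variables declared at the top of the shader,
	//rather than declaring new locals that shadow them
	light_dir_camera = light_pos_camera + to_camera;
	normal_in_camera = (view_matrix * model_matrix * vec4( vertex_normal , 0 )).xyz;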


Bah! You are absolutely correct, that was an embarrassing mistake. I am still trying to get used to all the variables and how information flows between the OpenGL program and the shaders themselves.

And yes, I am printing out the error logs, but nothing flagged the assignment.

 

Thank you again so much!

 

Mike

13 minutes ago, too_many_stars said:

And yes, I am printing out the error logs, but nothing flagged the assignment.

Sadly the GLSL compiler is implemented differently by each driver vendor, and some provide... less detailed error messages than others.

It may be worth adding Khronos' glslangValidator to your workflow.
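
It compiles shaders from the command line with the Khronos reference front-end. Since your files use .txt rather than the usual .vert/.frag extensions, the stage can be forced with -S (a sketch, untested against your project layout):

	glslangValidator -S vert shaders/vert_shader.txt
	glslangValidator -S frag shaders/frag_shader.txt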

