
Weird shader error at run time


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation, start a new topic.

4 replies to this topic

#1 Enerjak   Members   -  Reputation: 235

Posted 24 May 2013 - 03:34 AM

OK, for some reason I can't compile my shader for OpenGL; the compiler reports this error:

http://puu.sh/303nE.jpg

Here are my vertex and fragment shaders.

 

shader.vert

#version 150 core

in vec3 in_Position;
in vec3 in_Color;

out vec3 pass_Color;

void main(void)
{
    gl_Position = vec4(in_Position, 1.0);
    pass_Color = in_Color;
}

 

shader.frag

#version 150 core

in vec3 pass_Color;

out vec4 out_Color;

void main(void)
{
    out_Color = vec4(pass_Color, 1.0);
}

 

So, I don't know why this isn't compiling. Also, my NVIDIA GTX 450 card doesn't seem to support wglewCreateContext, which is weird. Is there any way to be sure it doesn't support it? (I would think it would be common among cards of this day to support this.)

 

if (wglewIsSupported("WGL_ARB_Create_Context") == 1)
{
    this->m_GLRC = ::wglCreateContextAttribsARB(this->m_deviceContext, NULL, attributes);
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(m_GLRC);
    wglMakeCurrent(m_deviceContext, m_GLRC);
    MessageBoxA(NULL, "context created", NULL, MB_OK);
}
	

 

 




#2 radioteeth   Prime Members   -  Reputation: 1146

Posted 24 May 2013 - 11:08 AM

Try changing "WGL_ARB_Create_Context" to "WGL_ARB_create_context".

 

I'm not sure why the shader isn't compiling. All I can think of is that you're not transforming the vec3 by the projection/modelview matrices, or any matrices at all, but I don't know whether doing otherwise would be invalid, such as casting it to a vec4.



#3 Enerjak   Members   -  Reputation: 235

Like
0Likes
Like

Posted 24 May 2013 - 02:44 PM

try changing the "WGL_ARB_Create_Context" to "WGL_ARB_create_context".

 

I'm not sure why the shader isn't compiling. All I can think is that you're not transforming the vec3 against the projection/modelview matrices, or any matrices for that matter, but I wouldn't know if doing anything otherwise would be invalid - such as casting to a vec4.

Thanks, that worked, but it created an OpenGL 1.1 context for some reason? Weird.



#4 radioteeth   Prime Members   -  Reputation: 1146

Posted 25 May 2013 - 02:04 PM

Are your drivers correct?



#5 marcClintDion   Members   -  Reputation: 431

Posted 04 June 2013 - 03:45 AM

I ran your shaders through AMD's GPU ShaderAnalyzer and it verifies that your code is correct. I may get mobbed for saying this, but even though the Khronos Group recommends always adding #version to the shader program, I've noticed from testing everything I write on multiple GPU brands that sometimes a shader will fail even when the code is valid. It seems that some drivers reject the shader because they don't like the version number. Try commenting it out and see what happens.









