smallbrain

OpenGL [GLSL] Compilation problem



Hi everyone! I'm new here and would like to thank you in advance for taking the time to read about my problem. I have been programming for quite some time now, but only recently got my feet wet with OpenGL/GLSL, and as you can imagine I ran into a problem.

I made a shader with Lumina (http://lumina.sourceforge.net) and it compiles fine there, apart from some warnings about deprecated features, but when I tried using it in my own program I got this error:

0(168) : error C5508: the operator "&" is not supported by this profile

Yes, the shader uses bitwise operations, and yes, I was surprised it worked on my rather old hardware (GeForce 6600). I would instantly file this under "get a new card" if it hadn't worked in the shader editor, so I started to investigate and took a look at Lumina's source. It turns out it does pretty much what I am doing. The only real difference I can see is that Lumina loads GLEW and I do not (I might do so later; at the moment I am just trying to get something to display).

On a side note, I got this warning when compiling in Lumina earlier:

0(146) : warning C7548: >> requires "#extension GL_EXT_gpu_shader4 : enable" before use

which I fixed by adding #version 130 (#extension GL_EXT_gpu_shader4 : enable gave me an error). This also seems to be recognized when loading the shader in my own program, since it produces the same deprecation warnings.

Anyway, I investigated further, got NVIDIA's cgc compiler, and tried compiling my shader with -oglsl. It turns out that what I am seeing happens when I compile the shader with -profile fp40; with -profile gp4fp it compiles fine.

I am pretty much out of ideas here. I could not find a way to force the profile OpenGL uses to compile my shader (I am not even sure that would make sense), and I could not find any major difference between how Lumina loads the shader and how I do it. Any pointers in any direction would be greatly appreciated :)
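For reference, here is a stripped-down illustration of the kind of construct involved (the uniform name and bit layout are placeholders, not my actual shader):

#version 130
uniform uint u_flags;   // placeholder packed parameter word

void main()
{
    // ">>" and "&" on integers are what the fp40 profile rejects;
    // they need GLSL 1.30 or GL_EXT_gpu_shader4.
    uint mode = (u_flags >> 4) & 0xFu;   // extract bits 4..7
    gl_FragColor = vec4(vec3(float(mode) / 15.0), 1.0);
}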

Hodgman

To step outside the box for a moment: what are you using bitwise ops for?
GPUs are traditionally much better at floating point, so if there's an easy way to translate your logic into float-based math, your shader might run faster and be more compatible at the same time ;)
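For instance (just a sketch with made-up names, assuming the application packs two small fields into one float as value = field0 + 16.0 * field1), the shifting and masking can be emulated with floor() and mod(), which SM3-era cards like the 6600 handle fine:

#version 110
uniform float u_packed;   // hypothetical packed value set by the application

// Integer-style field extraction done with floats: floor-divide, then modulo.
// Exact as long as the packed value stays well below 2^24.
float unpackField(float value, float base, float range)
{
    return mod(floor(value / base), range);
}

void main()
{
    float field0 = unpackField(u_packed, 1.0, 16.0);   // the "low 4 bits"
    float field1 = unpackField(u_packed, 16.0, 2.0);   // the next "bit"
    gl_FragColor = vec4(field0 / 15.0, field1, 0.0, 1.0);   // debug output
}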

HuntsMan

Better post the shader source code. That card doesn't support bit operations in hardware, so the error you're seeing is correct.

smallbrain

Quote:
Original post by Hodgman
To step outside the box for a moment - what are you using bitwise ops for?
GPUs are traditionally much better at floating point, so if there's an easy way to translate your logic into float-based-math your shader might run faster and be more compatible at the same time ;)


I am playing around (or rather, trying to at the moment) with a somewhat dynamic interface that uses uniform uints to pass parameters. The bitwise operations are used to cram as much data as possible into the ints. Yes, I have absolutely no clue how that will play out performance-wise, but that's exactly why I am trying it.
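As an illustration of what I mean (the layout and names here are made up, not my actual code), a single uniform uint could carry several parameters at once:

#version 130
// Hypothetical layout: bits 0-7 material id, bits 8-11 texture slot, bit 12 lighting flag.
uniform uint u_packedParams;

void main()
{
    uint materialId = u_packedParams & 0xFFu;
    uint texSlot    = (u_packedParams >> 8) & 0xFu;
    bool lit        = ((u_packedParams >> 12) & 1u) != 0u;

    // Just a debug visualisation of the unpacked values for now.
    gl_FragColor = vec4(float(materialId) / 255.0,
                        float(texSlot) / 15.0,
                        lit ? 1.0 : 0.0,
                        1.0);
}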

Quote:
Original post by HuntsMan
Better post shader source code. That card doesn't support bit operations in hardware, so the error you're seeing is correct.


That's good to know. So I guess getting a new card will be unavoidable in the long run, but it baffles me even more how Lumina gets this to run. About the shader source: I don't think that will be necessary, since the bitwise operators are clearly the culprit and the code does nothing fancy yet anyway (it is basically a multitexture version of some fixed functionality), so it could certainly be rewritten differently, but that does not help me much.

On a side note: I just realized the author of Lumina forgot(?) to output the info log after linking in one case (likely mine...). I will recompile with the info log output in place to see if this solves the mystery of Lumina compiling my shader, but if that's the answer, what on earth was it running all the time? It responded to my changes and all. I am a very confused person right now.

Edit: Just recompiled Lumina with the info log output for glLinkProgramARB and voila... here comes the load of errors. My guess is I broke it when I moved from plain uints to uniforms (depriving the compiler of the possibility to precalculate the bit operations), but Lumina suppressed the errors and "somehow" kept running "something" that looked right. Sorry for wasting your time, guys. I really appreciate your effort. And for me it's GPU shopping time now :D

[Edited by - smallbrain on February 26, 2010 7:01:41 PM]
