Posted 02 November 2012 - 12:39 PM
you know you program too much when you start ending sentences with semicolons;
Posted 02 November 2012 - 03:29 PM
Posted 02 November 2012 - 08:37 PM
Posted 03 November 2012 - 12:38 AM
Posted 03 November 2012 - 06:05 AM
Are you sure? I'm fairly certain NV ignored #version until quite recently.
Put a #version 130 directive at the top of the shader (or another more widely supported version, such as 120);
that will make NVIDIA's drivers follow the GLSL spec instead of their own dialect.
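The advice above can be sketched in C. Here `with_version` is a hypothetical helper (not a GL API call) that prepends the directive to shader source that lacks one, before the string is handed to glShaderSource:

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: prepend a "#version N" directive to shader
 * source so the driver compiles against that GLSL spec revision
 * instead of its default dialect. Caller frees the returned string. */
char *with_version(const char *src, int version)
{
    char header[32];
    snprintf(header, sizeof header, "#version %d\n", version);

    char *out = malloc(strlen(header) + strlen(src) + 1);
    if (!out)
        return NULL;
    strcpy(out, header);
    strcat(out, src);
    return out;
}
```

Note that #version must be the first non-comment line of the shader, which is why the directive is prepended rather than appended.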
Posted 03 November 2012 - 06:40 AM
Or use GL on Apple and D3D on Windows: situations where drivers are actually required to be compliant.
I know it is hard, but if one really wants a stable application, one should check the GL/GLSL version and the GPU ID and use a different code path for each combination.
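A minimal sketch of the version check, assuming you feed it the string returned by glGetString(GL_VERSION) (e.g. "3.3.0 NVIDIA 331.20"); `parse_gl_version` is a hypothetical helper, not part of any GL header:

```c
#include <stdio.h>

/* Hypothetical helper: extract major/minor from a GL_VERSION string
 * such as "3.3.0 NVIDIA 331.20" or "2.1 INTEL-8.28.32".
 * Returns 1 on success, 0 if the string is not parseable. */
int parse_gl_version(const char *s, int *major, int *minor)
{
    return s != NULL && sscanf(s, "%d.%d", major, minor) == 2;
}
```

The parsed pair (together with the GL_VENDOR / GL_RENDERER strings) can then select the appropriate code path, e.g. falling back to a GL 2.1 path when the major version is below 3.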
Posted 03 November 2012 - 09:48 AM
Yesterday I had such a problem. Something that worked on every NV, AMD, and Intel card I tried simply didn't work on a client's machine with an NV card. The reason: old drivers that don't handle creation of a "new" (GL3+) context correctly. The problem is setting attributes that were unknown at the time the driver was written. Instead of just ignoring them, as the spec requires, the driver reports the context as created, but the application crashes when it tries to use it.
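What "ignoring unknown attributes" means can be sketched in C. The attribute list passed to wglCreateContextAttribsARB / glXCreateContextAttribsARB is a zero-terminated array of key/value pairs; a conforming driver should skip keys it doesn't recognize rather than fail or misbehave. `filter_attribs` below is an illustrative model of that behavior, not driver code; the two enum values match the WGL_ARB_create_context spec:

```c
#include <stddef.h>

#define CONTEXT_MAJOR_VERSION 0x2091  /* WGL_CONTEXT_MAJOR_VERSION_ARB */
#define CONTEXT_MINOR_VERSION 0x2092  /* WGL_CONTEXT_MINOR_VERSION_ARB */

/* Keys this (model) driver understands. */
static int is_known(int key)
{
    return key == CONTEXT_MAJOR_VERSION || key == CONTEXT_MINOR_VERSION;
}

/* Copy only the known key/value pairs from the zero-terminated list
 * `in` to `out` (which must be large enough), silently dropping
 * unknown keys as the spec requires. Returns the number of ints
 * written, excluding the terminating 0. */
size_t filter_attribs(const int *in, int *out)
{
    size_t n = 0;
    for (; in[0] != 0; in += 2) {
        if (is_known(in[0])) {
            out[n++] = in[0];
            out[n++] = in[1];
        }
    }
    out[n] = 0;
    return n;
}
```

The buggy drivers described above effectively do the opposite: they accept the list, report success, and leave the context in a broken state when an unknown key is present.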
It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.