Intel crash

4 comments, last by Hodgman 9 years, 7 months ago

Hi, I have a GLSL shader (for deferred lighting), and I use cubemaps for omnidirectional lights. Everything works fine on my NVIDIA GPU, but when I try to compile it on an Intel GPU, it crashes during linking.

If I remove that line:


visibility += texture( LightCubeMap[lightID], vec4(dirOffset, depth) );

Then there is no crash.

Is there any known problem with samplerCubeShadow in GLSL version 330 compiled on an Intel GPU? I can't find anything on Google. Or maybe my problem is something else?
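For context, here is a minimal sketch of how such a lookup is presumably set up around the crashing line (the array size, uniform names, and inputs here are illustrative assumptions, not taken from the original shader):

```glsl
#version 330 core

// Hypothetical declarations -- names and array size are illustrative only.
uniform samplerCubeShadow LightCubeMap[4];

in vec3 dirOffset;   // lookup direction toward the fragment
in float depth;      // reference depth for the shadow comparison
flat in int lightID; // integer varyings must be flat-qualified

out vec4 fragColor;

void main()
{
    // With samplerCubeShadow, texture() takes a vec4: xyz is the lookup
    // direction, w is the reference depth; it returns a float in [0, 1].
    float visibility = texture(LightCubeMap[lightID], vec4(dirOffset, depth));
    fragColor = vec4(vec3(visibility), 1.0);
}
```

One thing worth noting: in GLSL 3.30, arrays of samplers may only be indexed with integral *constant* expressions. Indexing `LightCubeMap` with a non-constant `lightID` is outside the spec at that version, and implementations react differently (some accept it, some miscompile, some crash), which could itself explain an Intel-only linker crash.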

Thank you.


OK, forget this: I updated my Intel GPU driver and now there is no crash. But things are displayed differently. Do you know of any reference website that explains the differences between Intel's and NVIDIA's GLSL interpretations, so I can find what causes the differences? Thank you.


> But things are displayed differently.


That's the nature of the OpenGL beast. Each IHV has its own interpretation of the specification, some more lenient than others. With that said, code/shaders that work on one implementation sometimes will not run/compile on another. There is really no document that outlines the differences between implementations, except for which extensions/features they do or do not support.

What exactly is being displayed differently? If you have pinpointed a particular section of the shader that contributes to the difference, then it would be possible to give a few suggestions.

I recently tried to implement deferred lighting with GLES and used an encoded framebuffer to maintain some dynamic range. Tested on Adreno, Mali and PowerVR GPUs, they all gave slightly different results, some worse than others. It took a while to track the issue down, but in the end it was caused by not enough precision on certain implementations.

What exact GPUs are you using for Intel and NVIDIA?
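To illustrate the precision point above: on GLES there is no default float precision in fragment shaders, and the usual fix is to request highp explicitly. A hedged sketch (the uniform names and the RGBM-style decode are made up; the actual encoding used in that project isn't specified):

```glsl
#version 300 es

// mediump float can be as little as ~10 bits of mantissa on some GPUs,
// which is not enough when decoding HDR values packed into 8-bit
// channels -- request highp explicitly so all implementations agree.
precision highp float;

uniform sampler2D uEncoded;
in vec2 vUV;
out vec4 fragColor;

// Hypothetical RGBM-style decode: alpha carries a shared multiplier.
vec3 decodeRGBM(vec4 rgbm)
{
    return rgbm.rgb * rgbm.a * 8.0;
}

void main()
{
    fragColor = vec4(decodeRGBM(texture(uEncoded, vUV)), 1.0);
}
```

In GLES 3.0, highp support in fragment shaders is mandatory; on older ES 2.0 hardware it should be guarded with `GL_FRAGMENT_PRECISION_HIGH`.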

NVIDIA GeForce GT 630M (laptop):

[Screenshot: img1.png]

Intel HD Graphics 4000 (laptop):

[Screenshot: img2.png]

Are you by any chance uploading a 3x3 matrix when you should be uploading a 4x4 matrix?

The reason I ask is because the triangular 'ish' lit part of the shadow on the second image (slightly right of top-left) matches the first image's bottom-left section below the table. If you rotate the shadow volume by roughly 45 degrees on the second image it 'kinda' fits.
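To illustrate that suggestion: if the shader declares a mat4 but the application uploads only a 3x3 (or vice versa), the translation column never reaches the GPU, and the shadow transform degrades to rotation/scale only, which can look exactly like a shadow rotated into the wrong place. A sketch with a made-up uniform name:

```glsl
// The shader expects the full 4x4 shadow transform, including the
// translation in the fourth column -- upload it with glUniformMatrix4fv.
uniform mat4 uShadowMatrix;

// If the application instead calls glUniformMatrix3fv (or this were
// declared mat3), the translation is silently dropped and every point
// is transformed about the origin -- positions end up rotated/offset
// relative to where the shadow should actually fall.
vec4 toShadowSpace(vec3 worldPos)
{
    return uShadowMatrix * vec4(worldPos, 1.0);
}
```

Checking that the `glUniformMatrix*` call on the CPU side matches the declared uniform type (and that `transpose` is set consistently) is a quick way to rule this out.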

You can try passing your shader code through the reference compiler (glslangValidator) - it's supposed to be the gold standard for whether your GLSL code is valid or not.

This topic is closed to new replies.
