GLSL on Intel cards

Started by
9 comments, last by MarkS_ 12 years, 4 months ago
Hi, I have a game I wrote with OpenGL. It runs perfectly on my desktop's Nvidia GTX 460 but screws up on my laptop, which I know supports OpenGL 2.0. On the laptop, when using shaders, it looks as if the color depth is really low, like 4 bits, even though I am running at 32 bits. Without shaders, however, it looks as if it's in full color. Every other aspect of the game works. Are there GLSL functions that don't work on Intel cards? Or could it be that I just need to update the driver?
Does your laptop have Intel integrated graphics? If so, it does not support 2.0. That is unless Intel has finally updated the drivers after (nearly?) a decade.

Does your laptop have Intel integrated graphics? If so, it does not support 2.0. That is unless Intel has finally updated the drivers after (nearly?) a decade.


It does have Intel integrated graphics, but GLEW reported it as supporting 2.0.


OK, it would seem that I am mistaken. There are Intel IGPs that lack 2.0 support, but newer chips have 3.0 support and above. I think what really matters is which chipset is in your laptop.

This seems like a good place to start looking: http://www.intel.com/support/graphics/sb/cs-010479.htm

OK, it would seem that I am mistaken. There are Intel IGPs that lack 2.0 support, but newer chips have 3.0 support and above. I think what really matters is which chipset is in your laptop.

This seems like a good place to start looking: http://www.intel.com/support/graphics/sb/cs-010479.htm


From Intel:

"The integrated graphics controller of the Intel® G45/G43/G41/Q43/Q45 Express Chipsets and Mobile Intel® 4 Series Express Chipset Family supports hardware acceleration for OpenGL* applications in 16-bit and 32-bit color depths. The latest Intel® Graphics Media Accelerator drivers provide support for the OpenGL version 2.0."

My chipset is Mobile Intel® 4 Series Express.
I am 100% sure it supports 2.0; the shader still runs, just not well. I read somewhere that Intel cards screw up when using the pow() function. Is there any alternative to using pow()?
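For what it's worth, there are two common ways to avoid pow(). For small integer exponents you can use repeated multiplication, and for general exponents you can use the identity pow(x, y) == exp2(y * log2(x)), valid for x > 0. A minimal sketch, with hypothetical variable names and an illustrative exponent of 8 (none of this is from the original shader):

```glsl
// GLSL 1.10 fragment shader sketch (hypothetical example).
varying vec3 n;   // interpolated surface normal
varying vec3 h;   // interpolated half-vector

void main()
{
    float spec = max(dot(normalize(n), normalize(h)), 0.0);

    // Instead of: float s = pow(spec, 8.0);
    float s2 = spec * spec;  // spec^2
    float s4 = s2 * s2;      // spec^4
    float s  = s4 * s4;      // spec^8

    // For non-integer exponents (and x > 0), exp2(y * log2(x)) is
    // mathematically equivalent to pow(x, y) and may be compiled
    // down a different code path by the driver.

    gl_FragColor = vec4(vec3(s), 1.0);
}
```

Both exp2() and log2() are standard GLSL 1.10 built-ins, so this stays within OpenGL 2.0.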

Does your laptop have Intel integrated graphics? If so, it does not support 2.0. That is unless Intel has finally updated the drivers after (nearly?) a decade.


Intel are up to a fairly good chunk of OpenGL 3.0 right now, and have supported OpenGL 2.1 for many years on compatible hardware. Your assertion is false.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.


Intel are up to a fairly good chunk of OpenGL 3.0 right now, and have supported OpenGL 2.1 for many years on compatible hardware. Your assertion is false.




OK, it would seem that I am mistaken. There are Intel IGP that lack 2.0 support, but newer chips have 3.0 support and above. I think what really matters is what chipset is in your laptop.


Do they have support for occlusion queries (OQ) in their latest GL 3 cards?
Sig: http://glhlib.sourceforge.net
an open source GLU replacement library. Much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Now we know it is definitely supported, but what do I need to do to my shader to get it to run properly on these cards?

Now we know it is definitely supported, but what do I need to do to my shader to get it to run properly on these cards?

Intel's shader implementation is far, far stricter than NVidia's. NVidia allows all sorts of non-standard crap, so in all likelihood your shaders are wrong, despite running fine on an NVidia card. I would recommend finding an AMD/ATI card to test on as well - they tend to be similarly strict about the OpenGL standards.

If that fails, try sacrificing a goat - my experience of these Intel parts is that even the supported OpenGL versions are riddled with interesting bugs and quirks.
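To make the strictness point concrete, here is a hypothetical shader fragment showing the kind of non-standard code that NVIDIA's Cg-based GLSL compiler historically let through but that strict compilers (Intel, AMD) reject; these specific lines are common examples of violations, not taken from the OP's shader:

```glsl
// Hypothetical GLSL 1.10 fragment shader: constructs NVIDIA's compiler
// historically accepted but strict compilers reject.
varying vec3 v;

void main()
{
    float a = 1;      // rejected when strict: GLSL 1.10/1.20 has no
                      // implicit int -> float conversion; write 1.0
    vec3  c = v * 2;  // rejected when strict: write v * 2.0
    gl_FragColor = vec4(c * a, 1.0);
}
```

Running your shaders through Khronos's reference compiler (glslangValidator), or testing on an AMD driver as suggested above, should flag errors like these before you ever touch the Intel machine.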

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

This topic is closed to new replies.
