GLSL Working on Nvidia & not on AMD


Pretty sure AMD's GLSL compiler will complain if you put 'f' at the end of float literals (because it's not valid GLSL).

GLSL version 1.30 does support the f postfix.
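For example (a hypothetical fragment shader, just to illustrate the version dependence):

    /* GLSL 1.30 (#version 130) added the 'f' suffix for float literals;
       under an older #version, stricter compilers reject it. */
    const char *fragSrc =
        "#version 130\n"
        "out vec4 color;\n"
        "void main() {\n"
        "    float a = 1.0f; // legal in 1.30 and later\n"
        "    float b = 1.0;  // safe in every version\n"
        "    color = vec4(b, a, 0.0, 1.0);\n"
        "}\n";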

Nevertheless, I would first check the following:

1. Driver! Has your friend installed the newest driver? Many bugs and glitches I encountered on other PCs were solved by installing a new driver.

2. Hybrid GPU: double-check that your friend is not using an integrated GPU. Automatic selection of the dedicated GPU does not work very well with OpenGL.

3. Maybe it is not the shader which breaks the application; it could be any OGL-related feature. Check the OGL error state frequently in your code (use defines or whatever to turn it off in release mode, otherwise it could kill your performance); see the sketch below.
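A minimal sketch of such a helper (GL_CHECK is a hypothetical macro name, assuming a DEBUG define and your GL header of choice):

    #include <stdio.h>

    #ifdef DEBUG
    /* Run the statement, then drain the GL error queue and report
       where the error was triggered. */
    #define GL_CHECK(stmt) do { \
            stmt; \
            GLenum err; \
            while ((err = glGetError()) != GL_NO_ERROR) \
                fprintf(stderr, "GL error 0x%04X at %s:%d (%s)\n", \
                        err, __FILE__, __LINE__, #stmt); \
        } while (0)
    #else
    #define GL_CHECK(stmt) stmt   /* compiled out in release builds */
    #endif

    /* Usage: GL_CHECK(glDrawArrays(GL_TRIANGLE_STRIP, 0, 4)); */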


Update on the situation: I have made a small sample program with a basic shader to test the problem, and am running into the same issue.

Here is what the project renders as on my Nvidia GPU: [screenshot fkz5jEU.png]

And here is what it renders as on AMD GPUs: [screenshot uRC2lFd.png]

Here is the test project so you all can run it yourselves and look through the code to see what is up... I am beyond confused...


And here is what it renders as on AMD GPUs

I'm not sure that has anything to do with your shader code.

It looks more like you are using GL_TRIANGLE_STRIP with the vertices in the wrong order for the second one.
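To illustrate (a sketch with hypothetical vertex data, not taken from the test project): a quad drawn as a strip needs a zig-zag vertex order, and getting that order wrong produces exactly this kind of "bowtie" artifact.

    /* GL_TRIANGLE_STRIP expects a zig-zag order; these four vertices
       draw a unit quad as two triangles. Swapping the last two yields
       a bowtie like the broken screenshot. */
    static const GLfloat quad[] = {
        -1.0f, -1.0f,   /* bottom-left  */
         1.0f, -1.0f,   /* bottom-right */
        -1.0f,  1.0f,   /* top-left     */
         1.0f,  1.0f,   /* top-right    */
    };
    /* ...upload to a VBO and set up the attribute pointer... */
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);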

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Are you trying to use GL_QUADS?


And here is what it renders as on AMD GPUs

I'm not sure that has anything to do with your shader code.

It looks more like you are using GL_TRIANGLE_STRIP with the vertices in the wrong order for the second one.

The problem is, those two screenshots are taken from the exact same compiled program, just running on two different brands of GPUs.

Are you trying to use GL_QUADS?

No, I am using GL_TRIANGLE_STRIP.

Was skimming through these forums and noticed this thread: http://www.gamedev.net/topic/666609-nvidia-glsl-and-vertex-attribute-locations/

Turns out I was linking the program before binding the attribute locations. I am so happy to have solved this issue. Thanks for your suggestions, guys!
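For anyone who hits the same thing, the fix is purely a matter of call order (a sketch with hypothetical attribute names):

    /* glBindAttribLocation only takes effect at link time, so it must
       come BEFORE glLinkProgram. NVIDIA's automatic assignment happened
       to match my intended layout; AMD's did not. */
    GLuint program = glCreateProgram();
    glAttachShader(program, vertexShader);
    glAttachShader(program, fragmentShader);

    glBindAttribLocation(program, 0, "in_Position"); /* before linking */
    glBindAttribLocation(program, 1, "in_TexCoord");

    glLinkProgram(program);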

I have heard AMD is stricter than Nvidia on sticking to the specifications

By phrasing it this way you are doing exactly what NVIDIA wants: customers giving AMD a bad rap.
You should pick your words more carefully and accurately: AMD isn't more strict, NVIDIA is more lax. For people to shine a negative light on AMD with phrasing such as "they are more strict" is exactly what NVIDIA wants, never mind how disrespectful it is towards us, the consumers, who are being played as NVIDIA's puppets in this manner.

Most vendors, including AMD, stick closely to the standard because it helps customers and developers. Being as lax as NVIDIA is hurts customers and provides no benefit. Everyone has to ship something that runs on AMD anyway, so the bells and whistles NVIDIA adds never get used; meanwhile they just confuse developers, who end up wasting time late in development (rather than catching bugs early, when they are easier to solve) fixing these and other kinds of issues.

So when comparing AMD’s and NVIDIA’s OpenGL support, it should always be stated in such a way as to cast NVIDIA in the more negative light, when applicable.


L. Spiro

I restore Nintendo 64 video-game OSTs into HD! https://www.youtube.com/channel/UCCtX_wedtZ5BoyQBXEhnVZw/playlists?view=1&sort=lad&flow=grid


Customers giving AMD a bad rap.

I've never blamed ATI/AMD or Intel for the strict handling of GLSL syntax. For me, the negative reputation of ATI was a result of years of bad driver support. As a hobby dev (i.e. a developer without hope of any direct support), I just hated not being able to implement what I wanted because the driver was not handling it correctly, with no indication of when the bug would be fixed. Once burnt, twice shy: after several years I switched over to just buying NVIDIA cards as development GPUs, even though AMD's driver quality has improved and might be better than other vendors' now.

In my mind, the reputation of AMD is slowly improving, Mantle being a strong argument, but ATI itself was responsible for a large portion of its bad reputation, and a bad reputation will linger for quite some time.

This topic is closed to new replies.
