
How to make sure your game works on multiple graphics cards


6 replies to this topic

#1 ic0de   Members   -  Reputation: 808


Posted 02 November 2012 - 12:39 PM

I have an OpenGL game which runs excellently on Nvidia cards, has minor issues with ATI cards, and fails hard on Intel cards, even though all of these cards support the correct OpenGL version and all required extensions. I managed to test these things on friends' computers, but I only have an Nvidia card, and to fix these issues I need access to the card for extended periods of time. I'm not going to buy another card, so I was wondering if software exists to emulate the glitches and idiosyncrasies of certain cards so I can work around them? A list of known issues with ATI and Intel drivers would also go a long way toward making my game more compatible.

you know you program too much when you start ending sentences with semicolons;


#2 sox   Crossbones+   -  Reputation: 484


Posted 02 November 2012 - 03:29 PM

Nvidia tries to be developer-friendly... even if you do the wrong thing, the driver will try to fix it for you and give correct output. The real glitch you're dealing with may be that Nvidia doesn't fail where it "should".

I don't know of any system that lets you emulate an ATI card with an Nvidia card, but there are different hacks and cheats in place for different retail games. Try running your game with the app profile of some AAA titles (Battlefield 3, etc.). Some games might be profiled to minimize the use of driver "training wheels"; you could use that fact to reproduce your bug on Nvidia hardware and figure out what you're doing wrong in code.

If all else fails, buy your friend some pizza and monopolize his system all day while he plays Xbox. That's what friends are for.

#3 Hodgman   Moderators   -  Reputation: 27563


Posted 02 November 2012 - 08:37 PM

To really be sure, you're going to need a few different cards. Which GL version are you using? GL2.1 capable cards (from any maker) can be picked up for ~$10 on eBay these days.

Failing that, most compatibility bugs come from invalid code -- as mentioned above, different vendors will tolerate different kinds of invalid code. E.g. nVidia accepts HLSL syntax in GLSL shaders, and ATI accepts deprecated API usage that the standard says should produce errors, etc.
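To illustrate the kind of leniency described above, here's a hypothetical shader snippet: older NVIDIA GLSL compilers, via their Cg/HLSL-derived front end, would often accept HLSL type names that the GLSL spec does not define, while stricter compilers reject them.

```glsl
// NOT valid GLSL -- "float3" is an HLSL/Cg type name. Older NVIDIA
// drivers compiled this anyway; spec-conforming compilers reject it.
float3 tint = float3(1.0, 0.5, 0.0);

// Spec-compliant equivalent that every conforming compiler accepts:
vec3 tint2 = vec3(1.0, 0.5, 0.0);
```

Code like the first line compiles and renders fine on the lenient driver, so the bug only surfaces when you first run on another vendor's hardware.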

You could get some tools to help with validation, e.g. use ATI ShaderAnalyzer to see if it spots any errors in your GLSL code, and use gDEbugger to check for any GL usage issues.

Lastly, you can re-read the GL specifications while inspecting your code with a fine-tooth comb, to check for usage errors that your nVidia driver is tolerating.

#4 Kaptein   Prime Members   -  Reputation: 1844


Posted 03 November 2012 - 12:38 AM

Put a #version 130 tag at the top of the shader (or another, more widely supported version such as 120);
that will make Nvidia drivers start following the GLSL spec.

Other than that, ATI can be a nightmare in compatibility mode,
and Intel drivers are something every coder on this planet is trying hard to destroy with laser eyes.
Personally I just pretend they don't exist :) but to each his/her own.
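As a sketch of the suggestion above, a minimal GLSL 1.20 vertex shader with the version declared explicitly on the first line (the uniform and attribute names here are just placeholders):

```glsl
#version 120   // must come first; pins the shader to the GLSL 1.20 profile

attribute vec3 position;  // per-vertex input (pre-1.30 "attribute" syntax)
uniform mat4 mvp;         // combined model-view-projection matrix

void main()
{
    gl_Position = mvp * vec4(position, 1.0);
}
```

Without the directive, the compiler is supposed to assume GLSL 1.10, but lenient drivers have historically treated version-less shaders even more loosely.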

#5 Aks9   Members   -  Reputation: 769


Posted 03 November 2012 - 06:05 AM

put a #version 130 tag on top of the shader (or another more widely supported version, such as 120)
that will make nvidia drivers start following the GLSL spec

Are you sure? I'm fairly sure that NV ignored #version until quite recently.
NV even allowed mixing GLSL and Cg in the same code (please correct me if this is no longer the case).

So, in short, there is no way to know how the program will execute on the particular combination of hardware and driver unless you actually try it.

Yesterday I had such a problem. Something that worked on all the NV, AMD and Intel cards I tried simply didn't work on a client's machine with an NV card. The reason: old drivers that don't handle creation of a "new" (GL3+) context correctly. The problem is setting attributes that were unknown at the time the driver was written. Instead of just ignoring them, as the spec requires, the driver reports that the context was created, but the application crashes when trying to use it.

I know it is hard, but if one really wants a stable application, he/she should check the GL/GLSL version and GPU ID and have a different code path for different combinations.

#6 Hodgman   Moderators   -  Reputation: 27563


Posted 03 November 2012 - 06:40 AM

I know it is hard, but if one really wants a stable application, he/she should check the GL/GLSL version and GPU ID and have a different code path for different combinations.

Or use GL on Apple and D3D on Windows: situations where drivers are actually required to be compliant.

#7 mhagain   Crossbones+   -  Reputation: 7418


Posted 03 November 2012 - 09:48 AM

I'm not sure why Hodgman got a -1 above, because his answer is a correct description of the situation. It's an unfortunate reality that in many cases your choice boils down to either finding a lowest common denominator that works on everything, or implementing driver-specific workarounds scattered throughout your code (which may be invalidated by a new driver revision in a few months' time).

Yesterday I had such a problem. Something that worked on all the NV, AMD and Intel cards I tried simply didn't work on a client's machine with an NV card. The reason: old drivers that don't handle creation of a "new" (GL3+) context correctly. The problem is setting attributes that were unknown at the time the driver was written. Instead of just ignoring them, as the spec requires, the driver reports that the context was created, but the application crashes when trying to use it.


This exposes rule #1: make it a requirement that your customers upgrade their drivers. Refuse to support them if they don't. Make that your policy and make sure it's clearly stated up front. That's a harsh stance to take, and may be easier said than done, but it's your time we're talking about here, and you need to decide which is the more productive use of it: adding extra cool stuff to your program, or chasing down wacky driver bugs for the one customer who refuses to update their drivers.

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.




