GL version, new ATI card

I just bought a Radeon HD 4870, installed the latest driver (Catalyst 9.5), and the GL version returned from glGetString( GL_VERSION ) is 1.1! I might be wrong, but isn't the version returned from glGetString based on the card driver? I'm using SDL and GLEW in VC++2008. My system is AMD/ATI, on XP SP3 (32-bit). My old system had a 7900GT and reported the GL version correctly, and ran my app smoothly. All searches for this problem have turned up issues with ATI / GL 3.0 and 64-bit systems, but everyone says that GL 2.1 works fine. Is it a driver issue, or am I forgetting something?

This is the output I get:
+ SDL Video Driver: (windib)
+ OpenGL version: (1.1.0)
+ OpenGL vendor: (Microsoft Corporation)
+ OpenGL renderer: (GDI Generic)
+ GLEW version: 1.5.1

This post's OP has an issue creating a 3.0 context, but gets a 2.1 context just fine. This guy has a laptop card and solved the issue by using a third-party driver, but his source only has drivers for ATI mobility cards. This geeks3d.com post says there's an issue with 3.0 contexts, but no issue with 2.1!
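For reference, a minimal repro along these lines produces that output (this is a trimmed-down sketch, not my exact source; the window size and printout format are illustrative):

#include <stdio.h>
#include <SDL.h>
#include <GL/glew.h>

/* Minimal repro: create an SDL OpenGL window, then print which driver
 * actually backs the context. "GDI Generic" / "Microsoft Corporation"
 * means the context fell back to Microsoft's software OpenGL 1.1. */
int main( int argc, char *argv[] )
{
    char driver[64];

    if ( SDL_Init( SDL_INIT_VIDEO ) < 0 )
        return 1;

    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    if ( !SDL_SetVideoMode( 640, 480, 32, SDL_OPENGL ) )
        return 1;

    glewInit(); /* must run after the context exists */

    printf( "+ SDL Video Driver: (%s)\n", SDL_VideoDriverName( driver, sizeof driver ) );
    printf( "+ OpenGL version: (%s)\n",   (const char *)glGetString( GL_VERSION ) );
    printf( "+ OpenGL vendor: (%s)\n",    (const char *)glGetString( GL_VENDOR ) );
    printf( "+ OpenGL renderer: (%s)\n",  (const char *)glGetString( GL_RENDERER ) );
    printf( "+ GLEW version: %s\n",       (const char *)glewGetString( GLEW_VERSION ) );

    SDL_Quit();
    return 0;
}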
SDL has a horrible tendency to pick up Microsoft's software driver instead of the vendor-specific drivers. I would recommend using a different windowing toolkit, such as GLFW.
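For comparison, context creation under GLFW is only a handful of lines, and in my experience it reliably picks up the hardware driver. A sketch against the GLFW 2.x-era API (the window size and bit depths here are illustrative):

#include <stdio.h>
#include <GL/glfw.h>

int main( void )
{
    if ( !glfwInit() )
        return 1;

    /* width, height, r, g, b, a, depth, stencil, mode */
    if ( !glfwOpenWindow( 640, 480, 8, 8, 8, 8, 24, 0, GLFW_WINDOW ) )
    {
        glfwTerminate();
        return 1;
    }

    /* Should name the hardware driver, not "GDI Generic". */
    printf( "OpenGL renderer: %s\n", (const char *)glGetString( GL_RENDERER ) );

    glfwTerminate();
    return 0;
}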

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Thank you for the suggestion, but I can't switch from SDL at this point in development. I'm considering GLFW for my next project.

I checked the SDL mailing lists and found someone else with the same issue. I thought there should be an easy solution, and there is! For anyone with the same problem, just remove the hardware-acceleration line for ATI cards:

SDL_GL_SetAttribute( SDL_GL_ACCELERATED_VISUAL, 1 );
This line is supposed to guarantee hardware acceleration, but for whatever reason it lands on the Windows software driver for ATI cards. Removing it solved the problem for me, but I'll need to test the same build on my Nvidia card to see whether hardware acceleration is sacrificed. There is also the question of whether shaders will run with hardware acceleration without the call. I'll post back with the results.
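In the meantime, a guard along these lines should catch a silent software fallback on any card (using_software_gl is a hypothetical helper name; matching on the renderer string is the simplest check I know of):

#include <stdio.h>
#include <string.h>
#include <SDL.h>
#include <SDL_opengl.h>

/* Returns non-zero if the current context is Microsoft's GDI software
 * renderer rather than a hardware driver. */
static int using_software_gl( void )
{
    const char *renderer = (const char *)glGetString( GL_RENDERER );
    return renderer == NULL || strstr( renderer, "GDI Generic" ) != NULL;
}

/* e.g. right after SDL_SetVideoMode: */
/*
if ( using_software_gl() )
    fprintf( stderr, "Warning: no hardware acceleration - software driver in use\n" );
*/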

The thread on the SDL mailing lists.
Well, it seems leaving the acceleration line commented out works fine on XP for both Nvidia and ATI cards. My app reports the correct hardware vendor for both. I don't know if OS X or Linux builds should keep the line, though.
As far as OS X (10.5) goes, I've never even used that line, and I always get an accelerated context, so I don't think you have to worry about that.
Quote:Original post by sneakyrobot
Well, it seems leaving the acceleration line commented out works fine on XP for both Nvidia and ATI cards. My app reports the correct hardware vendor for both. I don't know if OS X or Linux builds should keep the line, though.
Pretty sure that particular switch is completely useless.
Quote:Original post by Riraito
As far as OS X (10.5) goes, I've never even used that line, and I always get an accelerated context, so I don't think you have to worry about that.
I don't think SDL can give you an unaccelerated OpenGL context under OS X - lord knows I tried hard enough, when I needed the software renderer for debugging.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Quote:
As far as OS X (10.5) goes, I've never even used that line, and I always get an accelerated context, so I don't think you have to worry about that.
I don't think SDL can give you an unaccelerated OpenGL context under OS X - lord knows I tried hard enough, when I needed the software renderer for debugging.
If you hide the OpenGL driver (rename it) and stick a software driver in the path, e.g. Mesa, then that will be used.

BTW, I've never used SDL_GL_ACCELERATED_VISUAL, and all my stuff has worked fine on various PCs.
I just had a look at what it does, and I can see where it can cause a problem: if something about the pixel format is not a 100% match, it appears to bail completely (giving you software). Ideally you should handle this case yourself (i.e. choose another pixel format), but personally I wouldn't worry about it.
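If you did want to handle it, something like this would do. An untested sketch against the SDL 1.2 API; the assumption that re-initialising the video subsystem resets the GL attributes to their "don't care" defaults is worth verifying against your SDL version:

#include <string.h>
#include <SDL.h>
#include <SDL_opengl.h>

/* Ask for an accelerated visual first; if SDL bails to the Microsoft
 * software driver, restart the video subsystem and retry without
 * insisting on acceleration. */
SDL_Surface *create_gl_window( int w, int h )
{
    SDL_Surface *screen;
    const char  *renderer;

    SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
    SDL_GL_SetAttribute( SDL_GL_ACCELERATED_VISUAL, 1 );
    screen = SDL_SetVideoMode( w, h, 32, SDL_OPENGL );

    renderer = screen ? (const char *)glGetString( GL_RENDERER ) : "";
    if ( !screen || strstr( renderer, "GDI Generic" ) )
    {
        /* Strict pixel format failed or fell back to software GL. */
        SDL_QuitSubSystem( SDL_INIT_VIDEO );
        SDL_InitSubSystem( SDL_INIT_VIDEO );
        SDL_GL_SetAttribute( SDL_GL_DOUBLEBUFFER, 1 );
        screen = SDL_SetVideoMode( w, h, 32, SDL_OPENGL );
    }
    return screen;
}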
Quote:Original post by zedz
Quote:
Quote:As far as OS X (10.5) goes, I've never even used that line, and I always get an accelerated context, so I don't think you have to worry about that.
I don't think SDL can give you an unaccelerated OpenGL context under OS X - lord knows I tried hard enough, when I needed the software renderer for debugging.
If you hide the OpenGL driver (rename it) and stick a software driver in the path, e.g. Mesa, then that will be used.
Why would I want to do that? Apple's own software renderer is 100x more capable than Mesa, and generally faster as well. I have always wondered why none of these windowing toolkits exposes a setting to select it.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I've no idea how Apple's software GL version performs; yes, perhaps it is faster.
But reasons to use Mesa:
A/ same on Apple/MS/Linux
B/ open source
Quote:Original post by zedz
I've no idea how Apple's software GL version performs; yes, perhaps it is faster.
Only marginally faster, from what I have seen, but it does tend to support a wider set of extensions than Mesa, and is always installed on all Macs, so why replace it?
Quote:A/ same on Apple/MS/Linux
OpenGL is already a cross-platform API - I don't see the point of using a cross-platform implementation.
Quote:B/ open source
That is only an advantage if you plan to change the source, and I have no intention of rewriting an OpenGL implementation - that is why I use a 3D API in the first place.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

