MGB

OpenGL ATI & OpenGL


Recommended Posts

Hi all, I've been developing my game fine for ages using a GeForce4 and OpenGL, but recently got a Radeon 9800 Pro. Thing is, developing with the ATI card is just plain bad: my fog doesn't work properly on most of the drivers (despite changing/omitting the fog hint), and Windows seems to get left in a horribly slow scrolling mode when debugging the code. Does anyone have a recommended ATI driver set they develop with that won't have these problems? (Or ways of fixing the problems... I'd hate to think users would need a certain driver set to see proper fog!)

A lot of the time it's not that things don't work properly on ATI hardware. The problem is usually that code is written to take advantage of driver-specific default settings. ATI's drivers are much less forgiving when code does not explicitly set the states and other parameters it needs.
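A minimal sketch of the kind of explicit setup being suggested; the particular states and values chosen here are illustrative assumptions, not taken from the post. The point is to state every piece of state the draw depends on, even ones that happen to match a driver's default.

```c
#include <GL/gl.h>

/* Set every state the frame depends on explicitly, rather than
 * trusting one vendor's driver defaults. */
static void setup_frame_state(void)
{
    glDisable(GL_LIGHTING);          /* off unless this pass enables it */
    glDisable(GL_BLEND);
    glEnable(GL_DEPTH_TEST);
    glDepthFunc(GL_LESS);            /* the GL default, but stated anyway */

    glHint(GL_FOG_HINT, GL_NICEST);  /* don't rely on the driver's choice */
    glFogi(GL_FOG_MODE, GL_EXP2);
    glFogf(GL_FOG_DENSITY, 0.05f);   /* illustrative value */
    glEnable(GL_FOG);
}
```

Code written this way behaves the same under defaults-differ drivers, because nothing is left implicit.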

I would actually say the opposite: ATI drivers are too forgiving of programmer errors. For example, you can leave vertex/normal/texture arrays enabled after using them, and as long as the actual textures are turned off after rendering, everything will be fine. NVIDIA cards, on the other hand, crash here -- they require unused arrays to be disabled, regardless of whether the textures are enabled or not. There were other times when everything worked perfectly on my 9800 Pro but had various problems on NVIDIA... When it all comes down to it, it always seems to work better on the video card you're coding on. ;)
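The array-state pattern being described might look like the sketch below; `draw_mesh` and its parameters are hypothetical names for illustration. The key line is the explicit disable after the draw, which is exactly what one driver tolerates omitting and another does not.

```c
#include <GL/gl.h>

/* Enable the client arrays, draw, then re-disable them so a later
 * draw can't read through stale pointers on a stricter driver. */
static void draw_mesh(const float *verts, const float *norms,
                      const unsigned short *idx, int index_count)
{
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_NORMAL_ARRAY);
    glVertexPointer(3, GL_FLOAT, 0, verts);
    glNormalPointer(GL_FLOAT, 0, norms);

    glDrawElements(GL_TRIANGLES, index_count, GL_UNSIGNED_SHORT, idx);

    /* Some drivers forgive leaving these enabled; others will follow
     * dangling pointers on the next draw call. Disable explicitly. */
    glDisableClientState(GL_NORMAL_ARRAY);
    glDisableClientState(GL_VERTEX_ARRAY);
}
```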

Quote:
Original post by MichaelMook
When it all comes down to it, it always seems to work better on the videocard you're coding on. ;)


lol - yes I suppose so (fixing problems as they happen)

I don't have crashes or anything nasty, but I just can't use the debugger in the horrible screen mode it gets left in :(

Quote:
Original post by Aph3x
I don't have crashes or anything nasty, but I just can't use the debugger in the horrible screen mode it gets left in :(

Hooray for multi-monitor debugging [wink].

Quote:
Original post by MichaelMook
I would actually say the opposite. ATI drivers are too forgiving when it comes to programmer's errors. For example you can leave Vertex/normal/texture arrays enabled after using them, and assuming you have your actual textures turned off after rendering, everything will be fine. NVidia videocards on the other hand crash here -- they require arrays to be disabled if they aren't used regardless of whether the textures are enabled or not. There were some other times when everything was working perfectly on my 9800 Pro, and had various problems on NVidia... When it all comes down to it, it always seems to work better on the videocard you're coding on. ;)

Are you saying that nVidia hardware will crash if you don't disable the client states before a glBegin()/glEnd() block? I've never experienced that on any GF/GF2/GF3/GF4. For that matter, I don't recall ever needing to disable any states simply because the rendering of a frame or a vertex array was completed. Maybe I'm just not understanding your comment.

Quote:
Neo at elYsiun.com (Blender) Forum
OK, I don't know if you know what you're doing, so I'll try to be as simple as possible ;)
1- Download the Catalyst version you want (3.9 works for me).
2- Extract the contents of the file (start installing the driver as normal, but cancel just after the installer extracts the files; you will find them in C:\ATI\SUPPORT\wxp-w2k-7-991-040224m-013831c\2KXP_INF\B_14006).
3- The file you want is there: atioglxx.dl_, which is a compressed .dll file.
4- To decompress it, copy it somewhere, for example C:\.
5- Open a command-line window and type CD C:\.
6- Then type expand atioglxx.dl_ atioglxx.dll.
7- Finally, move the .dll file into the Blender main directory.
That's all! :D
Easy, isn't it?


This should work. You can try the 4.3, 4.5, or 3.9 drivers.

Quote:
Original post by Schmedly
Are you saying that nVidia hardware will crash if you don't disable the client states before a glBegin()/glEnd() block? I've never experienced that on any GF/GF2/GF3/GF4. For that matter, I don't recall ever needing to disable any states simply because the rendering of a frame or a vertex array was completed. Maybe I'm just not understanding your comment.

Not glBegin()/glEnd(), but glDrawElements(). If you had four texture units enabled in the first part of your program and forgot to turn them off before a second part that only uses two, then GeForces will crash, while ATI video cards just keep going as if nothing is wrong. ;)
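A sketch of the cleanup being described when a program drops from four texture units to two; `switch_to_two_unit_pass` is a hypothetical helper, and glActiveTexture/glClientActiveTexture are the core GL 1.3 entry points (on older headers they carry the ARB suffix).

```c
#include <GL/gl.h>

/* Before a pass that only uses units 0 and 1, explicitly disable
 * units 2 and 3 (and their texcoord arrays) left over from the
 * previous four-unit pass. */
static void switch_to_two_unit_pass(void)
{
    int unit;
    for (unit = 2; unit < 4; ++unit) {
        glActiveTexture(GL_TEXTURE0 + unit);
        glDisable(GL_TEXTURE_2D);
        glClientActiveTexture(GL_TEXTURE0 + unit);
        glDisableClientState(GL_TEXTURE_COORD_ARRAY);
    }
    glActiveTexture(GL_TEXTURE0);        /* restore the active unit */
    glClientActiveTexture(GL_TEXTURE0);
}
```

With this in place the draw no longer depends on which driver happens to ignore the stale units.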

Quote:
Original post by MichaelMook
Not glBegin()/glEnd(), but glDrawElements(). If you had four texture units enabled in the first part of your program and forgot to turn them off before a second part that only uses two, then GeForces will crash, while ATI video cards just keep going as if nothing is wrong. ;)



OK, but isn't that what's supposed to happen? The driver sees the unit is enabled and has a texture bound to it, so why would it crash? Maybe I'm misunderstanding you.

By the way, if you forget to bind a texture to an enabled unit (when drawing with arrays), the ATI drivers will crash.
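The converse precaution might look like this sketch: every unit that gets enabled also gets a valid texture bound. `tex0` and `tex1` are hypothetical texture names assumed to have been created earlier with glGenTextures.

```c
#include <GL/gl.h>

/* Pair every glEnable(GL_TEXTURE_2D) with a glBindTexture, so no
 * enabled unit is ever left without a valid texture object. */
static void bind_two_units(GLuint tex0, GLuint tex1)
{
    glActiveTexture(GL_TEXTURE0);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex0);

    glActiveTexture(GL_TEXTURE1);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex1);
}
```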
