

Drawing on different graphics cards


3 replies to this topic

#1 Kraecker   Members   -  Reputation: 501


Posted 29 April 2013 - 08:21 AM

Hi there,

I'm currently working on a terrain renderer.

I ran into some performance issues on my laptop's integrated Intel graphics card (where the terrain renders without problems), so I switched to the laptop's other available GPU, an ATI card, where the terrain geometry completely disappears.

Debugging my exe with gDEBugger, the context reports 61k triangles drawn, but I don't see any of them.

Shaders compile & link successfully.

The VBO (~460 KB) & IBO (~57 KB) on the card seem to be correct.

Posting code here is difficult since the whole project is a bit large.

(Link to GitHub project: https://github.com/Cracky/XiyaGE/tree/master/src)

What other information should/can I provide to help you help me? ;)

Why does the geometry disappear? What can cause these problems? Why does the code work on Intel's card but not on ATI's?


Edited by Kraecker, 29 April 2013 - 08:30 AM.



#2 slicer4ever   Crossbones+   -  Reputation: 3948


Posted 29 April 2013 - 08:31 AM

Why does the geometry disappear? What can cause these problems? Why does the code work on Intel's card but not on ATI's?

I haven't looked over your code, but I'll address the question.

Basically, each vendor ships its own graphics drivers, and some drivers are stricter than others; a lenient one might happily accept what the docs describe as "undefined behavior". It comes down to how much effort the vendor cares to put into its OpenGL implementation, which is why you'll find DirectX is generally more consistent across vendors. IIRC, some NVIDIA drivers don't even mind if you hand your OpenGL context DirectX shader code. It's definitely a problem that your program may run differently on different chipsets (integrated Intel in particular being generally crap). Things are getting better, but there's still plenty of older hardware out there that might have a problem with how you are trying to do things.

Also, I'd recommend checking that the ATI card supports the OpenGL version you are using. It most probably does, but be certain, and also make sure you have the latest drivers for the card.
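If you want to see exactly what the driver exposes, a minimal sketch like this will print it (hypothetical helper, not from the project; assumes GLEW or a similar loader is initialized and a GL context is current):

```cpp
// Hypothetical helper: print what the driver actually reports.
#include <GL/glew.h>
#include <cstdio>

void PrintGLInfo()
{
    // All four strings are available since OpenGL 2.0.
    std::printf("Vendor:   %s\n", reinterpret_cast<const char*>(glGetString(GL_VENDOR)));
    std::printf("Renderer: %s\n", reinterpret_cast<const char*>(glGetString(GL_RENDERER)));
    std::printf("Version:  %s\n", reinterpret_cast<const char*>(glGetString(GL_VERSION)));
    std::printf("GLSL:     %s\n", reinterpret_cast<const char*>(glGetString(GL_SHADING_LANGUAGE_VERSION)));
}
```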


Edited by slicer4ever, 29 April 2013 - 08:31 AM.

Check out https://www.facebook.com/LiquidGames for some great games made by me on the Playstation Mobile market.

#3 Kraecker   Members   -  Reputation: 501


Posted 29 April 2013 - 08:35 AM

The OpenGL version I am using is 2.1 with GLSL 1.20; the ATI card supports up to 3.3, so it should be backwards compatible, right?

--edit--
Not all of my geometry disappears, just the terrain;
the animated models are still drawn.

--edit2--

I think this is heading in the wrong direction...

I know that different drivers handle things differently.
The question I actually wanted to ask is: how would I start to debug this issue?

Edited by Kraecker, 30 April 2013 - 12:58 AM.


#4 VladR   Members   -  Reputation: 722


Posted 30 April 2013 - 02:29 PM

The question I actually wanted to ask is: how would I start to debug this issue?

This is an order of magnitude easier under DirectX, since when you run the game from Visual Studio you can see debug output printed to the Output pane, with lots of information about why the rendering is broken. Even in XNA you would get an exception if your indices were broken or there was a mismatch in some obscure, low-level detail.

You chose OpenGL, so you just have to suck it up now and check each call manually for errors :)
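A minimal sketch of that manual checking (a hypothetical macro, not from your project; assumes GLEW or a similar loader):

```cpp
// Hypothetical macro: drain the GL error queue and report where the
// check was made.
#include <GL/glew.h>
#include <cstdio>

#define GL_CHECK()                                                  \
    do {                                                            \
        for (GLenum e; (e = glGetError()) != GL_NO_ERROR; )         \
            std::fprintf(stderr, "GL error 0x%04X at %s:%d\n",      \
                         e, __FILE__, __LINE__);                    \
    } while (0)

// Usage: sprinkle it after the calls you suspect, e.g.
//   glBindBuffer(GL_ARRAY_BUFFER, vbo);                      GL_CHECK();
//   glDrawElements(GL_TRIANGLES, count, GL_UNSIGNED_INT, 0); GL_CHECK();
```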

 

Of course, since we are talking about ATI, expect that the driver will not return an error code every time. You wouldn't expect ATI drivers to be as good as NVIDIA drivers now, would you :)

I've seen this behaviour on multiple ATI cards under multiple sets of official drivers. The renderer worked flawlessly on everything, Intel GMA950 included; only the ATI cards were unpredictable (and that was after taking into account all the available information on their issues).

If I were you, I would just display a message along the lines of "Warning! ATI card detected. Please insert a reliable gfx card." :)

As you probably guessed, ATI cards gave me hell some time ago...

 

 

BTW, if I had to guess, I'd first check whether that card of yours actually supports 32-bit indices. (Yes, you'd be surprised. No, just because the gfx caps tell you it is supposed to support 32-bit indices, it does not necessarily have to. You are welcome :) )
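You can test that cheaply: your IBO is only ~57 KB, so the terrain mesh should be small enough to re-upload with 16-bit indices and see whether it reappears. A hypothetical sketch (helper name and types are my own, assuming GLEW):

```cpp
// Hypothetical experiment: upload the same indices as 16-bit and draw
// with GL_UNSIGNED_SHORT. Only valid while every index fits in 16 bits
// (fewer than 65536 vertices).
#include <GL/glew.h>
#include <cstdint>
#include <vector>

void UploadIndices16(GLuint ibo, const std::vector<std::uint32_t>& indices32)
{
    // Narrowing copy; safe here because the mesh is small.
    std::vector<std::uint16_t> indices16(indices32.begin(), indices32.end());

    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, ibo);
    glBufferData(GL_ELEMENT_ARRAY_BUFFER,
                 indices16.size() * sizeof(std::uint16_t),
                 indices16.data(), GL_STATIC_DRAW);
}

// Then draw with GL_UNSIGNED_SHORT instead of GL_UNSIGNED_INT:
//   glDrawElements(GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, 0);
```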

 

Do you have access to the machine? Can you step through the code, line by line, and see where it fails? Most probably some resource (VB, IB, texture, RT, ...) didn't get created for whatever reason; most such failures are, of course, unjustifiable given the gfx caps the drivers expose...
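One way to catch a buffer that silently failed to allocate, as a hypothetical sketch (assuming GLEW; the helper is not from the project):

```cpp
// Hypothetical check: after glBufferData, ask the driver how large the
// buffer really is. Zero means the allocation silently failed.
#include <GL/glew.h>
#include <cstdio>

bool VerifyBufferSize(GLenum target, GLuint buffer, GLint expectedBytes)
{
    GLint actualBytes = 0;
    glBindBuffer(target, buffer);
    glGetBufferParameteriv(target, GL_BUFFER_SIZE, &actualBytes);
    if (actualBytes != expectedBytes)
        std::fprintf(stderr, "buffer %u: expected %d bytes, got %d\n",
                     buffer, expectedBytes, actualBytes);
    return actualBytes == expectedBytes;
}
```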


VladR    My 3rd person action RPG on GreenLight:    http://steamcommunity.com/sharedfiles/filedetails/?id=92951596

 







