

Issue with OpenGL demo using Catalyst Drivers (Linux and Windows)


5 replies to this topic

#1 theWatchmen   Members   -  Reputation: 116

Posted 22 July 2014 - 02:57 AM

Hi everyone,
I'm developing a small demo for behaviour trees and I'm having some issues when running it on AMD hardware.
I've tested my code with four configurations:

  1. MacBook Pro (NVidia GT650M) - fine
  2. Desktop with CentOS 6.5 (Nvidia Quadro FX) - fine
  3. Desktop with Windows 7 64 bit (AMD HD7950 with Catalyst 14.4) - slow
  4. Desktop with Fedora 19 (AMD HD7950 with catalyst 14.4) - slow 

3 and 4 are actually the same machine. The code is not highly optimized, but it's not doing anything too complex either: I have a grid (which I render using GL_POINTS), a line that represents the path found by A*, and a moving agent. The grid has about 10k elements; if I remove it, the demo runs better, but still not perfectly.

I guess it's a driver issue, as on 3 and 4 it seems to be running with software rendering; I profiled the code on Windows with CodeXL, and a frame takes ~400ms and seems to be using mostly the CPU rather than the GPU.

As final information, I'm using GLEW and GLFW for cross-platform development. The full code is available here:
https://bitbucket.org/theWatchmen/behaviour-trees

Thanks in advance for the help and let me know if you need any further information!

Marco




#2 mhagain   Crossbones+   -  Reputation: 8140

Posted 22 July 2014 - 03:25 AM

I'm not going to browse the full code to find this, but can you confirm what point size you're using?


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#3 theWatchmen   Members   -  Reputation: 116

Posted 22 July 2014 - 04:07 AM

I'm currently using 7 as the point size. The idea is to have points render at 1 pixel less than the grid cell (currently 8) to render a proper grid.



#4 mhagain   Crossbones+   -  Reputation: 8140

Posted 22 July 2014 - 10:01 AM

I'm currently using 7 as the point size. The idea is to have points render at 1 pixel less than the grid cell (currently 8) to render a proper grid.

 

OK; why I'm saying this is that there are certain things in OpenGL (OS is totally irrelevant here) where a GL_VERSION mandates support for a feature, but the hardware may not actually be able to support it.  In order to claim compatibility with that GL_VERSION, the driver must emulate it in software.

 

I suspect you're hitting one of those cases here, so the first thing I suggest is that you drop your point size to 1 and re-test.




#5 theWatchmen   Members   -  Reputation: 116

Posted 23 July 2014 - 04:37 AM

 


Thanks, I will try this tonight and let you know how it goes!!



#6 theWatchmen   Members   -  Reputation: 116

Posted 07 August 2014 - 03:57 AM

Apologies for the late reply, I went on holiday and didn't get a chance to test it until now. If I change the point size to 1 it works normally, so it seems that for this card GL_POINTS are implemented in software.

I will need to move to normal triangles to make sure it works everywhere without issues.

 

Thanks a lot for your help!





