theWatchmen

Member
  • Content Count: 6
  • Joined
  • Last visited

Community Reputation: 122 Neutral

About theWatchmen
  • Rank: Newbie

  1. theWatchmen

    AltDevBlog is back!

      The original maintainer was Mike Acton, and we tried to get in touch with him for an official backup copy, without success. The original blog had articles from different authors, and we are re-hosting the same content. We are also rebuilding the authors' community, and there will be new content soon! Working on it :)
  2. theWatchmen

    AltDevBlog is back!

    Hello everyone, you might already be aware of this, but AltDevBlogADay is back at altdevblog.com!

    It's a blog where professionals from the games industry and related fields post about programming, audio, art and animation, or community.

    The original website went down about a year ago, and we decided to bring it back to life. So far we have only uploaded old content, but there are already new posts boiling in the cauldron.

    Feel free to have a look around and, if you are interested in contributing, please have a read here: getting started.

    See you on the other side!
  3. Apologies for the late reply, I went on holiday and didn't get a chance to test it until now. If I change the point size to 1 it works normally, so it seems that on this card large GL_POINTS are implemented in software. I will need to move to normal triangles to make sure it works everywhere without issues (a rough sketch of that quad-based approach follows this list). Thanks a lot for your help!
  4. Quoting the advice received: "OK; why I'm saying this is that there are certain things in OpenGL (the OS is totally irrelevant here) where a GL_VERSION mandates support for a feature, but the hardware may not actually be able to support it. In order to claim compatibility with that GL_VERSION, the driver must emulate it in software. I suspect you're hitting one of those certain things here, so the first thing I suggest is that you drop your point size to 1 and re-test."

     Thanks, I will try this tonight and let you know how it goes!
  5. I'm currently using 7 as the point size. The idea is to have each point render 1 pixel smaller than the grid cell (currently 8 pixels) so that the cells read as a proper grid.
  6. Hi everyone, I'm developing a small demo for behaviour trees and I'm having some issues when running it on AMD hardware. I've tested my code with four configurations:
     1. MacBook Pro (NVIDIA GT650M) - fine
     2. Desktop with CentOS 6.5 (NVIDIA Quadro FX) - fine
     3. Desktop with Windows 7 64-bit (AMD HD7950 with Catalyst 14.4) - slow
     4. Desktop with Fedora 19 (AMD HD7950 with Catalyst 14.4) - slow
     Configurations 3 and 4 are actually the same machine. The code is not highly optimized, but it's not doing anything too complex either: I have a grid (which I render using GL_POINTS), a line that represents the path found by A*, and a moving agent (a minimal sketch of the grid draw path follows this list). The grid has about 10k elements; if I remove it the demo runs better, but still not perfectly. I guess it's a driver issue, as on configurations 3 and 4 it seems to be running with software rendering; I profiled the code on Windows with CodeXL, and a frame takes ~400 ms, spent mostly on the CPU rather than the GPU. As a final piece of information, I'm using GLEW and GLFW for cross-platform development. The full code is available here: https://bitbucket.org/theWatchmen/behaviour-trees
     Thanks in advance for the help and let me know if you need any further information!
     Marco
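
A minimal sketch of the grid draw path described in posts 5 and 6 above, assuming modern OpenGL with the grid-cell centres already uploaded to a vertex buffer; the names drawGrid, gridProgram, gridVAO and gridCount are hypothetical and not taken from the linked repository:

    // Sketch of the GL_POINTS grid draw described above (hypothetical names).
    #include <GL/glew.h>

    void drawGrid(GLuint gridProgram, GLuint gridVAO, GLsizei gridCount)
    {
        glUseProgram(gridProgram);
        glBindVertexArray(gridVAO);

        // 7-pixel points inside 8-pixel grid cells, as in the posts above.
        // On some drivers large point sizes fall back to a software path;
        // dropping this to 1.0f is the quick test suggested in the thread.
        glPointSize(7.0f);

        glDrawArrays(GL_POINTS, 0, gridCount);   // ~10k points for the full grid

        glBindVertexArray(0);
        glUseProgram(0);
    }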
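
If large GL_POINTS are indeed emulated in software, each cell can instead be expanded into a quad built from two triangles, as mentioned in post 3 above. A rough sketch under the same assumptions; Vec2, buildGridQuads and cellSize are hypothetical names, not code from the repository:

    // Expand each grid-cell centre into two triangles so the grid no longer
    // depends on GL_POINTS support (hypothetical helper).
    #include <vector>

    struct Vec2 { float x, y; };

    // Produces 6 vertices (two triangles) per cell, ready for GL_TRIANGLES.
    std::vector<Vec2> buildGridQuads(const std::vector<Vec2>& centres, float cellSize)
    {
        const float h = cellSize * 0.5f;
        std::vector<Vec2> verts;
        verts.reserve(centres.size() * 6);

        for (const Vec2& c : centres)
        {
            const Vec2 bl{c.x - h, c.y - h}, br{c.x + h, c.y - h};
            const Vec2 tl{c.x - h, c.y + h}, tr{c.x + h, c.y + h};

            // Triangle 1: bottom-left, bottom-right, top-right.
            verts.push_back(bl); verts.push_back(br); verts.push_back(tr);
            // Triangle 2: bottom-left, top-right, top-left.
            verts.push_back(bl); verts.push_back(tr); verts.push_back(tl);
        }
        return verts;
    }

The resulting buffer is uploaded once and drawn with glDrawArrays(GL_TRIANGLES, 0, vertexCount), which stays on the hardware path regardless of point-size support.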