Surakin

OpenGL Performance Regression after the NVIDIA 197.x.x.x driver series

This is a repost of this thread from the main NVIDIA graphics drivers forum (non-developer): http://forums.nvidia.com/index.php?showtopic=187087&st=0&p=1154303&fromsearch=1&#entry1154303

Hello,

I'm in charge of the OpenGL renderer in my company's MMO (Regnum Online).

I'm writing here because we've been having performance problems with drivers newer than 197.x on both Windows and Linux (I'm talking about a 60-70% FPS loss: from ~80 to ~30, and from ~60 to ~15).

The game uses shaders extensively, but even with them disabled, using fixed-pipeline mode, the problem remains. I've tracked the bottleneck down to glDrawRangeElements: it returns almost immediately on 19x but takes far longer to complete on 2xx. Since the time grows with the size of the geometry, I strongly suspect the vertex/index buffers are somehow staying in AGP/system memory instead of GPU memory. Another test I ran to check this theory was switching back to 19x and disabling VBOs completely; performance was then nearly identical to 2xx with VBOs enabled.

I've been googling all day trying to find other people with this problem and found nothing. If you could give me some guidelines I'd be really grateful, because I'm really, really lost here.

Best regards,

Perhaps it has to do with the number of indices and vertices in each glDrawRangeElements call. Perhaps it is the vertex and normal format. Perhaps you are using some odd format like GL_DOUBLE. Etc.

Quote:
Original post by karwosts
Have you considered checking out NVIDIA PerfSDK or NVIDIA glExpert? I haven't used them much, but they look like they include some performance monitoring tools. Might be worth your while to take a look.


I don't know if it's just me, but those tools seem to be broken on every OS/architecture/driver combination I've tried :(

Quote:
Original post by V-man
Perhaps it has to do with the number of indices and vertices in each glDrawRangeElements call. Perhaps it is the vertex and normal format. Perhaps you are using some odd format like GL_DOUBLE. Etc.


The same vertex format worked flawlessly on the old drivers, and it still works perfectly in the Direct3D version of the game. And no GL_DOUBLEs are used.

